AXA Chairs reward only a few scientists every year. With his Chair on New Computational Approaches to Risk Modeling, Maurizio Filippone, a researcher at Eurecom, joins a community of prestigious researchers such as Jean Tirole, the French professor who won the Nobel Prize in Economics.
Maurizio, you’ve just been awarded an AXA Chair. Could you explain what it is about and why your project was selected?
AXA Chairs are funded by the AXA Research Fund, which supports fundamental research to advance our understanding of risk. Started in 2008, the scheme funds about 50 new projects annually, of which four to eight are Chairs. They are individual fellowships, and the one I received will support my research activities for the next seven years. My project is entitled “New Computational Approaches to Risk Modeling”. The selection process is not based on the project alone: several criteria matter, including timeliness, vision, the credibility of both the proposal and the candidate (track record, collaborations, etc.), the institution, and the fit with the institution’s strategy. For example, the fact that this research area is in line with Eurecom’s long-term strategy in data science played a major role in the selection of my project. This grant definitely represents a major achievement in my career.
What is your project about exactly?
My project deals with one simple question: how do you go from data to decisions? Today we can access huge amounts of data generated by countless sensors, but we struggle to use these data in a sensible way. Machine learning is the main technique for making sense of data, and throughout this project I will use and develop novel techniques in this domain. Quantifying risk and making decisions require accurate quantification of uncertainty, which is a major challenge in many areas of science involving complex phenomena, such as finance and the environmental and medical sciences. To quantify uncertainty accurately, we employ the flexible and accurate tools offered by probabilistic nonparametric statistical models. But today’s diversity and abundance of data make these models difficult to use. The goal of my project is to propose new ways to better manage the interface between computational and statistical models – which in turn will help us attach accurate confidence to predictions based on observed data.
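To give a concrete flavour of what such a model looks like (a minimal sketch of my own in Python, not code from the project), Gaussian process regression is a classic probabilistic nonparametric model: for every test point it returns not just a prediction but also a variance that quantifies how uncertain that prediction is.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X_train, y_train, X_test, noise=0.1):
    """Exact GP regression: predictive mean and variance at X_test."""
    K = rbf_kernel(X_train, X_train) + noise**2 * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_test)
    Kss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                  # the O(n^3) bottleneck
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v**2, axis=0)  # predictive uncertainty
    return mean, var

# Toy data: noisy observations of a sine function on [0, 5]
rng = np.random.default_rng(0)
X = np.linspace(0.0, 5.0, 20)
y = np.sin(X) + 0.1 * rng.standard_normal(20)

# Predict inside the data range (x = 2.5) and far outside it (x = 10)
mean, var = gp_predict(X, y, np.array([2.5, 10.0]))
```

The point of the example is the last line: the model reports low uncertainty near the observed data and high uncertainty far from it, which is exactly the kind of honest confidence statement that risk quantification needs. The Cholesky factorisation marked in the comment is also why exact computation becomes intractable on large datasets, which is the problem the project addresses.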
How will you be able to do that? With what kind of advanced computing techniques?
The idea behind the project is that it is possible to carry out exact quantification of uncertainty while relying exclusively on approximate, and therefore cheaper, computations. Nonparametric models are difficult to use and generally computationally intractable because of the complexity of the systems and the amount of data. Although computers are more and more powerful, exact computations remain serial, too slow, too expensive, and sometimes practically impossible to carry out. The approximate computations designed in this project will reduce computing time by orders of magnitude! Exploiting parallel and distributed computing on large-scale computing facilities – an area of deep expertise at Eurecom – will be key to achieving this. We will thus be able to develop new computational models that make accurate quantification of uncertainty possible.
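The interview does not spell out which approximations the project will develop, but a well-known example of this trade-off is the random Fourier features technique of Rahimi and Recht: an expensive kernel computation is replaced by a cheap inner product of finite random feature vectors, turning an O(n³) nonparametric problem into a linear-model one. A minimal sketch, with all names and parameter choices my own:

```python
import numpy as np

def random_fourier_features(X, n_features, lengthscale=1.0, seed=0):
    """Random features whose inner products approximate the
    squared-exponential kernel (random Fourier features)."""
    rng = np.random.default_rng(seed)
    # Frequencies drawn from the kernel's spectral density (Gaussian)
    w = rng.standard_normal(n_features) / lengthscale
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X[:, None] * w[None, :] + b)

# Exact squared-exponential kernel value between two points (lengthscale 1)
x1, x2 = 0.3, 1.1
exact = np.exp(-0.5 * (x1 - x2) ** 2)

# Cheap approximation: a dot product of two finite feature vectors
Z = random_fourier_features(np.array([x1, x2]), n_features=5000)
approx = Z[0] @ Z[1]
```

With enough features the inner product converges to the exact kernel value, while downstream computations on the features cost O(nm²) rather than O(n³) – one illustration of how approximate computations can preserve the statistical answer while slashing the cost.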
What are the practical applications?
Part of the focus of the project will be on life and environmental applications that require quantification of risk. We will therefore mostly use life sciences data (e.g., neuroimaging and genomics) and environmental data for our models. I am confident that this project will help tackle the explosion of large-scale and diverse data in the life and environmental sciences. This is already a huge challenge today, and it will be even more difficult to deal with in the future. In the medium term, we will develop practical and scalable algorithms that learn from data and accurately quantify the uncertainty of their predictions. In the long term, we will improve on current approaches to risk estimation, making them timelier and more accurate. These approaches can have major implications for the development of medical treatment strategies or environmental policies, for example. Is some seismic activity going to trigger a tsunami serious enough to warrant warning the population? Is a person showing signs of a systemic disease, like Parkinson’s, actually going to develop it? I hope the results of our project will make it easier to answer such questions.
Do you have any partnerships in this project?
Of course! I will initiate new collaborations and continue collaborating with several prestigious institutions worldwide to make this project a success: Columbia University in New York; Oxford, Cambridge, UCL and Glasgow in the UK; the Donders Institute of Neuroscience in the Netherlands; the University of New South Wales in Australia; as well as INRIA in France. The funding from the AXA Research Fund will help create a research team at Eurecom, comprising myself, two PhD students and one postdoc. I would like the team to bring together a blend of expertise, since novelty requires an interdisciplinary approach: computing, statistics, mathematics and physics, plus some expertise in the life and environmental sciences.
What are the main challenges you will be facing in this project?
Attracting talent is one of the main challenges! I’ve been lucky so far, but it is generally difficult. This project is extremely ambitious; it is a high-risk, high-gain project, so there are difficult technical challenges to face – all of them related to the cutting-edge tools, techniques and strategies we will be using and developing. We will find ourselves in the usual situation of working on something new and visionary – getting stuck in blind alleys, for example, or being forced to abandon promising ideas that turn out not to work. But that is why the project is funded for seven years! Despite these difficulties, I am confident this project will be a success and that we will make a huge impact.