AlgoriDAM ANR project, selected by AAPG2019. CE48: Fondements du numérique (Foundations of digital technology: computer science, signal processing, and control).
Algorithms are everywhere: in every piece of technology, as well as in many decision-making processes. The traditional computation model, with its clean sequence of input-algorithm-output, no longer fits most applications, as we now face new challenges, e.g., when the input is massive, scattered or disorganized, plagued with errors, or constantly evolving. Yet algorithms are still required "to work" and to produce useful results. Theory has fallen behind practice: our understanding of how to tackle the above-mentioned challenges is lagging, and it is urgent to lay down sounder theoretical foundations for the types of algorithms that are now "alive", and to better understand how to make them resilient. As theoreticians, we address such challenges from an axiomatic perspective grounded in theoretical models and rigorous theorems. Thus, in order to develop the theory of algorithms for massive, erroneous and dynamic data, we plan to focus on three settings:
- Massive data (e.g., the data is too large to fit into memory);
- Noisy data (e.g., the data cannot be accessed reliably and is hence observed with error, or noise);
- Dynamic data (e.g., the data evolves constantly).
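The project concerns theoretical foundations rather than any particular algorithm, but as a concrete illustration of the massive-data setting, here is a minimal sketch of reservoir sampling, a classical streaming technique that maintains a uniform random sample of a stream too large to fit into memory. The function name and parameters are our own, chosen for illustration only:

```python
import random

def reservoir_sample(stream, k):
    """Return a uniform random sample of k items from a stream
    whose length is unknown and too large to store in memory."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Replace a random slot with probability k/(i+1), which
            # keeps every item seen so far equally likely to remain.
            j = random.randrange(i + 1)
            if j < k:
                reservoir[j] = item
    return reservoir

# Sample 5 items from a "stream" of a million integers,
# using only O(k) memory regardless of the stream's length.
sample = reservoir_sample(range(10**6), 5)
print(sample)
```

The point of the sketch is the resource constraint: the algorithm reads each element once and stores only k of them, which is the kind of model (one-pass streaming with sublinear memory) studied under the "massive data" heading.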
Scientific work starting date: January 1, 2020.
Four slides presenting the project, shown at the ANR meeting of November 6, 2019.