Algorithmic theory of new data models

AlgoriDAM ANR project, selected under the AAPG2019 call, panel CE48 (Foundations of digital technology: computer science, signal processing, and automatic control).

(Important event: we will organise a Dagstuhl-style four-day workshop in the heart of Fontainebleau in February 2024; see the workshop link below.)

Algorithms are everywhere: in every piece of technology, and in many decision-making processes. The traditional computation model, with its clean sequence of input, algorithm, and output, no longer fits most applications: nowadays the input may be massive, scattered, or disorganized; plagued with errors; or constantly evolving. Yet algorithms are still required "to work" and to produce useful results. Here theory has fallen behind practice, and it is urgent to lay down sounder theoretical foundations for the types of algorithms that are now "alive", and to better understand how to make them resilient. As theoreticians, we address these challenges from an axiomatic perspective, grounded in theoretical models and rigorous theorems. To develop the theory of algorithms for massive, erroneous, and dynamic data, we focus on three settings:

  1. Massive data (e.g., the data is too large to fit into memory; see the sketch after this list);
  2. Noisy data (e.g., the data cannot be accessed reliably and is therefore observed with errors, i.e., noise);
  3. Dynamic data (e.g., the data evolves constantly).
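
To make the first setting concrete, here is a minimal sketch (our illustration, not code from the project) of reservoir sampling, a classic streaming technique for massive data: it maintains a uniform random sample of a stream too large to store, using memory proportional only to the sample size. The function name `reservoir_sample` and its parameters are hypothetical, chosen for the example.

```python
import random

def reservoir_sample(stream, k):
    """Maintain a uniform random sample of k items from a stream whose
    length is unknown in advance, using only O(k) memory
    (Vitter's Algorithm R)."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            sample.append(item)
        else:
            # Item i survives with probability k / (i + 1),
            # replacing a uniformly chosen reservoir slot.
            j = random.randrange(i + 1)
            if j < k:
                sample[j] = item
    return sample

# Example: sample 5 numbers uniformly from a "stream" of a million.
print(reservoir_sample(range(1_000_000), 5))
```

Each arriving item is processed in constant time, and the memory footprint never grows with the stream length: the hallmark of algorithms for massive data.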

Scientific work starting date: January 1, 2020.
Scientific work ending date: December 31, 2024.

Four slides presenting the project at the November 6, 2019 meeting at the ANR.

Link to the preproposal.

AlgoriDAM workshop, February 12–15, 2024, Fontainebleau.