Seminar

INRIA project-team $\pi r^2$
Thematic team Algèbre et calcul
Thematic team Analyse et conception de systèmes
Thematic team Preuves et programmes


Day, time and place

Thursdays at 10:30, room 3052

The calendar of sessions (iCal format).
To add the calendar of sessions to your favourite calendar application, subscribe to the calendar using this link.


Contact(s)


Upcoming sessions


Preuves, programmes et systèmes
Jeudi 12 décembre 2019, 10 heures 30, École normale supérieure de Lyon
Amina Doumane, Cristina Matache Séminaire Chocola

Preuves, programmes et systèmes
Jeudi 9 janvier 2020, 10 heures 30, Salle 3052
Joseph Tassarotti (Boston College) Not yet announced.

Preuves, programmes et systèmes
Jeudi 16 janvier 2020, 10 heures 30, Salle 3052
Gabriel Radanne (University of Freiburg) Kindly Bent to Free Us

Systems programming often requires the manipulation of resources like file handles, network connections, or dynamically allocated memory. Programmers need to follow certain protocols to handle these resources correctly. Violating these protocols causes bugs ranging from type mismatches and data races to use-after-free errors and memory leaks. These bugs often lead to security vulnerabilities.
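
To make the kind of protocol violation at stake concrete, here is a minimal plain-OCaml sketch (purely illustrative, not Affe and not taken from the paper): a file handle is used after it has been closed, a bug that the ordinary ML type system does not rule out.

  (* Plain OCaml accepts this resource-protocol violation. *)
  let read_twice (path : string) : string =
    let ic = open_in path in
    let first = input_line ic in
    close_in ic;
    let second = input_line ic in   (* use after close: fails only at run time *)
    first ^ second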

While statically typed programming languages guarantee type soundness and memory safety by design, most of them do not address issues arising from improper resource handling. Linear and affine types guarantee single-threaded resource usage, but they are rarely deployed as they are too restrictive for real-world applications.

We present Affe, an extension of ML with constrained types that manages linearity and affinity properties through kinds. In addition Affe supports the exclusive and unrestricted borrowing of affine resources, inspired by features of Rust. Moreover, Affe retains the defining features of the ML family: an impure, strict functional expression language with complete principal type inference and type abstraction through modules. Our language does not require any linearity annotations in expressions and supports common functional programming idioms.

Draft: https://arxiv.org/abs/1908.09681

Preuves, programmes et systèmes
Jeudi 23 janvier 2020, 10 heures 30, Salle 3052
Robert Atkey (University of Strathclyde) Not yet announced.

Preuves, programmes et systèmes
Jeudi 30 janvier 2020, 10 heures 30, Salle 3052
Sam Van Gool Model completeness in logical algebra

Very short abstract: What is an existentially closed Heyting algebra and what does it have to do with automata theory?

Slightly longer abstract: Logical systems of deduction often resemble algebraic systems of equation resolution. The simplest instance of this resemblance is the fact that classical propositional logic essentially boils down to studying algebras over the two-element field. When one changes the logical deduction system, the algebraic structures become less simple, and more interesting; this is where one enters the world of Heyting algebras, modal algebras, and generalizations of such.
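
For instance, the correspondence alluded to here identifies the Boolean connectives with arithmetic over the two-element field $\mathbb{F}_2$:

  $x \wedge y = xy$, $\quad x \oplus y = x + y$, $\quad \neg x = 1 + x$, $\quad x \vee y = x + y + xy$,

so that a propositional formula is a classical tautology exactly when the corresponding polynomial function over $\mathbb{F}_2$ is constantly equal to $1$.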

The aim of our work here is to gain a better understanding of such logical-algebraic structures by studying them from the perspective of model theory. In the model-theoretic study of usual algebra, the concept of model completeness plays a central role: it provides the correct abstraction of the concept of an algebraically closed field. We show that model completeness also has an important role to play in logical algebra.

In particular, we will discuss two cases of model completeness: intuitionistic logic, and linear temporal logic. In the former, model completeness can be seen to be closely related to a certain interpolation property of the logic, originally established by Pitts. In the latter, automata on infinite words are the technical ingredient that leads to model completeness.

Preuves, programmes et systèmes
Jeudi 13 février 2020, 10 heures 30, Salle 3052
Cyrille Chenavier (Inria Lille) Topological rewriting systems applied to standard bases and syntactic algebras

We introduce topological rewriting systems as a generalization of abstract rewriting systems, in which one considers a topological space instead of a set of terms. Abstract rewriting systems are precisely the topological rewriting systems for the discrete topology. We introduce topological confluence as a confluence property obtained by passing to the limit, and we characterize standard bases through this property. We also characterize topological confluence by lattice operations, thanks to a representation of rewriting systems by continuous reduction operators. Finally, we relate representations of noncommutative formal power series to the duality of reduction operators, and deduce a criterion for an algebra to be syntactic.


Past sessions



Year 2019

Preuves, programmes et systèmes
Jeudi 28 novembre 2019, 10 heures 30, Salle 3052
Valeria Vignudelli (École normale supérieure de Lyon) Bisimulations and trace equivalences for nondeterministic probabilistic systems

We study trace-based equivalences for labelled transition systems combining nondeterministic and probabilistic choices. We do so via a coalgebraic construction known as the generalized powerset construction, which consists in first determinizing a system and then recovering trace equivalence as bisimulation equivalence on the determinized system. The generalized powerset construction allows us to apply these two steps, inspired by the standard powerset construction for nondeterministic automata, to a variety of systems, such as labelled transition systems with different computational effects captured by monads (e.g., systems with probabilistic choices). We show how trace semantics for labelled transition systems combining nondeterministic and probabilistic choices can be recovered by instantiating the generalized powerset construction, and we characterise and compare the resulting semantics to known definitions of trace equivalences appearing in the literature. Most of our results are based on the exciting interplay between monads and their presentations via algebraic theories.
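
As a point of reference, here is a minimal OCaml sketch of the classical powerset construction for nondeterministic automata, the base case that the generalized construction abstracts over arbitrary monads (names and types are illustrative only):

  module States = Set.Make (Int)

  type nfa = {
    start : States.t;
    delta : int -> char -> States.t;   (* nondeterministic transition function *)
    accept : int -> bool;
  }

  (* States of the determinized automaton are *sets* of NFA states. *)
  let step (a : nfa) (s : States.t) (c : char) : States.t =
    States.fold (fun q acc -> States.union (a.delta q c) acc) s States.empty

  (* Trace (language) equivalence of NFAs = equivalence of their determinizations. *)
  let accepts (a : nfa) (word : char list) : bool =
    States.exists a.accept (List.fold_left (step a) a.start word)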

Preuves, programmes et systèmes
Jeudi 21 novembre 2019, 10 heures 30, Salle 3052
Giulio Manzonetto (LIPN and Université Paris Nord) About the power of Taylor expansion

The speculative ambition of replacing the old theory of program approximation based on syntactic continuity with the theory of resource consumption based on Taylor expansion and originating from the differential lambda-calculus is nowadays at hand. Using this resource sensitive theory, we provide simple proofs of important results in lambda-calculus that are usually demonstrated by exploiting Scott's continuity, Berry's stability or Kahn and Plotkin's sequentiality theory.

A paradigmatic example is given by the Perpendicular Lines Lemma for the Böhm tree semantics, which is proved here simply by induction, but relying on the main properties of resource approximants: strong normalization, confluence and linearity.

Preuves, programmes et systèmes
Jeudi 14 novembre 2019, 10 heures 30, École normale supérieure de Lyon
Francesco Gavazzo, Marie Kerjean, Yann Régis-Gianas Séminaire Chocola

Preuves, programmes et systèmes
Jeudi 17 octobre 2019, 10 heures 30, École normale supérieure de Lyon
Eric Finster Séminaire Chocola

Preuves, programmes et systèmes
Jeudi 3 octobre 2019, 10 heures 30, Salle 3052
Nicolas Behr (IRIF) Stochastic dynamics of graph-like structures

In this talk, I will explain the fundamental ideas of stochastic mechanics which led me to investigate the dynamics of graph-like structures, using a new perspective on rewriting systems. My talk is designed to be accessible to a wide audience, and I will thus illustrate my approach on a very simple voter model, expressed in the language of stochastic rewriting. The discussion of this elementary example will clarify the reasons why I am currently working on a general theory of tracelets, mixing ideas from rewriting theory, category theory and concurrency theory, in order to extend the traditional realm of stochastic mechanics, and to implement analysis tools for sophisticated and currently intractable stochastic phenomena on graph-like structures.

Preuves, programmes et systèmes
Jeudi 26 septembre 2019, 10 heures 30, École normale supérieure de Lyon
Pierre Clairambault, Andrea Condoluci, Hugo Férée Séminaire Chocola

Preuves, programmes et systèmes
Lundi 2 septembre 2019, 9 heures 30, Amphithéâtre Turing
Various Speakers Journées PPS 2019

Start-of-year meeting of the PPS pole of IRIF, from Monday, September 2 to Tuesday, September 3, 2019. Welcome at 9:00, talks from 9:30 to 17:30.

The detailed programme is available at https://www.irif.fr/rencontres/pps2019/index.

Preuves, programmes et systèmes
Jeudi 20 juin 2019, 10 heures 30, Salle 3052
Alex Simpson (University of Ljubljana) Sheaf principles for reasoning about probabilistic independence

Preuves, programmes et systèmes
Mercredi 19 juin 2019, 14 heures 30, Salle 3052
Ugo Dal Lago (INRIA and Univ. Bologna) The Geometry of Bayesian Programming

We give a geometry of interaction model for a typed lambda-calculus endowed with operators for sampling from a continuous uniform distribution and soft conditioning, namely a paradigmatic calculus for higher-order Bayesian programming. The model is based on the category of measurable spaces and partial measurable functions, and is proved adequate with respect to both a distribution-based and a sampling-based operational semantics.

Joint work with Naohiko Hoshino

Preuves, programmes et systèmes
Lundi 17 juin 2019, 11 heures, Salle 3052
Pierre-Malo Deniélou (Google) From MapReduce to Apache Beam: A Journey in Abstraction

(This is a joint seminar between the CompSys, PPS, and Verification seminar series.)

Processing large amounts of data used to be an affair of specialists: specialized hardware, specialized software, specialized programming model, specialized engineers. MapReduce was the first widely adopted high-level API for large-scale data processing. It helped democratize big data processing by providing a clear abstraction that was supported by several efficient systems. In this talk, I will present how the programming APIs (and underlying systems) for large-scale data processing evolved in the past 20 years, both within Google and in the open source world. I will start from MapReduce and Millwheel and finish with Apache Beam and where we're headed next.
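
As a reminder of the abstraction the talk starts from, here is a minimal in-memory word-count sketch in the MapReduce style (purely illustrative; a real MapReduce or Beam pipeline distributes these phases across machines):

  (* map phase: one document -> a list of (key, value) pairs *)
  let map_doc (doc : string) : (string * int) list =
    String.split_on_char ' ' doc
    |> List.filter (fun w -> w <> "")
    |> List.map (fun w -> (w, 1))

  (* shuffle phase: group the values by key *)
  let shuffle (pairs : (string * int) list) : (string, int list) Hashtbl.t =
    let groups = Hashtbl.create 97 in
    List.iter
      (fun (k, v) ->
        let vs = try Hashtbl.find groups k with Not_found -> [] in
        Hashtbl.replace groups k (v :: vs))
      pairs;
    groups

  (* reduce phase: combine the values associated with each key *)
  let word_count (docs : string list) : (string * int) list =
    let groups = shuffle (List.concat_map map_doc docs) in
    Hashtbl.fold (fun k vs acc -> (k, List.fold_left ( + ) 0 vs) :: acc) groups []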

Preuves, programmes et systèmes
Jeudi 13 juin 2019, 10 heures 30, Salle 3052
Dan Ghica (University of Birmingham, UK) The next 700 abstract machines

We propose a new core calculus for programming languages with effects, interpreted using a hypergraph-rewriting abstract machine inspired by the Geometry of Interaction. The intrinsic calculus syntax and semantics only deals with the basic structural aspects of programming languages: variable binding, name binding, and thunking. Everything else, including function abstraction and application, must be provided as extrinsic operations with associated rewrite rules. The graph representation yields natural concepts of locality and robustness for equational properties and reduction rules, which enable a novel flexible and powerful reasoning methodology about (type-free) languages with effects. We illustrate and motivate the technique with challenging examples from the literature.

Joint work with Koko Muroya and Todd Waugh Ambridge.

Preuves, programmes et systèmes
Mardi 11 juin 2019, 11 heures, Salle 3052
Marco Gaboardi (University at Buffalo, USA) Differential Privacy: Formal Verification and Applications

A vast amount of individuals’ data is collected, stored and accessed every day. These data are valuable for scientific and medical research, for decision making, etc. However, use or release of these data may be restricted by concerns for the privacy of the individuals contributing them. Differential Privacy has been conceived to offer ways to answer statistical queries about sensitive data while providing strong provable privacy guarantees ensuring that the presence or absence of a single individual in the data has a negligible statistical effect on the query's result. In this talk I will introduce differential privacy and present some formal verification techniques we developed to help programmers to certify their programs differentially private and to guarantee that their programs provide accurate answers. These techniques combine approaches based on type systems and program logics with ideas for reasoning about differential privacy using composition, sensitivity and probabilistic coupling. This combination permits fine-grained formal analyses of several basic mechanisms that are fundamental for designing practical differential privacy applications. In addition, I will present some of our results showing how to answer a large number of queries on high dimensional datasets preserving privacy, and how to perform differentially private chi-squared hypothesis testing with the same asymptotic guarantees as the traditional tests.
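
For reference, the core guarantee mentioned above can be stated in one line (this is the standard definition, not specific to the talk): a randomized mechanism $M$ is $\varepsilon$-differentially private if, for all datasets $D$ and $D'$ differing in the data of a single individual, and for every set $S$ of possible outputs,

  $\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].$

A small $\varepsilon$ thus bounds how much any single individual's presence or absence can shift the distribution of results.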

Preuves, programmes et systèmes
Jeudi 6 juin 2019, 10 heures 30, Salle 3052
Jean Goubault-Larrecq (ENS Cachan) A Probabilistic and Non-Deterministic Call-by-Push-Value Language

There is no known way of giving a domain-theoretic semantics to higher-order probabilistic languages, in such a way that the involved domains are continuous or quasi-continuous - the latter is required to do any serious mathematics. We argue that the problem naturally disappears for languages with two kinds of types, where one kind is interpreted in a Cartesian-closed category of continuous dcpos, and the other is interpreted in a category that is closed under the probabilistic powerdomain functor. Such a setting is provided by Paul B. Levy's call-by-push-value paradigm. Following this insight, we define a call-by-push-value language, with probabilistic choice sitting inside the value types, and where conversion from a value type to a computation type involves demonic non-determinism. We give both a domain-theoretic semantics and an operational semantics for the resulting language, and we show that they are sound and adequate. With the addition of statistical termination testers and parallel if, we show that the language is even fully abstract - and those two primitives are required for that.
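
As a reminder of the two kinds of types alluded to, the core of Levy's call-by-push-value type grammar distinguishes (a rough, simplified sketch; the talk's language refines this with probabilistic and non-deterministic primitives):

  value types: $A ::= U\,\underline{B} \mid 1 \mid A_1 \times A_2 \mid A_1 + A_2$
  computation types: $\underline{B} ::= F\,A \mid A \to \underline{B}$

Here $F\,A$ is the type of computations that return a value of type $A$, and $U\,\underline{B}$ is the type of values that are thunks of computations of type $\underline{B}$; in the semantics described above, probabilistic choice lives on the value side, while the passage from value types to computation types is where demonic non-determinism enters.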

Preuves, programmes et systèmes
Jeudi 16 mai 2019, 10 heures 30, Salle 3052
Lionel Vaux (AMU, Marseille) An application of parallel cut elimination in multiplicative linear logic to the Taylor expansion of proof nets

We examine some combinatorial properties of parallel cut elimination in multiplicative linear logic (MLL) proof nets. We show that, provided we impose a constraint on some paths, we can bound the size of all the nets satisfying this constraint and reducing to a fixed resultant net. This result gives a sufficient condition for an infinite weighted sum of nets to reduce into another sum of nets, while keeping coefficients finite. We moreover show that our constraints are stable under reduction.

Our approach is motivated by the quantitative semantics of linear logic: many models have been proposed, whose structure reflect the Taylor expansion of multiplicative exponential linear logic (MELL) proof nets into infinite sums of differential nets. In order to simulate one cut elimination step in MELL, it is necessary to reduce an arbitrary number of cuts in the differential nets of its Taylor expansion. It turns out our results apply to differential nets, because their cut elimination is essentially multiplicative. We moreover show that the set of differential nets that occur in the Taylor expansion of an MELL net automatically satisfies our constraints.

(Joint work with Jules Chouquet.)

Preuves, programmes et systèmes
Jeudi 2 mai 2019, 10 heures 30, Salle 3052
Hugo Férée (University of Kent) A higher-order complexity theory via game semantics

While many models of computation allow us to handle data drawn from uncountable domains (higher-order functions, real numbers, objects defined by coinduction), the various notions of complexity only deal with functions of order 1 (i.e. operating on finite data) or at best of order 2 (i.e. operating on functions of order 1). Because of the difficulty of defining a relevant notion of size for data of arbitrary order, the notions of complexity for functions of order 3 or more are both incomplete and imperfect.

Building on game semantics, we propose here a definition of size and of complexity for the higher-order functions of PCF, and more generally for any computational process that can be presented as a certain class of sequential games.

Preuves, programmes et systèmes
Mardi 2 avril 2019, 10 heures 30, Salle 3052
Jérémy Dubut (NII, Tokyo) Categorical approaches to bisimilarity

There are different categorical approaches to variations of transition systems and their bisimulations. One is coalgebras, another one is open maps. In this talk, I will describe these two approaches, illustrated by the case of labelled transition systems (almost no knowledge in category theory is needed for this part). I will then describe how it is possible to translate one into the other in some cases. From open maps to coalgebras, this was done by Lasota, using multi-sorted transition systems. From coalgebras to open maps, this was done in my joint work with Thorsten Wißmann, Shin-ya Katsumata and Ichiro Hasuo, where we derived path-categories and trace semantics for free for different flavors of categories of coalgebras with non-deterministic branching. I will illustrate those constructions on various concrete examples (tree automata, regular nominal automata, …).
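
As a tiny concrete rendering of the coalgebraic view (an informal OCaml sketch with illustrative names, restricted to finitely branching systems): a labelled transition system on a state type 's is just a function mapping each state to its outgoing transitions.

  (* An LTS as a coalgebra s -> P(Label * s), with the powerset
     functor approximated here by lists. *)
  type ('label, 's) lts = 's -> ('label * 's) list

  (* Example: two states toggling between each other on labels "a" and "b". *)
  let toggle : (string, int) lts = function
    | 0 -> [ ("a", 1) ]
    | _ -> [ ("b", 0) ]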

Preuves, programmes et systèmes
Jeudi 28 mars 2019, 10 heures 30, Salle 3052
Thomas Leventis (Univ. Bologna (Italy)) Taylor expansion of probabilistic lambda-terms

Taylor expansions have been introduced by Ehrhard and Regnier as a computational interpretation of Girard's quantitative semantics. Lambda-terms are expanded into linear combinations of their n-linear approximants, such approximants being defined as resource lambda-terms. This construction is well-suited to interpret quantitative calculi, such as the probabilistic lambda-calculus: Taylor expansions are linear combinations and the resource lambda-calculus is linear, so interpreting probability distributions over terms is straightforward. Yet, paradoxically, the proof techniques used by Ehrhard and Regnier to study Taylor expansions rely heavily on the particular structure of the expansions of ordinary lambda-terms, and they are not suited to a quantitative setting. In this talk we will show how to indirectly extend their result to probabilistic Taylor expansions. First we will introduce explicit Taylor expansions, which use “inert” probabilistic choices to interpret probabilistic terms while preserving the properties needed to apply Ehrhard and Regnier's proof techniques. Then we will show how to extract interesting results on probabilistic Taylor expansions from these explicit Taylor expansions.

Preuves, programmes et systèmes
Jeudi 21 mars 2019, 10 heures 30, Salle 3052
Paolo Pistone (Univ. Tübingen) Some results on proof equivalence in second-order linear logic

Proof nets give a canonical representation of proofs in multiplicative linear logic: denotational equivalence of proofs coincides with equivalence of nets. However, the canonicity of proof nets does not extend to the second order. This problem is due to the presence of the witnesses of the existential rules. Indeed, in several denotational semantics the information carried by proofs is compressed, and in particular the existential witnesses are erased.

Among these “erasing” semantics are the coherent and relational semantics, as well as the dinatural and the observational semantics. We present results on proof equivalence in these semantics and discuss applications to complexity stemming from the possibility of compressing proofs. Part of this work comes from a collaboration with L. Tortora de Falco, T. Seiller and L.T.D. Nguyễn.

Preuves, programmes et systèmes
Jeudi 28 février 2019, 10 heures 30, Salle 3052
François Bergeron (UQAM) Theory of species of structures and applications

Species of Structures, introduced almost 40 years ago, give a natural conceptual framework for the notion of “combinatorial construction on sets”. Fundamental operations between species allow the specification of new species in terms of known ones, giving a systematic context for the solution of classical enumeration questions, including both labelled and unlabelled enumeration (a.k.a. Pólya Theory). In this talk, we will give an accessible introduction to the Theory of Species, with many illustrations. If time allows, we will also survey some of the many areas (computer science, theoretical physics, chemistry, etc.) where they are currently useful.
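
As a concrete reminder of how the operations on species track enumeration: a species $F$ assigns to each finite set $U$ of labels a set $F[U]$ of structures, and its exponential generating function is $F(x) = \sum_{n \ge 0} |F[n]|\, x^n / n!$. The basic operations then act on generating functions as

  $(F + G)(x) = F(x) + G(x)$, $\quad (F \cdot G)(x) = F(x)\,G(x)$, $\quad (F \circ G)(x) = F(G(x))$,

the last one under the usual assumption $G[\emptyset] = \emptyset$. For example, the species of permutations and the species of linear orders both have generating function $1/(1-x)$, yet they are not isomorphic as species: they differ on unlabelled enumeration.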

Preuves, programmes et systèmes
Mardi 26 février 2019, 10 heures 30, Salle 3052
Eric Finster (Inria - Nantes) Higher Universal Algebra in Dependent Type Theory

The naive translation of set-theoretic definitions of algebraic structures (such as monoids, categories and groups) into Martin-Löf type theory is often incomplete: without additional hypotheses such as truncation or a decidable equality, these translations fail to specify the behavior of the structure with respect to higher equalities. On the other hand, a complete description of such structures is quite difficult, as it typically requires the specification of an infinite number of equations. I will present recent progress on this problem by giving a definition of a “coherent polynomial monad” internal to type theory.

Preuves, programmes et systèmes
Jeudi 21 février 2019, 10 heures 30, Salle 3052
Damiano Mazza (CNRS) Intersection Types and Runtime Errors in the Pi-Calculus

We introduce a type system for the $\pi$-calculus which is designed to guarantee that typable processes are well-behaved, namely they never produce a run-time error and, even if they may diverge, there is always a chance for them to “finish their work”, i.e. to reduce to an idle process. The introduced type system is based on non-idempotent intersections, and is thus very powerful in terms of the class of processes it can capture. Indeed, despite the fact that the underlying property is $\Pi^0_2$-complete, there is a way to show that the system is complete, i.e., that any well-behaved process is typable, although for obvious reasons infinitely many derivations need to be considered.

Joint work with Ugo Dal Lago, Marc De Visme, Akira Yoshimizu

Preuves, programmes et systèmes
Jeudi 7 février 2019, 10 heures 30, Salle 3052
Sandra Alves Linearisation of the lambda-calculus and its termination

The notion of linearisation of the lambda-calculus has been explored in different settings: Damas and Florido used information given by intersection types, to define a notion of expansion of terms in the lambda-calculus into linear terms; Kfoury embedded the lambda-calculus into a new linear calculus, with a new notion of “linear” reduction, and linearization was defined indirectly by means of a notion of contraction of expanded terms in the new calculus into standard lambda-terms; Alves and Florido defined a notion of linearisation from standard lambda-terms into a linear subset, called the weak linear lambda-calculus, by using the notion of computation as paths, deriving from Lévy’s labelled lambda-calculus.

In this talk we will explore these previous works, discuss their relation and present some open problems regarding the termination of linearisation methods.

Preuves, programmes et systèmes
Jeudi 24 janvier 2019, 10 heures 30, Ens Lyon
Séminaire Chocola (ENS Lyon) Not yet announced.

Preuves, programmes et systèmes
Jeudi 10 janvier 2019, 10 heures 30, Salle 3052
Aurore Alcolei (ENS Lyon) Concurrent strategies for Herbrand's theorem

Herbrand's theorem exposes some of the constructive content of classical logic. In its simplest form, it reduces the validity of a first-order purely existential formula to that of a finite disjunction. More generally, it reduces first-order validity to propositional validity, by understanding the structure of the assignment of first-order terms to existential quantifiers, and the causal dependency between quantifiers.

In this talk, we show that Herbrand's theorem in its general form can be elegantly stated and proved as a theorem in the framework of concurrent games, a denotational semantics designed to faithfully represent causality and independence in concurrent systems. Closely related to expansion trees, the causal structure of concurrent strategies, paired with annotations by first-order terms, is used to specify the dependency between quantifiers. Furthermore, as these strategies can be composed, we are able to interpret classical sequent proofs, yielding a compositional proof of Herbrand's theorem.
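
A minimal example of the disjunctions in question (a standard one, not specific to the talk): the purely existential formula $\exists x\, (P(x) \Rightarrow P(f(x)))$ is valid, yet no single instance witnesses it; its validity is equivalent to the propositional tautology obtained from the two instances $x := c$ and $x := f(c)$:

  $(P(c) \Rightarrow P(f(c))) \;\vee\; (P(f(c)) \Rightarrow P(f(f(c)))).$

The annotated strategies described above record exactly this kind of dependency between the chosen terms and the quantifiers they instantiate.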


Year 2018

Preuves, programmes et systèmes
Jeudi 13 décembre 2018, 10 heures 30, ENS Lyon
Journée Chocola ENS Lyon

Preuves, programmes et systèmes
Jeudi 6 décembre 2018, 10 heures 30, Salle 3052
Luc Pellissier (IRIF) Linear Implicative Algebras, towards a Brouwer-Heyting-Kolmogorov interpretation of linear logic

Implicative Algebras were recently introduced as a unified framework for forcing and realisability, whose particularity is to interpret terms and formulæ uniformly.

In this ongoing work with Alexandre Miquel, we show how linear logic fits in this picture: we present a family of models, where certain computational principles are distinguished, and argue that it provides a satisfactory meaning explanation of the connectives of (perfective) linear logic.

Preuves, programmes et systèmes
Jeudi 22 novembre 2018, 10 heures 30, Salle 3052
Stéphane Graham-Lengrand (LIX, CNRS) The intuitionistic calculus that was discovered 6 times

In this talk I will review G4ip, a simple calculus for intuitionistic propositional logic (IPL) that provides a decision procedure for provability. Its basic mechanisms can be traced back to Vorob'ev in the 50s, and were found again by Hudelmaier (88), Dyckhoff (90), Paulson (91), and (with a linear logic flavour) Lincoln-Scedrov-Shankar (91). In 2015, Claessen and Rosén presented the fastest decision procedure for IPL, based on SMT-solving techniques. We describe their algorithm and show how it can be seen as a variant of the aforementioned calculus, albeit with key variations that provide increased performance. Their algorithm relies on a SAT-solver used as a black box and treats intuitionistic entailment as a theory. We show how the recent framework of model-constructing satisfiability for SMT-solving could further integrate intuitionistic reasoning within the main SAT-solving loop. This would constitute an intuitionistic variant of the main algorithm of SAT-solvers, where Kripke models are built instead of Boolean models.
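
The characteristic ingredient of G4ip, shared by these independent discoveries, is its treatment of a nested implication on the left, which replaces the looping left-implication rule of LJ; in sequent notation (as it is usually presented, e.g. by Dyckhoff):

  from $\Gamma,\, B \supset C \vdash A \supset B$ and $\Gamma,\, C \vdash G$, conclude $\Gamma,\, (A \supset B) \supset C \vdash G$.

Because every rule strictly decreases a suitable measure on sequents, proof search terminates without any loop checking.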

Preuves, programmes et systèmes
Jeudi 15 novembre 2018, 11 heures, Salle 3052 + Amphi Turing + 1016
Eric Tanter, Flavien Breuvart, Shane Mansfield & Xavier Leroy (Chocola meeting in Paris) Salidou's day

11:00 – 12:30 Éric Tanter (Universidad de Chile & Inria Paris) - Invited talk: Abstracting Gradual Typing: Principles and Application to Gradual Parametricity
  Gradual typing enables programming languages to seamlessly combine dynamic and static checking. Language researchers and designers have extended a wide variety of type systems to support gradual typing. These efforts consistently demonstrate that designing a satisfactory gradual counterpart to a static type system is challenging, and this challenge only increases with the sophistication of the type system. Gradual type system designers need more formal tools to help them conceptualize, structure, and evaluate their designs. Based on an understanding of gradual types as abstractions of static types, we have developed systematic foundations for gradual typing based on abstract interpretation, called Abstracting Gradual Typing (AGT for short). Gradual languages designed with the AGT approach satisfy, by construction, the established criteria for gradually-typed languages. In this talk, I will give a brief introduction to AGT in a simply-typed setting, and will then discuss recent work on a gradual counterpart of System F that enforces relational parametricity, even in the presence of imprecise types.

14:00 – 15:00 Flavien Breuvart (Université Paris 13) - Invited talk: Graded Types Parametricity: Principles and Application to Abstract Interpretation

  In this talk, we will briefly present the yet-to-start project CoGITARe, which aims at further developing the expressivity and inference of graded types in order to use them to perform abstract interpretation.
  In a second part, we will give a short history of graded type systems and their expressivity. Then, we will see that this expressivity is limited by a lack of dependency/parametricity. We will thus explore two ongoing research directions that add two very different kinds of parametricity to graded types.
  (The provocative title should only be taken as a joke. Despite their apparent similarities, this work and Éric Tanter's have vastly different objectives and involve different issues and techniques. In particular, notice that gradual and graded types are fundamentally different.)

15:30 – 16:30 Shane Mansfield (Univ. of Oxford) - Invited talk: An overview of empirical models

  Empirical models are a way of formalising data that arises in physical experiments. They were first proposed by Abramsky and Brandenburger as part of a framework to analyse fundamentally non-classical phenomena in quantum systems. Conveniently from a computer science perspective, they abstract away from the mathematical baggage of quantum theory and instead allow the key phenomena to be characterised purely as features of empirical data. After introducing the basic framework I will discuss some more recent results and developments, drawing on joint work with a number of collaborators. In particular: quantum computations can simply be modelled as classical computations with the additional ability to interact with a resource empirical model; quantitative measures of non-classicality can be shown to relate directly to some basic quantum-over-classical computational advantages; and the beginnings of a category-theoretic approach to reasoning about empirical models have emerged.

18:00 – 19:00 Xavier Leroy - Lecture: Leçon inaugurale au Collège de France

  This is not part of the CHoCoLa day, but you might be interested in attending! See https://www.college-de-france.fr/site/xavier-leroy/inaugural-lecture-2018-11-15-18h00.htm
  The meeting will take place in Bâtiment Sophie Germain.
  We will first gather on the 3rd floor of the building for breakfast.
  We will then move to Amphi Turing for the morning talk (11:00-12:30).
  We will have lunch in room 3052.
  In the afternoon (14:00-16:30), we will be in room 1016.

Preuves, programmes et systèmes
Vendredi 9 novembre 2018, 9 heures 30, Salle 3052
Pôle Preuves, Programmes Et Systèmes Journées 2018

Preuves, programmes et systèmes
Jeudi 8 novembre 2018, 9 heures 30, Salle 3052
Pôle Preuves, Programmes Et Systèmes Journées 2018

Preuves, programmes et systèmes
Jeudi 25 octobre 2018, 10 heures 30, Salle 3052
Carlo Spaccasassi (Microsoft Research Cambridge, Cambridge (United Kingdom)) Type-Based Analysis for Session Inference

We propose a type-based analysis to infer the session protocols of channels in an ML-like concurrent functional language. Combining and extending well-known techniques, we develop a type-checking system that separates the underlying ML type system from the typing of sessions. Without using linearity, our system guarantees communication safety and partial lock freedom. It also supports provably complete session inference with no programmer annotations. We exhibit the usefulness of our system with interesting examples, including one which is not typable in substructural type systems.

Preuves, programmes et systèmes
Jeudi 4 octobre 2018, 10 heures 30, Salle 3052
Adrien Guatto (IRIF) Towards A General Guarded Lambda-Calculus

Guarded recursion has emerged as a natural paradigm for programming with infinite data structures in type theory and high-assurance functional languages. In the first part of this talk, I will present some intuitions behind guarded recursion, using programming examples. In a second part, I will discuss ongoing work on a typed lambda-calculus equipped with rich facilities for defining and manipulating guarded recursive types.
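
As a first intuition (this is not Guatto's calculus, where guardedness is tracked by a “later” modality in the types), one can mimic the guard with laziness in OCaml: a recursive stream definition is productive when every recursive call sits under the guard.

  (* Streams whose tail is guarded: the recursive occurrence sits under Lazy.t,
     which informally plays the role of the "later" modality. *)
  type 'a stream = Cons of 'a * 'a stream Lazy.t

  let rec nats (n : int) : int stream = Cons (n, lazy (nats (n + 1)))

  (* Consuming a finite prefix forces only finitely many guards. *)
  let rec take k (Cons (x, tail)) =
    if k = 0 then [] else x :: take (k - 1) (Lazy.force tail)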

Preuves, programmes et systèmes
Jeudi 27 septembre 2018, 10 heures 30, Salle 3052
Thomas Streicher (TU Darmstadt) Simplicial sets inside cubical sets

(joint work with J. Weinberger)

Abstract: Voevodsky's model of the Univalence Axiom in simplicial sets is not constructive, as shown by Coquand et al. To avoid this problem, Cohen, Coquand, Huber and Mörtberg have constructed in Cubical Sets a model of a Cubical Type Theory which has computational meaning.

We show that simplicial sets form a subtopos of cubical sets, which allows one to constructively reason in the latter about the former.

Preuves, programmes et systèmes
Jeudi 5 juillet 2018, 10 heures 45, Salle 3052
Jeremy Dubut (National Institute of Informatics Tokyo, Japan) Higher-Dimensional Automata

I will give a new definition of partial Higher Dimensional Automata – a geometric model for true concurrency – using lax functors. This definition is simpler than the original, and more natural from a categorical point of view, but also matches more clearly the intuition that pHDA are Higher Dimensional Automata with some missing faces. I will then focus on trees. Originally, for example in transition systems, trees are defined as those systems that have a unique path property. To understand what kind of unique path property is needed in pHDA, I will start by looking at trees as colimits of paths. From this, I will deduce that trees are exactly the pHDA with the unique path property modulo a notion of homotopy, and without any shortcuts. This property will allow me to prove two interesting characterisations of trees: trees are exactly those pHDA that are the unfolding of another pHDA; trees are exactly the cofibrant objects, much as in the language of Quillen's model structures. In particular, this last characterisation gives the premises of a new understanding of concurrency theory using homotopy theory.

Preuves, programmes et systèmes
Jeudi 28 juin 2018, 10 heures 30, Salle 3052
Olivier Hermant (CRI, Mines ParisTech) Intersection Types in Deduction Modulo Theory

In a 2012 paper, Richard Statman exhibited an inference system, based on second order monadic logic and non-terminating rewrite rules, that exactly types all strongly normalizable lambda-terms. We show that this system can be simplified to first-order minimal logic with rewrite rules, along the Deduction Modulo Theory lines.

We show that our rewrite system is terminating and that the conversion rule respects weak versions of invertibility of the arrow and of quantifiers. This requires additional care, in particular in the treatment of the latter. Then we study proof reduction, and show that every typable proof term is strongly normalizable and vice-versa.

Preuves, programmes et systèmes
Jeudi 31 mai 2018, 10 heures 30, Salle 3052
Olivier Laurent (ENS Lyon) A stroll between classical and intuitionistic linear logics

We will discuss various ways of syntactically relating classical linear logic (LL) and its intuitionistic version (ILL): on the one hand through the question of the conservativity of LL over ILL (counterexamples and sufficient conditions), and on the other hand via the study of double-negation translations of LL into ILL. We will show that these translations make it possible to represent extensions of LL within ILL, but also that they can be used as a tool to prove properties of LL, such as cut elimination and focusing. These results have been formalized in Coq using the Yalla library, which we will briefly present.

Preuves, programmes et systèmes
Jeudi 29 mars 2018, 14 heures, Salle 3052
Valentin Blot (LRI) Realizability: denotational semantics for correctness

There exists a large spectrum of techniques for proving correctness of computer programs. At one end is the Curry-Howard isomorphism, in which a typing procedure automatically checks the correctness of a program, and at the other end are techniques à la Hoare, in which non-trivial reasoning is performed in a logic adapted to the programming language. Realizability can be seen as a mixture of these two approaches: proving that a program realizes a specification is done via a combination of automatic techniques in a Curry-Howard fashion, and manual proofs of correctness of subprograms with respect to some axioms. In my talk I will present some realizability models where the realizability relation is defined semantically: a program realizes a specification if its denotation satisfies a property in the model. The denotational models involved include game semantics, continuation models and Scott domains.

Preuves, programmes et systèmes
Jeudi 29 mars 2018, 10 heures 30, Salle 3052
Guilhem Jaber (LIP) Model-checking contextual equivalence of higher-order programs with references.

This talk will present SyTeCi, the first general automated tool to check contextual equivalence for programs written in a typed higher-order language with references (i.e. local mutable states), corresponding to a fragment of OCaml.

After introducing the notion of contextual equivalence, we will see on some examples why it is hard to prove such equivalences (reentrant calls, private states). As we will see, such examples can be found in many different programming languages.
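
A classic instance of the difficulty is the “awkward example” of Pitts and Stark, here rendered in OCaml (illustrative, not necessarily one of the talk's benchmarks): the two functions below are contextually equivalent, but proving it requires reasoning about reentrant calls to the unknown argument f and about the fact that the hidden reference x, once set to 1, can never be reset by any context.

  let awkward : (unit -> unit) -> bool =
    let x = ref 0 in                  (* private state, invisible to contexts *)
    fun f -> x := 1; f (); !x = 1     (* f may call awkward again (reentrancy) *)

  let simple : (unit -> unit) -> bool =
    fun f -> f (); true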

Then, we will introduce SyTeCi, a tool to automatically check such equivalences. This tool is based on a reduction of the problem of contextual equivalence of two programs to the problem of reachability of “error states” in transition systems of memory configurations.

Contextual equivalence being undecidable (even in a finitary setting), the non-reachability problem for such transition systems is undecidable too. However, one can apply model-checking techniques (predicate abstraction, summarization of pushdown systems) to check non-reachability via some approximations. This allows us to prove automatically many non-trivial examples from the literature, which could only be proved by hand before. We will end this talk with the presentation of a prototype implementing this work.

Preuves, programmes et systèmes
Jeudi 22 février 2018, 10 heures 30, Salle 3052
Pierre-Marie Pédrot (Max Planck Institute for Software Systems, Saarbrücken, Germany) Failure is Not an Option, or The Curry-Howard-Shadok correspondence

No one is without knowing the famous Shadok principle « the more it fails, the more likely it will eventually succeed. » Taking inspiration from this unfathomable wisdom, we came up with a dependent type theory which allows failure, so that we could succeed more in proving things. Such a Shadok theory is justified by translating it away into vanilla dependent type theory, just as a mundane compiler would, where failing can be interpreted as a call-by-name exception mechanism. En passant, this gives the first full syntactical model of the Calculus of Inductive Constructions introducing effects.

Alas! The right to fail succeeds a tad too much, insofar as the resulting Shadok type theory is logically inconsistent. Not being impressed in any way, we put order into this madness by requiring that no exception should ever reach toplevel, thanks to a clever use of a variant of Bernardy-Lasson syntactic parametricity. While the former model can be thought of as Friedman A-translation applied to CIC, the latter is no more than a principled variant of Kreisel's modified realizability that scales to dependent types. In particular, it readily gives a model of CIC that still has canonicity, strong normalization and decidable type-checking, while featuring new principles typical of modified realizability such as the independence of premises and unprovability of Markov's principle.

https://www.pédrot.fr/articles/exceptional.pdf

Preuves, programmes et systèmes
Jeudi 15 février 2018, 10 heures 30, Salle 3052
Prakash Panangaden (McGill University) Topology, Order and Causal Structure

The causal structure of spacetime defines a (pair of) natural order structures on the underlying set of events. Much of the analysis of causal structure involves a delicate interplay between order, topology and geometry. In view of the fundamental role of the causal order in certain approaches to quantum gravity as well as its fundamental role in concurrency theory one can ask whether the topology can be derived from pure order theoretic considerations.

In a remarkable example of serendipity, order theory has been developed by computer scientists and mathematicians in order to capture computability concepts. Dana Scott developed a notion of a continuous lattice or continuous poset with a view to capturing computability as continuity with a suitable topology that has come to be known as the Scott topology. This subject has acquired the name of “domain theory”.

We applied domain theory to the problem of reconstructing the spacetime topology from the order and came up with a number of results about reconstruction of spacetime structure from just a countable dense set.

We prove that a globally hyperbolic spacetime with its causality relation is a bicontinuous poset whose interval topology is the manifold topology. From this one can show that from only a countable dense set of events and the causality relation, it is possible to reconstruct a globally hyperbolic spacetime in a purely order theoretic manner. The ultimate reason for this is that globally hyperbolic spacetimes belong to a category that is equivalent to a special category of domains called interval domains.

This was joint work with Keye Martin of Naval Research Laboratories.

Preuves, programmes et systèmes
Jeudi 8 février 2018, 10 heures 30, Salle 3052
Vincent Laporte (IMDEA Software) Provably secure compilation of side-channel countermeasures: the case of cryptographic “constant-time”

Software-based countermeasures provide effective mitigation against side-channel attacks, often with minimal efficiency and deployment overheads. Their effectiveness is often amenable to rigorous analysis: specifically, several popular countermeasures can be formalized as information flow policies, and correct implementation of the countermeasures can be verified with state-of-the-art analysis and verification techniques. However, in absence of further justification, the guarantees only hold for the language (source, target, or intermediate representation) on which the analysis is performed.

We consider the problem of preserving side-channel countermeasures by compilation for cryptographic “constant-time”, a popular countermeasure against cache-based timing attacks. We present a general method, based on the notion of 2-simulation, for proving that a compilation pass preserves the constant-time countermeasure. Using the Coq proof assistant, we verify the correctness of our method and of several representative instantiations.

Joint session Verification/PPS

Preuves, programmes et systèmes
Jeudi 8 février 2018, 10 heures 30, Salle 3052
Séminaire Chocola (ENS Lyon) February Chocola meeting: Prakash Panangaden, Justin Hsu & Thomas Ehrhard

http://chocola.ens-lyon.fr/events/meeting-2018-02-08

10:30 – 12:00

  Prakash Panangaden (McGill University, Canada)
  Quantitative Equational Logic 

14:00 – 15:00

  Justin Hsu (University College London, UK)
  From Couplings to Probabilistic Relational Program Logics 

15:30 – 16:30

  Thomas Ehrhard (IRIF, Univ. Paris Diderot)
  Stable and measurable functions on positive cones: a model of probabilistic functional languages with continuous types

Preuves, programmes et systèmes
Jeudi 25 janvier 2018, 10 heures 30, Salle 3052
Justin Hsu (University College London) From Couplings to Probabilistic Relational Program Logics

Many program properties are relational, comparing the behavior of a program (or even two different programs) on two different inputs. While researchers have developed various techniques for verifying such properties for standard, deterministic programs, relational properties for probabilistic programs have been more challenging. In this talk, I will survey recent developments targeting a range of probabilistic relational properties, with motivations from privacy, cryptography, machine learning. The key idea is to meld relational program logics with an idea from probability theory, called a probabilistic coupling. The logics allow a highly compositional and surprisingly general style of analysis, supporting clean proofs for a broad array of probabilistic relational properties.
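
For reference, the probabilistic notion at the heart of this approach is elementary to state (standard definition, here in the discrete case): a coupling of two distributions $\mu_1$ on a set $A$ and $\mu_2$ on a set $B$ is a distribution $\mu$ on $A \times B$ whose marginals are $\mu_1$ and $\mu_2$, i.e.

  $\sum_{b \in B} \mu(a,b) = \mu_1(a)$ for all $a$, $\quad$ and $\quad \sum_{a \in A} \mu(a,b) = \mu_2(b)$ for all $b$.

The relational program logics mentioned above construct such couplings between the output distributions of two program runs compositionally, following the structure of the programs.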

Bio: Justin Hsu is a post-doctoral researcher at University College London. He obtained his graduate degree from the University of Pennsylvania. His research interests span formal verification and theoretical computer science, including verification of randomized algorithms, differential privacy, and game theory.


Year 2017

Preuves, programmes et systèmes
Jeudi 14 décembre 2017, 10 heures 30, ENS Lyon
Séminaire Chocola ENS Lyon

Preuves, programmes et systèmes
Jeudi 14 décembre 2017, 10 heures 30, Salle 3052
Juliusz Chroboczek (IRIF, Université Paris Diderot) Homenet, the IETF, and the standardization process

For several years I have been taking part in the Homenet working group of the IETF, the body that defines the standards governing the Internet. In this talk, through the example of Homenet and my adventures at the IETF, I will try to give an idea of what the standardization process is and why it matters for computer science and for the Internet.

This talk is an extended and slightly censored version of the talk I gave at the Journées PPS on 12 October 2017.

Preuves, programmes et systèmes
Jeudi 7 décembre 2017, 10 heures 30, Salle 3052
Charles Grellois (Université Aix-Marseille) Linearity in Higher-Order Recursion Schemes

In higher-order model-checking (HOMC), functional programs are represented as higher-order recursion schemes (HORS), a kind of grammar with parameters which can be functions. These grammars generate infinite trees, over which we want to check formulas of monadic second-order logic (MSO). This problem is decidable, as proved first by Ong in 2006, and then by many others including Kobayashi and Ong (2009), whose approach uses intersection types. It turns out that linear logic is a powerful and enlightening tool to reason about HOMC, as shown by Grellois and Melliès in the last years.

In general, the complexity of the HOMC problem is n-EXPTIME, where n is the order of the HORS of interest. In this talk, we explain that we can refine this complexity measure: motivated by linear logic, we introduce *linear* HORS and a linear-nonlinear model of automata checking fragments of MSO. We then show that HOMC is in fact n-EXPTIME complete for n the *linear* order of the linear HORS generating the tree of interest. We believe that this explains why HOMC, in spite of its huge theoretical complexity, has been successfully used in practice by Kobayashi's team.

This linear framework allows us to reprove, in a much simpler way, three existing results extending the usual HOMC problem, and notably to deal with call-by-value programs in a smooth way (the usual HOMC approach considers call-by-name).

This is joint work with Pierre Clairambault and Andrzej Murawski.

Preuves, programmes et systèmes
Jeudi 23 novembre 2017, 10 heures 30, Salle 3052
Simon Castellan (Imperial College) The parallel intensionally fully abstract model for PCF

In this talk, we introduce a new game semantics framework for concurrency based on event structures, extending the work of Rideau and Winskel. In this framework, we can extend the notions of innocence and well-bracketing to the concurrent (and non-deterministic) case, generalizing the so-called “Abramsky cube”.

This talk focuses on the deterministic case. I will first introduce the concurrent strategies and their composition, in the existing linear setting. I will then present our extension to nonlinearity using copy indices and symmetry to represent uniformity. I will then present our notions of concurrent innocence & well-bracketing, to finish on our result of intensional full abstraction for PCF. Time permitting, I will discuss extensions of this result to non-angelic nondeterminism and probabilities.

Preuves, programmes et systèmes
Jeudi 9 novembre 2017, 10 heures 30, ENS Lyon
Séminaire Chocola ENS Lyon

Preuves, programmes et systèmes
Jeudi 19 octobre 2017, 10 heures 30, Salle 3052
Martin Hyland (DPMMS, University of Cambridge) Understanding Computation in Game Semantics

Preuves, programmes et systèmes
Jeudi 15 juin 2017, 10 heures 30, Salle 3052
Samuele Giraudo (Paris-Est Marne-la-Vallée) Generalized splitting of associativity

Dendriform algebras are algebraic structures introduced by Loday. They provide a way to split an associative product into two products that are not necessarily associative. The study of these algebras, carried out from the point of view offered by operad theory, brings out the combinatorics of binary trees and of tree shuffles. We define here an operad depending on an integer parameter that generalizes the diassociative operad. By Koszul duality, we obtain generalizations of the dendriform operad. Algebras over these operads make it possible to split an associative product into several parts, subject to certain compatibility relations. The combinatorial and algebraic properties of these structures are reviewed.

Preuves, programmes et systèmes
Mardi 9 mai 2017, 14 heures, Salle 3052
Valentin Blot (Queen Mary University of London) An interpretation of system F through bar recursion

There are two possible computational interpretations of second-order arithmetic: Girard's system F or Spector's bar recursion and its variants. While the logic is the same, the programs obtained from these two interpretations have a fundamentally different computational behavior and their relationship is not well understood. We make a step towards a comparison by defining the first translation of system F into a simply-typed total language with a variant of bar recursion. This translation relies on a realizability interpretation of second-order arithmetic. Due to Gödel's incompleteness theorem there is no proof of termination of system F within second-order arithmetic. However, for each individual term of system F there is a proof in second-order arithmetic that it terminates, with its realizability interpretation providing a bound on the number of reduction steps to reach a normal form. Using this bound, we compute the normal form through primitive recursion. Moreover, since the normalization proof of system F proceeds by induction on typing derivations, the translation is compositional. The flexibility of our method opens the possibility of getting a more direct translation that will provide an alternative approach to the study of polymorphism, namely through bar recursion.

Preuves, programmes et systèmes
Jeudi 27 avril 2017, 10 heures 30, Salle 3052
Bérénice Delcroix-Oger (Institut de Mathématiques de Toulouse) Non-ambiguous trees

Tree-like tableaux are combinatorial objects that appear, in particular, in the study of certain models of statistical mechanics. This talk is about non-ambiguous trees, which are a special case of tree-like tableaux related to permutations whose excedances (w(i)>i) occur at the beginning of the word. In joint work with J.-C. Aval, A. Boussicault, F. Hivert and P. Laborde-Zubieta, we obtained enumerative and bijective results on these objects, which I will present here after introducing the setting of this study.

Preuves, programmes et systèmes
Jeudi 27 avril 2017, 14 heures, Salle 3052
Jeremy Siek (Indiana University) The state of the art in gradual typing

Gradual typing is an approach for designing programming languages that integrate static and dynamic type checking. Gradual typing gives the programmer fine-grained control over which regions of a program are statically checked and which regions are dynamically checked. Over the last decade, there has been renewed interest in such an integration partly due to the rise in popularity of dynamic languages. But as small programs grow into large programs, so does the need for early error detection and modularity, which is provided by static type checking. Gradual typing provides a practical migration path for dynamically typed programs to become more statically typed. This talk will give a glimpse at the state of the art in gradual typing. It will describe a) the major challenges in the design and implementation of gradually typed languages, b) the research that has addressed many of these challenges, and c) the open problems that need to be solved. The research in gradual typing spans both theoretical and practical questions, from mechanized metatheory to efficient compilation.

Jeremy Siek is an Associate Professor at Indiana University Bloomington. Jeremy's areas of research include programming language design, type systems, mechanized theorem proving using proof assistants, and optimizing compilers. Jeremy's Ph.D. thesis explored foundations for constrained templates, aka the “concepts” proposal for C++. Prior to that, Jeremy developed the Boost Graph Library, a C++ generic library for graph algorithms and data structures. Jeremy post-doc'd at Rice University where he developed the idea of gradual typing: a type system that integrates both dynamic and static typing in the same programming language. Jeremy is currently working on a gradually-typed version of Python. Jeremy received the NSF CAREER award to fund his project: “Bridging the Gap Between Prototyping and Production”. Jeremy was also twice awarded a Distinguished Visiting Fellowship from the Scottish Informatics & Computer Science Alliance.

Preuves, programmes et systèmes
Jeudi 20 avril 2017, 10 heures 30, Salle 3052
Jamie Vicary (Univ. of Oxford) Formalizing Compositional Proofs

I will present a new proof assistant, Globular, which allows the formalization of compositional proofs in higher category theory. The interface is geometrical, with proofs depicted as string diagrams, and allowing direct click-and-drag manipulation of proofs in a simple and intuitive way, up to the level of 4-categories. I will show how to build a simple proof from scratch, and present some sophisticated formalized proofs from topology and algebra. I will also give some details of the theoretical basis, which gives a new and simple way to understand higher category theory. This talk will be accessible, and will not require any previous knowledge.

Web site: https://ncatlab.org/nlab/show/Globular Paper: https://arxiv.org/abs/1610.06908

Preuves, programmes et systèmes
Jeudi 30 mars 2017, 10 heures, Salle 3052
Giovanni Bernardi (IRIF) Un, personne et cent mille: a meta theory for testing equivalences?

Testing theory focuses on contextual equivalences that were proposed in the 80s as an alternative to bisimulation equivalence. During the last decade testing equivalences proved useful in constructing semantic models of session types and in laying the foundations of web-service technologies. As a result, testing theory is more useful and richer than ever. In this seminar I will recall the chief ideas behind testing theory, and argue that we lack a general methodology to reason about testing equivalences. I will also outline the evidence that a meta-theory may exist, and some open questions.

Preuves, programmes et systèmes
Jeudi 30 mars 2017, 11 heures 15, Salle 3052
Daniela Petrisan (IRIF) Hybrid set-vector automata from a category-theoretic perspective

The main purpose of this talk is to introduce a new automata model, hybrid set-vector automata, that naturally embed deterministic finite state automata and finite automata weighted over a field.

We take a category-theoretic approach, which provides a neat understanding of minimisation. It is well known that category theory offers a unifying view of some automata theory results. For example, minimisation of deterministic automata (over finite words) and of Schützenberger's automata weighted over fields arise from the same categorical reasons.

In the first part of the talk, I will discuss how to model and minimise automata in categories. Traditionally, automata are seen either as algebras for a functor plus a final map, possibly in a monoidal category, or as coalgebras for a functor plus an initial map. We propose yet another view of automata as functors from an input category to an output category.
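
As a rough sketch of this functorial view (my notation; the talk's definitions may differ): for word automata one can take an input category $\mathcal{I}$ with three objects and freely generated arrows
\[
\mathrm{in} \xrightarrow{\ \triangleright\ } \mathrm{st},
\qquad
\mathrm{st} \xrightarrow{\ a\ } \mathrm{st} \ (a \in \Sigma),
\qquad
\mathrm{st} \xrightarrow{\ \triangleleft\ } \mathrm{out},
\]
so that an automaton is a functor $\mathcal{A} : \mathcal{I} \to \mathcal{C}$ and the behaviour on a word $w = a_1 \dots a_n$ is the composite $\mathcal{A}(\triangleleft) \circ \mathcal{A}(a_n) \circ \dots \circ \mathcal{A}(a_1) \circ \mathcal{A}(\triangleright)$. Taking $\mathcal{C} = \mathrm{Set}$ (with suitable values on in and out) recovers deterministic automata, and $\mathcal{C} = \mathrm{Vec}$ recovers automata weighted over a field.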

The new hybrid set-vector automata can be modelled by taking the output category to be a free-colimit completion of the category of finite-dimensional vector spaces under a certain class of colimits.

This is joint work with Thomas Colcombet.

Preuves, programmes et systèmes
Jeudi 23 mars 2017, 10 heures 30, Salle 3052
Thomas Leventis (Institut de Mathématiques de Marseille) Full Abstraction of the Probabilistic Böhm Trees

A very natural notion of equivalence of programs is observational equivalence: two programs are equivalent if they have the same behaviour in any environment. More specifically, in the lambda-calculus two terms M and N are equivalent if for every context C, the terms C[M] and C[N] are either both solvable or both unsolvable. Respecting this equivalence is an important property of some models of the lambda-calculus, called full abstraction. It is well known that the model of infinitely extensional Böhm trees is fully abstract. The question we are interested in is what becomes of these notions when we introduce probabilistic choice in the calculus.

Solvability has a natural extension, which is the convergence probability, hence two terms are observationally equivalent if they have the same convergence probability under any context. There is also a simple way to define probabilistic Böhm trees, as we can associate to any term a subprobability distribution over head normal forms. But it is not obvious that these generalizations of the deterministic notions are still equivalent. In this talk we will show how to prove that probabilistic Böhm trees actually form a model, and how to get a separability result to ensure this model is fully abstract.
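
In symbols, writing $\Pr[\,P\Downarrow\,]$ for the probability that $P$ converges (reduces to a head normal form), the observational equivalence considered here is
\[
M \simeq N \iff \forall C.\ \Pr[\,C[M]\Downarrow\,] = \Pr[\,C[N]\Downarrow\,].
\]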

Preuves, programmes et systèmes
Jeudi 16 mars 2017, 10 heures 30, Salle 3052
Charles Grellois (INRIA - Univ. Bologna (Italie)) Verifying properties of functional programs: from the deterministic to the probabilistic case

In functional programs, also called higher-order programs, functions may take functions themselves as arguments. As a result, most approaches to their model-checking rely on semantic or type-theoretic tools. In this talk, I will explain how an analysis, based on linear logic, of a 2009 model-checking result by Kobayashi and Ong led Melliès and me to the construction of a model for model-checking. This model is such that, when interpreting a term with recursion representing the tree of traces of a functional program, its denotation determines whether it satisfies an MSO property of interest. A related and similar model was obtained independently by Salvati and Walukiewicz. In the second part of the talk, I will discuss the verification of termination for functional programs with recursion and probabilistic choice. Dal Lago and I recently defined a type system such that typable programs terminate with probability 1. In other words, their set of diverging executions is negligible. If time allows, I will sketch ideas towards an extension of the model-checking results of the deterministic case to quantitative logics and functional programs with recursion.
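
To make the quantitative claim concrete, here is a standard example (not taken from the talk): a program that flips a fair coin, stops on heads and recurses on tails, diverges only along the single all-tails run, so it terminates with probability
\[
\sum_{n \ge 1} \left(\tfrac{1}{2}\right)^{n} = 1 .
\]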

Preuves, programmes et systèmes
Jeudi 23 février 2017, 10 heures 30, Salle 3052
Marcelo Fiore (University of Cambridge) An Algebraic Combinatorial Approach to the Abstract Syntax of Opetopic Structures

The starting point of the talk will be the identification of structure common to tree-like combinatorial objects, exemplifying the situation with abstract syntax trees (as used in formal languages) and with opetopes (as used in higher-dimensional algebra). The emerging mathematical structure will then be formalized in a categorical setting, unifying the algebraic aspects of the theory of abstract syntax of [2,3] and the theory of opetopes of [5]. This realization conceptually allows one to transport viewpoints between these now-bridged mathematical theories, and I will explore it here in the direction of higher-dimensional algebra, giving an algebraic combinatorial framework for a generalisation of the slice construction of [1] for generating opetopes. The technical work will involve setting up a microcosm principle for near-semirings and subsequently exploiting it in the cartesian closed bicategory of generalised species of structures of [4]. Connections to (cartesian and symmetric monoidal) equational theories, lambda calculus, and algebraic combinatorics will be mentioned in passing.

[1] J. Baez and J. Dolan. Higher-Dimensional Algebra III. n-Categories and the Algebra of Opetopes. Advances in Mathematics 135, pages 145-206, 1998.

[2] M. Fiore, G. Plotkin and D. Turi. Abstract syntax and variable binding. In 14th Logic in Computer Science Conf. (LICS'99), pages 193-202. IEEE Computer Society Press, 1999.

[3] M. Fiore. Second-order and dependently-sorted abstract syntax. In Logic in Computer Science Conf. (LICS'08), pages 57-68. IEEE Computer Society Press, 2008.

[4] M. Fiore, N. Gambino, M. Hyland, and G. Winskel. The cartesian closed bicategory of generalised species of structures. J. London Math. Soc., 77:203-220, 2008.

[5] S. Szawiel and M. Zawadowski. The web monoid and opetopic sets. arXiv:1011.2374 [math.CT], 2010.

Preuves, programmes et systèmes
Jeudi 26 janvier 2017, 10 heures 30, Salle 3052
Tarmo Uustalu (Tallinn University of Technology) Dynamic programming and coalgebras with sharing

Dynamic programming is about exploiting sharing opportunities in the recursive call tree of a function. If the pattern of sharing opportunities is known statically, a call dag, rather than a tree, can be built from the start.

In this talk, we present a generic framework for doing so and demonstrate it in action on two textbook examples of dynamic programming: Fibonacci and edit distance. We describe sharing-opportunity patterns with systems of equations between node addresses. A technique from term rewriting systems, namely Knuth-Bendix completion, gives us unique normal forms for node addresses. A dag is represented as a spanning tree whose nodes are defined by addresses in normal form. Mathematically, the function taking an input for our function of interest into a call tree is a comonad coalgebra, node addresses are coterms, and equations are coequations.

A Haskell implementation of this framework uses circular programming and a generic implementation of derivatives of functors.
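
As a small, self-contained illustration, and not the authors' framework: in Haskell, a lazily built circular list shares every Fibonacci value, turning the exponential call tree of the naive definition into a linear call dag.

-- Naive Fibonacci: the call tree recomputes the same subcalls many times.
fibNaive :: Int -> Integer
fibNaive n
  | n < 2     = fromIntegral n
  | otherwise = fibNaive (n - 1) + fibNaive (n - 2)

-- Circular-programming version: the list refers to itself, so each
-- value is computed once and shared by all later demands (a call dag).
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

fib :: Int -> Integer
fib n = fibs !! n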

We view this work as a showcase of how basic algorithmics and more advanced category theory can interact in a mutually beneficial way.

This is joint work with Nicolas Wu (University of Bristol).

Preuves, programmes et systèmes
Mardi 10 janvier 2017, 11 heures, Salle 3052
Camell Kachour (IRIF) Sur des modèles algébriques d'infini-n-catégories faibles cubiques


Année 2016

Preuves, programmes et systèmes
Jeudi 1 décembre 2016, 10 heures 30, Salle 3052
Julien Lange (Imperial College) Building Graphical Choreographies From Communicating Machines: Principles and Applications

Graphical choreographies, or global graphs, are general multiparty session specifications featuring expressive constructs such as forking, merging, and joining for representing application-level protocols. Global graphs can be directly translated into modelling notations such as BPMN and UML. In the first part of the talk, I will present an algorithm whereby a global graph can be constructed from asynchronous interactions represented by communicating finite-state machines (CFSMs), together with a sound characterisation of a subset of safe CFSMs from which global graphs can be constructed. In the second part, I will outline a few recent applications of this work to communicating timed automata and the Go programming language.
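
As a toy illustration of the kind of input such an algorithm consumes (my own sketch in Haskell, not the paper's notation): a communicating finite-state machine is a finite automaton whose transitions send or receive messages to and from named peers.

-- A CFSM as a plain transition table; each transition either sends
-- a message to a peer or receives one from a peer.
type Participant = String
type Msg = String
type State = Int

data Action = Send Participant Msg   -- this machine sends Msg to Participant
            | Recv Participant Msg   -- this machine receives Msg from Participant
  deriving (Eq, Show)

type CFSM = [(State, Action, State)]

-- Two machines realising a simple request/response exchange.
client, server :: CFSM
client = [(0, Send "server" "req", 1), (1, Recv "server" "resp", 0)]
server = [(0, Recv "client" "req", 1), (1, Send "client" "resp", 0)]

From a safe system of such machines, the construction described in the talk synthesises a single global graph describing the whole protocol.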

Preuves, programmes et systèmes
Jeudi 24 novembre 2016, 10 heures 30, Salle 3052
Thibaut Balabonski (LRI, Université Paris Sud) Optimisation de programmes C++ concurrents

Preuves, programmes et systèmes
Jeudi 17 novembre 2016, 10 heures 30, Salle 3052
Bruno Barras Exposé repoussé à début 2017

Preuves, programmes et systèmes
Jeudi 3 novembre 2016, 10 heures 30, Salle 3052
Guilhem Jaber (IRIF) Operational Nominal Game Semantics

We introduce a fully abstract intensional model for a polymorphic call-by-value language with higher-order references. As in game semantics, the denotation of a term is represented as a set of plays between the term and its environment. But rather than building it compositionally, by induction over the term, we generate it using a labelled transition system representing all the possible interactions between the term and any environment. Names, and more generally the theory of nominal sets, are crucially used to represent locations (i.e. memory addresses) and functional and polymorphic values. Freshness of such names then controls the observational power of environments. Thanks to the presence of references, the observational power of environments is strong enough to avoid the need to quotient the model to obtain full abstraction. This gives us new principles to reason about effectful and polymorphic programs. This work has been done in collaboration with Nikos Tzevelekos (QMUL).

Preuves, programmes et systèmes
Jeudi 6 octobre 2016, 10 heures 30, Salle 3052
Giulio Manzonetto (LIPN) New Results on Morris's Observational Theory — The Benefits of Separating the Inseparable

We study the theory of contextual equivalence in the untyped lambda-calculus, generated by taking the normal forms as observables. Introduced by Morris in 1968, this is the original extensional lambda-theory H+ of observational equivalence. On the syntactic side, we show that this lambda-theory validates the omega-rule, thus settling a long-standing open problem. On the semantic side, we provide necessary and sufficient conditions for relational graph models to be fully abstract for H+. We show that a relational graph model captures Morris's observational pre-order exactly when it is extensional and lambda-König. Intuitively, a model is lambda-König when every lambda-definable tree has an infinite path which is witnessed by some element of the model.
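
In symbols, the observational preorder at stake, with normal forms as the observable, can be written as
\[
M \sqsubseteq N \iff \forall C.\ \big(C[M] \text{ has a normal form} \implies C[N] \text{ has a normal form}\big),
\]
and H+ is the equivalence it induces.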

Preuves, programmes et systèmes
Jeudi 29 septembre 2016, 10 heures 30, Salle 3052
Vincent Danos (ENS) Bayesian inversion by ω-complete cone duality

The process of inverting Markov kernels relates to the important subject of Bayesian modelling and learning. In fact, Bayesian update is exactly kernel inversion. We investigate how and when Markov kernels (aka stochastic relations, or probabilistic mappings) can be inverted. We address the question both directly on the category of measurable spaces, and indirectly by interpreting kernels as Markov operators. For the direct option, we introduce a typed version of the category of Markov kernels and use the so-called ‘disintegration of measures’. Here, one has to specialise to measurable spaces borne from a simple class of topological spaces, e.g. Polish spaces (other choices are possible). Our method and result greatly simplify a recent development by Culbertson and Sturtz. For the operator option, we use a cone version of the category of Markov operators (kernels seen as predicate transformers). That is to say, our linear operators are not just continuous, but are required to satisfy the stronger condition of being ω-chain-continuous. Prior work shows that one obtains an adjunction in the form of a pair of contravariant and inverse functors between the categories of L1- and L∞-cones. Inversion, seen through the operator prism, is just adjunction. No topological assumption is needed. We show that both categories (Markov kernels and ω-chain-continuous Markov operators) are related by a family of contravariant functors Tp for 1 ≤ p ≤ ∞. The Tp's are Kleisli extensions of (duals of) conditional expectation functors introduced before. We prove that both notions of inversion agree when both are defined: if f is a kernel, and f† its direct inverse, then T∞(f)† = T1(f†). This is joint work with Fredrik Dahlqvist (UCL), Ohad Kammar (Oxford), and Ilias Garnier (ENS).
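
As a rough reminder of what kernel inversion amounts to (my notation, not necessarily the talk's): given a prior $\mu$ on $X$ and a kernel $f$ from $X$ to $Y$, a Bayesian inverse $f^{\dagger}$ is a kernel from $Y$ to $X$ such that
\[
\int_{A} f(x, B)\, \mu(\mathrm{d}x) \;=\; \int_{B} f^{\dagger}(y, A)\, (\mu f)(\mathrm{d}y)
\quad \text{for all measurable } A \subseteq X,\ B \subseteq Y,
\]
where $\mu f$ is the pushforward of $\mu$ along $f$; Bayesian update is precisely the passage from $f$ to $f^{\dagger}$.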

Preuves, programmes et systèmes
Jeudi 2 juin 2016, 10 heures 30, Salle 3052
Ugo Dal Lago (Univ. Bologna) Infinitary Lambda Calculi from a Linear Perspective

We introduce a linear infinitary lambda-calculus in which two exponential modalities are available, the first one being the usual, finitary one, the other being the only construct interpreted coinductively. The obtained calculus embeds the infinitary applicative lambda-calculus and is universal for computations over infinite strings. What is particularly interesting about the calculus is that the refinement induced by linear logic allows one to restrict both modalities so as to get calculi which are terminating inductively and productive coinductively. We exemplify this idea by isolating a fragment of the calculus around the principles of SLL and 4LL. Interestingly, it enjoys confluence, contrary to what happens in ordinary infinitary lambda-calculi.

Preuves, programmes et systèmes
Jeudi 21 avril 2016, 10 heures 30, Salle 3052
Silvia Ghilezan (Université de Novi Sad) Preciseness of Subtyping on Intersection and Union Types

The notion of subtyping has gained an important role both in theoretical and applicative domains: in lambda and concurrent calculi as well as in programming languages. The soundness and the completeness, together referred to as the preciseness of subtyping, can be considered from two different points of view: denotational and operational. The former preciseness is based on the denotation of a type, which is a mathematical object describing the meaning of the type in accordance with the denotations of other expressions from the language. The latter preciseness has recently been developed with respect to type safety, i.e. the safe replacement of a term of a smaller type when a term of a bigger type is expected.

We propose a technique for formalising and proving operational preciseness of the subtyping relation in the setting of a concurrent lambda calculus with intersection and union types. The key feature is the link between typings and the operational semantics. We then prove soundness and completeness, obtaining that the subtyping relation of this calculus enjoys both denotational and operational preciseness.
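
Schematically, and only paraphrasing the notions above in symbols of my own:
\[
\text{soundness:}\quad \sigma \le \tau \;\Rightarrow\; \mathrm{safe}(\sigma, \tau),
\qquad
\text{completeness:}\quad \sigma \not\le \tau \;\Rightarrow\; \neg\,\mathrm{safe}(\sigma, \tau),
\]
where $\mathrm{safe}(\sigma, \tau)$ means that replacing, in any well-typed program, a subterm expected at type $\tau$ by a term of type $\sigma$ never produces a runtime error.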

This is joint work with Mariangiola Dezani-Ciancaglini.

Preuves, programmes et systèmes
Mercredi 30 mars 2016, 10 heures 30, Salle 3052
Guillaume Munch-Maccagnoni (Cambridge Computer lab) Enriched-adjunction models and polarisation for modelling effects and resources

(joint work with Marcelo Fiore and Pierre-Louis Curien)

I will present “linear call-by-push-value” enriched-adjunction models refining call-by-push-value models of effects and linear/non-linear models of linear logic, with higher-order functions, sums, and resource modalities, together with a theorem lifting linear models into cartesian ones. I will also present computational interpretations of these models as intuitionistic (linear) logics (LJ/ILL) where the order of evaluation matters, in the form of polarised calculi that satisfy usual properties of Barendregt-style lambda-calculus, and that have sound and coherent interpretations in the previous models. This suggests an approach to modelling proofs and programs with direct models and calculi.
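
For orientation, a linear/non-linear model in the sense of Benton, which these enriched-adjunction models refine, is a symmetric monoidal adjunction
\[
F : \mathcal{C} \rightleftarrows \mathcal{L} : G, \qquad F \dashv G,
\]
between a cartesian (closed) category $\mathcal{C}$ and a symmetric monoidal closed category $\mathcal{L}$, with the exponential of linear logic recovered as the comonad ${!} = F \circ G$ on $\mathcal{L}$.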

Preuves, programmes et systèmes
Jeudi 24 mars 2016, 10 heures 30, Salle 3052
Pawel Sobocinski (University of Southampton) Non encore annoncé.

Preuves, programmes et systèmes
Jeudi 10 mars 2016, 10 heures 30, Salle du Conseil
Thomas Seiller (Department of Computer Science, University of Copenhagen - DIKU) Complexity Constraints as Group Actions

The purpose of this talk is to explain a new approach to complexity theory based on (dynamic) semantics of linear logic, whose aim is to enable techniques and invariants from ergodic theory (e.g. l^2-Betti numbers of a countable Borel equivalence relation) to be used in computational complexity.

The origins of the techniques can be traced to Girard's “geometry of interaction” (GoI) program using von Neumann algebras and the recent GoI-inspired results in complexity. However, this approach reaches its full strength when using the more combinatorial setting of Interaction Graphs models of (fragments of) linear logic. Using techniques akin to game semantics (with a bit of measure theory), we are able to characterise (predicate) complexity classes as the sets of interpretations of programs/proofs of type Pred[m] := Nat ⇒ Bool. These models are parametrised by a group of measure-preserving maps m (equivalently, by a measurable group action) and provide the first sketches of a conjectured correspondence between measurable group actions and complexity constraints.

Preuves, programmes et systèmes
Jeudi 14 janvier 2016, 10 heures 30, Salle 3052
Amar Hadzihasanovic (Oxford University) String diagrams and the algebra of entanglement

The GHZ and W states are two entangled quantum states of three qubits, which are inequivalent in the sense that one cannot be turned into the other by local (single-qubit) operations; this is reflected in their different communicational properties and use in cryptographic protocols. A few years ago, Coecke and Kissinger showed that one can associate, to the two states, two Frobenius algebras in the category of Hilbert spaces, a type of algebra with a well-understood string-diagram representation, which could hopefully provide a bridge between algebraic, computational and topological aspects of quantum entanglement. We present a complete graphical axiomatisation of the relations between the GHZ and W states/algebras: the ZW calculus. This calculus refines the pre-existing ZX calculus while keeping its most desirable characteristics, such as the undirectedness of diagrams; it comes with an explicit normalisation procedure, provides an original decomposition of the category of qubits with a prominent “fermionic” fragment, and hints at a topological explanation of its components and axioms.
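
For concreteness, the two states in question are (up to normalisation conventions):
\[
|\mathrm{GHZ}\rangle = \tfrac{1}{\sqrt{2}}\,(|000\rangle + |111\rangle),
\qquad
|W\rangle = \tfrac{1}{\sqrt{3}}\,(|001\rangle + |010\rangle + |100\rangle),
\]
and, as recalled above, neither can be converted into the other by local (single-qubit) operations.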