### Book

(2015) Qu’est-ce que la Mécanique Quantique ?, Paris: Vrin, Coll. "Chemins Philosophiques", 128 p. [at Vrin].

What is Quantum Mechanics? (in French)

Abstract: Quantum mechanics is a contemporary physical theory renowned for its challenges to common sense and for its paradoxes. For nearly a century, physicists and philosophers have proposed several interpretations of the theory, offering radically different quantum images of the world, or ontologies. The existence of fundamental randomness, or of a multitude of worlds beyond our own, thus depends on the interpretation adopted. After discussing what it means to interpret a physical theory, this book presents three main, empirically equivalent quantum interpretations: the so-called orthodox interpretation, Bohm's interpretation, and the many-worlds interpretation. Texts by Albert & Galchen, as well as by Mermin, introduce the concept of non-locality and invite an analysis of the Einstein-Podolsky-Rosen argument and of Bell's theorem.

Reviewed in l'Oeil de Minerve (in French).

### Edited volume

(2017) Scientific Collaboration and Collective Knowledge (edited with Conor Mayo-Wilson and Michael Weisberg), New York: Oxford University Press. [pdf][at OUP]

Abstract: It is not unusual in contemporary science for hundreds of researchers to join efforts on a single project to obtain a single result. Despite the growing prevalence and importance of such collaboration, there is relatively little philosophical work analyzing collaborative research in the sciences. This book aims to fill that gap, gathering novel contributions on the matter from internationally recognized philosophers. Conceptual and normative questions are addressed, sometimes with formal methods.

Reviewed in Notre Dame Philosophical Reviews, Philosophy of Science.

### Journal articles

#### Articles in international peer-reviewed journals

(forthcoming) "Comment: The Precautionary Principle and Judgment Aggregation", Risk Analysis.

Abstract: In a recent paper in this Journal, Stefánsson proves some impossibility results for the Precautionary Principle. I challenge the scope of these results with reasons coming from judgment aggregation theory, a research field which studies how a group of individuals should consistently aggregate their individual opinions on interrelated judgments.

(forthcoming) "Improving deliberations by reducing misrepresentation effects" (with Cyrille Imbert, Vincent Chevrier and Christine Bourjot), Episteme.

Abstract: Deliberative and decisional groups play crucial roles in most aspects of social life. But it is not obvious how to organize these groups, and various socio-cognitive mechanisms can spoil debates and decisions. In this paper, we focus on one such important mechanism: the misrepresentation of views, i.e. when agents express views that are aligned with those already expressed and that differ from their private opinions. We introduce a model to analyze the extent to which this behavioral pattern can warp deliberations and distort the decisions that are finally taken. We identify types of situations in which misrepresentation can have major effects and investigate how to reduce these effects by adopting appropriate deliberative procedures. We discuss the beneficial effects of (i) holding a sufficient number of rounds of expression of views; (ii) choosing an appropriate order of speech, typically a random one; (iii) rendering the deliberation dissenter-friendly; (iv) having agents express fine-grained views. These applicable procedures help improve deliberations because they dampen conformist behavior, give epistemic minorities more opportunities to be heard, and reduce the number of cases in which an inadequate consensus or majority develops.

(2019) "Scientific expertise and risk aggregation", Philosophy of Science 86(1): 124-144.

Abstract: When scientists are asked to give expert advice on risk-related questions, such as the authorization of medical drugs, deliberation often does not eliminate all disagreements. I propose to model these remaining discrepancies as differences in risk assessments and/or in risk acceptability thresholds. The normative question I consider, then, is how the individual expert views should best be aggregated. I discuss what "best" could mean, with an eye to some robustness considerations. I argue that the majority rule, which is currently often used in expert panels, has significant drawbacks.

(2017) "The Precautionary Principle has not been shown to be incoherent: A reply to Peterson", Risk Analysis, 37(11): 2039-2040. [doi]

Abstract: In this Journal, I have objected to Peterson's 2006 claim that the Precautionary Principle is an incoherent decision rule. Here I defend my objections against Peterson's recent replies, and I maintain that the Precautionary Principle has not been shown to be incoherent.

(2017) "Is the Precautionary Principle really incoherent?", Risk Analysis, 37(11): 2026-2034. [doi]

Abstract: The Precautionary Principle has been an increasingly important principle in international treaties since the 1980s. Through varying formulations, it states that when an activity can lead to a catastrophe for human health or the environment, measures should be taken to prevent it even if the cause-and-effect relationship is not fully established scientifically. The Precautionary Principle has been critically discussed from many sides. This paper concentrates on a theoretical argument by Peterson (2006) according to which the Precautionary Principle is incoherent with other desiderata of rational decision-making, and thus cannot be used as a decision rule that selects an action among several ones. I claim here that Peterson's argument fails to establish the incoherence of the Precautionary Principle, by attacking three of its premises. I argue (i) that Peterson's treatment of uncertainties lacks generality, (ii) that his Archimedean condition is problematic for incommensurability reasons, and (iii) that his explication of the Precautionary Principle is not adequate. This leads me to conjecture that the Precautionary Principle can again be envisaged as a coherent decision rule.

(2016) "Quantum-like models cannot account for the conjunction fallacy" (with Sébastien Duchêne and Éric Guerci), Theory and Decision 81(4): 479-510. [doi]

Abstract: Human agents sometimes judge that a conjunction of two terms is more probable than one of the terms alone, in contradiction with the rules of classical probability---this is the conjunction fallacy. One of the most discussed accounts of this fallacy is currently the quantum-like explanation, which relies on models exploiting the mathematics of quantum mechanics. The aim of this paper is to investigate the empirical adequacy of major quantum-like models which represent beliefs with quantum states. We first argue that they can be tested in three different ways, in a question order effect configuration which is different from the traditional conjunction fallacy experiment. We then carry out our proposed experiment, with varied methodologies from experimental economics. The experimental results we obtain are at odds with the predictions of the quantum-like models. This strongly suggests that this quantum-like account of the conjunction fallacy fails. Future possible research paths are discussed.

(2016) "Testing quantum-like models of judgment for question order effect" (with Sébastien Duchêne and Éric Guerci), Mathematical Social Sciences 80: 33-46. [doi]

Abstract: Lately, so-called "quantum" models, based on parts of the mathematics of quantum mechanics, have been developed in decision theory and cognitive sciences to account for seemingly irrational or paradoxical human judgments. We consider here some such quantum-like models that address question order effects, i.e. cases in which given answers depend on the order of presentation of the questions. Models of various dimensionalities could be used; can the simplest ones be empirically adequate? From the quantum law of reciprocity, we derive new empirical predictions that we call the Grand Reciprocity equations, that must be satisfied by several existing quantum-like models, in their non-degenerate versions. Using substantial existing data sets, we show that these non-degenerate versions fail the GR test in most cases, which means that, if quantum-like models of the kind considered here are to work, it can only be in their degenerate versions. However, we suggest that the route of degenerate models is not necessarily an easy one, and we argue for more research on the empirical adequacy of degenerate quantum-like models in general.

(2015) "Scientific collaboration: do two heads need to be more than twice better than one?" (with Cyrille Imbert), Philosophy of Science, 82(4): 667-688.

Abstract: Epistemic accounts of scientific collaboration usually assume that, one way or another, two heads really are more than twice better than one. We show that this hypothesis is unduly strong. We present a deliberately crude model with unfavorable hypotheses. We show that, even then, when the priority rule is applied, large differences in successfulness can emerge from small differences in efficiency, with sometimes increasing marginal returns. We emphasize that success is sensitive to the structure of competing communities. Our results suggest that purely epistemic explanations of the efficiency of collaborations are less plausible but have much more powerful socio-epistemic versions.

(2014) "Layers of models in computer simulations", International Studies in the Philosophy of Science, 28(4): 417-436. [doi]

Abstract: I discuss here the definition of computer simulations, and more specifically the views of Humphreys (2004), who considers that an object is simulated when a computer provides a solution to a computational model, which in turn represents the object of interest. I argue that Humphreys' concepts cannot successfully analyze a case of contemporary simulations in physics, which are more complex than the examples considered so far in the philosophical literature. I therefore propose to modify Humphreys' definition of a simulation. I allow for several successive layers of computational models, and I discuss the relations that exist between these models, the computer, and the object under study. A consequence of my proposal is to clarify the distinction between computational models and numerical methods, and to better understand the representational and the computational functions of models in simulations.

(2014) "Is a bird in the hand worth two in the bush? Or, whether scientists should publish their intermediate results", Synthese, 191(1): 17-35. [doi]   (2013 "Young Researcher" prize of the Society of Philosophy of Sciences)

Abstract: A part of the scientific literature consists of intermediate results within longer projects. Scientists often publish a first result in the course of their work, while aware that they should soon achieve a more advanced result from this preliminary one. Should they follow the proverb "a bird in the hand is worth two in the bush" and publish any intermediate result they get? This is the normative question addressed in this paper. My aim is to clarify, refine, and assess informal arguments about the choice of whether to publish intermediate results. To this end, I adopt a rational decision framework, supposing some utility or preferences, and I propose a formal model. The best publishing strategy turns out to depend on the research situation. In some simple circumstances, even selfish and short-sighted scientists should publish their intermediate results, and should thus behave like their altruistic peers, i.e. as society would like them to behave. In other research situations, with inhomogeneous reward or difficulty profiles, the opposite strategy is best. These results suggest qualified philosophical morals.

(2007) "Spin chain simulations with a meron cluster algorithm" (with Wolfgang Bietenholz and Jaïr Wuilloud), International Journal of Modern Physics C 18: 1497-1511. [doi]

Abstract: We apply a meron cluster algorithm to the XY spin chain, which describes a quantum rotor. This is a multi-cluster simulation supplemented by an improved estimator, which deals with objects of half-integer topological charge. This method is powerful enough to provide precise results for the model with a $\theta$-term --- it is therefore one of the rare examples where a system with a complex action can be solved numerically. In particular, we measure the correlation length, as well as the topological and magnetic susceptibility. We discuss the algorithmic efficiency in view of the critical slowing down. The excellent performance that we observe strongly motivates work on new applications of meron cluster algorithms in higher dimensions.

#### Articles in French-speaking peer-reviewed journals

(2017) "Le principe de précaution" ('General Public' article), in M. Kristanek (ed.), L'Encyclopédie Philosophique. [www]
= "The precautionary principle"

Abstract: If an action may cause an environmental or health catastrophe, do we need to be certain that it will before taking measures to try to prevent that catastrophe? The precautionary principle says no: one must act even if the scientific evidence is not conclusive. Since the 1980s, this principle has appeared in various treaties and regulations, and it is regularly invoked in the fields of the environment and health. It is nevertheless controversial, with some accusing it of being paralyzing or anti-scientific. This article takes stock of what exactly the precautionary principle is and of the novelty it represents, and recapitulates the arguments for and against it.

(2017) "Une nouvelle approche expérimentale pour tester les modèles quantiques de l'erreur de conjonction" (with Sébastien Duchêne and Éric Guerci), La Revue Économique 5: 16-31 (proceedings of the 2015 annual ASFEE conference).
= "A new experimental approach to test quantum-like models of the conjunction fallacy"

Abstract: Classical probability theory requires that the probability of the conjunction of two events be lower than the probability of either event alone. Yet subjects do not always judge this way empirically: this is the traditional conjunction fallacy. One of the currently promising explanations of this paradox relies on so-called "quantum" models, developed from the mathematical tools of quantum mechanics. But are these models empirically adequate? Which versions of these models can be employed? In particular, can the simplest, so-called non-degenerate, versions suffice? We propose here an original experimental protocol to test quantum models of the conjunction fallacy in the laboratory. The results obtained suggest that non-degenerate models are not empirically adequate, and that future research on quantum models should turn toward degenerate models.

(2015) "Les interprétations de la mécanique quantique : une vue d'ensemble introductive", Implications Philosophiques, Sept. 2015. [on line]
= "The interpretations of quantum mechanics: an introductory overview"

Abstract: Quantum mechanics is a contemporary physical theory renowned for its challenges to common sense and for its paradoxes. For nearly a century, physicists and philosophers have proposed several interpretations of the theory, offering radically different quantum images of the world, or metaphysics. The existence of fundamental randomness, or of a multitude of worlds beyond our own, thus depends on the interpretation adopted. Drawing on the book Boyer-Kassem (2015), Qu'est-ce que la mécanique quantique ?, this article presents three main, empirically equivalent quantum interpretations: the so-called orthodox interpretation, Bohm's interpretation, and the many-worlds interpretation.

(2013) "Interpréter une théorie physique" (with Anouk Barberousse), Methodos [on line], 13 | 2013.  [doi]
= "Interpreting a physical theory"

Abstract: Physical theories today are highly mathematized, and what scientists manipulate to describe, predict, and control phenomena are (among other things) equations, containing numerous mathematical symbols. These mathematical objects have no physical meaning in themselves: they do not "speak" of the phenomena by themselves. An interpretation is required. What interests us in this article is thus the interpretation that a physical theory must receive in order to fulfill its role. We begin by making explicit a traditional distinction: the "poor" interpretation (a mere instrument for assigning the theory's symbols a physical meaning strictly limited to experimental results) differs from the "rich" interpretation (which composes an image of the world compatible with the way the theory mathematically describes experimental results). Our aim in this article is to show that this distinction must be amended. We rely on the example of Quantum Mechanics, but the distinction is meant to hold in general for any physical theory.

#### Working or submitted papers

"On discrimination in health insurance" (with Sébastien Duchêne)

"Explaining scientific collaboration: a general functional account" (with Cyrille Imbert)

"Institutionalizing values in scientific expertise" (with Julie Jebeile)

"The multiplicity of scientific explanations" (with Alexandre Guay)

### Book reviews

Review of Daniel Steel (2015), Philosophy and the Precautionary Principle, Cambridge: Cambridge University Press.
Forthcoming in Ethics, Policy & Environment. [pdf]

"Le principe de précaution est-il bien raisonnable ?", La Vie des Idées, 25 July 2016. [html]
= "Is the precautionary principle really reasonable?"

### Translation (from English to French)

Translation of "Dispelling the Quantum Spooks -- a Clue that Einstein Missed?" by H. Price and K. Wharton, in Bouton, C. and Huneman, P. (eds.) (2018), Temps de la nature et nature du temps, Paris: CNRS éditions. [pdf]

Translation of "The Division of Cognitive Labor" by P. Kitcher (1990), to appear in Bonnay, D. and Galinon, H. (eds.), Textes clés de l'épistémologie sociale, Paris: Vrin. [pdf]

### Conference reports

"Modeling epistemic and scientific groups: interdisciplinary perspectives, Nancy (25-26 Nov. 2013)", The Reasoner, Vol. 8, No. 2, February 2014, p. 16.

"Epistemic Groups and Collaborative Research in Science, Nancy (15-17 Dec. 2012)", The Reasoner, Vol. 7, No. 2, February 2013, p. 20.

"The Collective Dimension of Science, Nancy (8-10 Dec. 2011)", The Reasoner, Vol. 6, No. 2, February 2012, p. 25.

### Unpublished work

"La pluralité des interprétations d'une théorie scientifique : le cas de la mécanique quantique", doctoral dissertation, defended at the University of Paris 1 Panthéon-Sorbonne in December 2011 (with Highest Honors). Supervised by Jacques Dubucs and Anouk Barberousse. [tel]
= "The plurality of interpretations of a scientific theory: the case of quantum mechanics"

### Popular science

"Lutter contre les préjugés sur la pauvreté", Revue Quart Monde, 2017/3, n°243, p. 50-52.
= "Fighting prejudices about poverty"

"Une théorie en quête de sens" (special report on quantum physics), La Recherche, February 2017, n°520, p. 50-52.
= "A theory in search of meaning"

"Qu’est-ce que la mécanique quantique ?", presentation at the Librairie Vrin, Quartier du Livre festival, Paris 5th arrondissement, 24 May 2016.

Interviewed in a Brazilian philosophy journal: Filogênese, 2014, 7(1): i-xxi.

Interviewed or quoted in science magazines:
- Science & Vie Junior (2014), n° 109,
- Science & Vie (2012), n° 1135.