
Amit Hagar

  • hagara@indiana.edu
  • Goodbody Hall 120
  • (812) 855-2541
  • Professor
    History and Philosophy of Science

Education

  • Ph.D., Philosophy, University of British Columbia, Vancouver, 2004

Research interests

  • Foundations of quantum and statistical mechanics

Quantum Information and the Foundations of Quantum Theory

I tend to resist the current hype around quantum information theory, inasmuch as it is portrayed as shedding new light on the foundational problems that saturate non-relativistic quantum mechanics, and in recent papers I have exposed some fatal flaws in arguments to this end. What I do believe is that technological progress in the field of quantum information theory may clarify these foundational problems, and may even serve as the basis for yet another case of experimental metaphysics, by allowing us to test decoherence-based approaches against genuine collapse theories.

The Origins of Probability in Statistical Mechanics

I am intrigued by some old problems that the founding fathers of statistical mechanics (e.g., Maxwell, Boltzmann, Clausius) grappled with when they struggled to construct dynamical models that describe thermal phenomena. While it had become clear to them (first to Maxwell, and later also to Boltzmann and Clausius) that thermal phenomena cannot be described in terms of Hamiltonian mechanics alone, and that additional probabilistic assumptions were needed, they remained ambiguous about the interpretation of these probabilistic assumptions. In a recent paper I have argued that, contrary to a vocal view in the literature that regards the problem of irreversibility as the main problem in the foundations of statistical mechanics, the problem of probability, i.e., the attempt to underpin the probabilistic assumptions that are necessary for the reduction of thermodynamics to statistical mechanics, is still open. (A schematic example of the kind of probabilistic assumption at issue is sketched after this overview.)

Physical Computational Complexity and Quantum Computing: What Is Quantum in Quantum Computing?

Quantum computing has by now become a small industry, and one of the most fascinating domains of quantum mechanics today. Apart from writing an entry on this subject for the Stanford Encyclopedia of Philosophy, I am interested in the conceptual problems that this field raises, which touch upon the foundations of quantum theory, the applicability of mathematics to physics, and the philosophy of mind. Together with Alex Korolev from UBC, I have argued against recent attempts to "solve" recursion-theoretic undecidable problems with the quantum adiabatic algorithm. Currently I am writing a paper on another quantum halting problem which, to my mind, heavily constrains the alleged superiority of quantum algorithms over their classical counterparts.

Constructing the Principles: Historical and Philosophical Lessons for Future Theoretical Physics

In this monograph I use Maxwell's and Poincaré's famous distinction (adopted by Einstein) between constructive and principle theories to draw historical and philosophical lessons from the development of the special theory of relativity and statistical mechanics for modern scenarios such as quantum information theory and quantum gravity.

Ph.D. Thesis: Chance and Time

In my Ph.D. thesis, which was also published in Israel as an expository book on the philosophy of physics, I try to show how the problem of irreversibility in the foundations of statistical mechanics and the quantum measurement problem are actually two facets of the general philosophical problem of explaining unobserved predictions. This problem arises from the conjunction of a philosophical stance, namely that our theories describe the world, and an indisputable fact, namely that some phenomena predicted by our theories remain nevertheless mostly unobserved. I characterize different solutions to this problem using another philosophical problem, namely the problem of probability, and suggest a possible experimental scenario in which these solutions may be tested.
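As a schematic illustration of the kind of probabilistic assumption discussed above (my own sketch in standard textbook notation, not text from the profile), consider the equiprobability postulate of the microcanonical ensemble and the Boltzmann entropy it underwrites:

```latex
% Schematic illustration (standard textbook notation, not from the profile).
% For an isolated system whose macrostate of energy E is compatible with
% \Omega(E) microstates, statistical mechanics postulates
\[
  P(\text{microstate } i \mid E) \;=\; \frac{1}{\Omega(E)},
  \qquad
  S(E) \;=\; k_B \ln \Omega(E).
\]
% The uniform probability measure on the left is the extra, non-dynamical
% input whose interpretation the paragraph above describes as still contested;
% Hamiltonian dynamics alone does not supply it.
```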

Representative publications

Discrete or continuous?: the quest for fundamental length in modern physics (2014)
Amit Hagar
Cambridge University Press.

The idea of infinity plays a crucial role in our understanding of the universe, with the infinite spacetime continuum perhaps the best-known example. But is spacetime really continuous? Throughout the history of science, many have felt that the continuum model is an unphysical idealization, and that spacetime should be thought of as 'quantized' at the smallest of scales. Combining novel conceptual analysis, a fresh historical perspective, and concrete physical examples, this unique book tells the story of the search for the fundamental unit of length in modern physics, from early classical electrodynamics to current approaches to quantum gravity. Novel philosophical theses, with direct implications for theoretical physics research, are presented and defended in an accessible format that avoids complex mathematics. Blending history, philosophy, and theoretical physics, this refreshing outlook on the nature of spacetime sheds light on one of the most thought-provoking topics in modern physics.

Quantum computing (2006)
Amit Hagar and Michael Cuffaro

Combining physics, mathematics and computer science, quantum computing has developed in the past two decades from a visionary idea to one of the most fascinating areas of quantum mechanics. The recent excitement in this lively and speculative domain of research was triggered by Peter Shor (1994) who showed how a quantum algorithm could exponentially “speed-up” classical computation and factor large numbers into primes much more rapidly (at least in terms of the number of computational steps involved) than any known classical algorithm. Shor’s algorithm was soon followed by several other algorithms that aimed to solve combinatorial and algebraic problems, and in the last few years theoretical study of quantum systems serving as computational devices has achieved tremendous progress. Common belief has it that the implementation of Shor’s algorithm on a large scale quantum computer would …
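As a rough illustration of the structure of Shor's algorithm mentioned in this entry, the sketch below (my own, in Python, not taken from the entry) shows only the classical number-theoretic reduction from factoring to period finding; the period-finding step, brute-forced classically here, is the part that a quantum computer performs far more efficiently via the quantum Fourier transform.

```python
from math import gcd

def find_period(a, N):
    """Brute-force the order r of a modulo N (a^r = 1 mod N).
    In Shor's algorithm this step is delegated to the quantum subroutine."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_period(N, a):
    """Classical post-processing of Shor's algorithm: turn the period of
    a mod N into a nontrivial factorization of N, when the period cooperates."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky: a already shares a factor with N
    r = find_period(a, N)
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root of 1: retry
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    return (p, q) if p * q == N and 1 < p < N else None

# Example: N = 15, a = 7 has period 4, yielding the factors 3 and 5.
print(factor_via_period(15, 7))
```

The claimed exponential advantage lies entirely in replacing find_period with the quantum order-finding subroutine; the surrounding arithmetic is classical.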

A philosopher looks at quantum information theory (2003)
Amit Hagar
Philosophy of Science, 70 (4), 752-775

Recent suggestions to supply quantum mechanics (QM) with realistic foundations by reformulating it in light of quantum information theory (QIT) are examined and are found wanting by pointing to a basic conceptual problem that QIT itself ignores, namely, the measurement problem. Since one cannot ignore the measurement problem and at the same time pretend to be a realist, as they stand, the suggestions to reformulate QM in light of QIT are nothing but instrumentalism in disguise.

Explaining the unobserved—Why quantum mechanics ain’t only about information (2006)
Amit Hagar and Meir Hemmo
Foundations of Physics, 36 (9), 1295-1324

A remarkable theorem by Clifton et al. [Found. Phys. 33(11), 1561–1591 (2003)] (CBH) characterizes quantum theory in terms of information-theoretic principles. According to Bub [Stud. Hist. Phil. Mod. Phys. 35B, 241–266 (2004); Found. Phys. 35(4), 541–560 (2005)] the philosophical significance of the theorem is that quantum theory should be regarded as a “principle” theory about (quantum) information rather than a “constructive” theory about the dynamics of quantum systems. Here we criticize Bub’s principle approach arguing that if the mathematical formalism of quantum mechanics remains intact then there is no escape route from solving the measurement problem by constructive theories. We further propose a (Wigner-type) thought experiment that we argue demonstrates that quantum mechanics on the information-theoretic approach is incomplete.

Quantum hypercomputation—hype or computation? (2007)
Amit Hagar and Alex Korolev
Philosophy of Science, 74 (3), 347-363

A recent attempt to compute a (recursion‐theoretic) noncomputable function using the quantum adiabatic algorithm is criticized and found wanting. Quantum algorithms may outperform classical algorithms in some cases, but so far they retain the classical (recursion‐theoretic) notion of computability. A speculation is then offered as to where the putative power of quantum computers may come from.

Minimal length in quantum gravity and the fate of Lorentz invariance (2009)
Amit Hagar
Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 40 (3), 259-267

The paper highlights a recent debate in the quantum gravity community on the status of Lorentz invariance in theories that introduce a fundamental length scale, and in particular in deformed special relativity. Two arguments marshaled against that theory are examined and found wanting.

Length matters: The Einstein–Swann correspondence and the constructive approach to the special theory of relativity (2008)
Amit Hagar
Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 39 (3), 532-556

I discuss a rarely mentioned correspondence between Einstein and Swann on the constructive approach to the special theory of relativity, in which Einstein points out that the attempts to construct a dynamical explanation of relativistic kinematical effects require postulating a fundamental length scale at the level of the dynamics. I use this correspondence to shed light on several issues under dispute in current philosophy of spacetime that were highlighted recently in Harvey Brown's monograph Physical Relativity, namely, Einstein's view on the distinction between principle and constructive theories, and the consequences of pursuing the constructive approach in the context of spacetime theories.

Discussion: The foundations of statistical mechanics—questions and answers (2005)
Amit Hagar
Philosophy of Science, 72 (3), 468-478

Huw Price argues that causal‐dynamical theories that aim to explain thermodynamic asymmetry in time are misguided. He points out that in seeking a dynamical factor responsible for the general tendency of entropy to increase, these approaches fail to appreciate the true nature of the problem in the foundations of statistical mechanics (SM). I argue that it is Price who is guilty of misapprehension of the issue at stake. When properly understood, causal‐dynamical approaches in the foundations of SM offer a solution for a different problem; a problem that unfortunately receives no attention in Price’s celebrated work.

The primacy of geometry (2013)
Amit Hagar and Meir Hemmo
Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 44 (3), 357-364

We argue that in current spacetime physics there can be no dynamical derivation of primitive geometrical notions such as length. We illustrate our argument by focusing on two case studies: the special theory of relativity and some approaches to quantum gravity, and we argue that in these cases, some geometrical notions are assumed rather than derived. Our argument suggests a new reading of Einstein's views on the status of geometry vs. dynamics.

Quantum hypercomputability? (2006)
Amit Hagar and Alexandre Korolev
Minds and Machines, 16 (1), 87-93

A recent proposal to solve the halting problem with the quantum adiabatic algorithm is criticized and found wanting. Contrary to other physical hypercomputers, where one believes that a physical process “computes” a (recursive-theoretic) non-computable function simply because one believes the physical theory that presumably governs or describes such process, believing the theory (i.e., quantum mechanics) in the case of the quantum adiabatic “hypercomputer” is tantamount to acknowledging that the hypercomputer cannot perform its task.

Quantum algorithms: Philosophical lessons (2007)
Amit Hagar
Minds and Machines, 17 (2), 233-247

I discuss the philosophical implications that the rising new science of quantum computing may have for the philosophy of computer science. While quantum algorithms leave the notion of Turing computability intact, they may re-describe the abstract space of computational complexity theory and hence militate against the autonomous character of some of the concepts and categories of computer science.

Kant and non-Euclidean geometry (2008)
Amit Hagar
Kant-Studien, 99 (1), 80-98

It is occasionally claimed that the important work of philosophers, physicists, and mathematicians in the nineteenth and in the early twentieth centuries made Kant’s critical philosophy of geometry look somewhat unattractive. Indeed, from the wider perspective of the discovery of non-Euclidean geometries, the replacement of Newtonian physics with Einstein’s theories of relativity, and the rise of quantificational logic, Kant’s philosophy seems “quaint at best and silly at worst”. While there is no doubt that Kant’s transcendental project involves his own conceptions of Newtonian physics, Euclidean geometry and Aristotelian logic, the issue at stake is whether the replacement of these conceptions collapses Kant’s philosophy into an unfortunate embarrassment. Thus, in evaluating the debate over the contemporary relevance of Kant’s philosophical project one is faced with the following two questions: (1) Are there any …

Experimental Metaphysics: The double standard in the quantum-information approach to the foundations of quantum theory (2007)
Amit Hagar
Studies in History and Philosophy of Modern Physics, 38 (4), 906-919

Among the alternatives of non-relativistic quantum mechanics (NRQM) there are those that give different predictions than quantum mechanics in yet-untested circumstances, while remaining compatible with current empirical findings. In order to test these predictions, one must isolate one's system from environmentally induced decoherence, which, on the standard view of NRQM, is the dynamical mechanism that is responsible for the ‘apparent’ collapse in open quantum systems. But while recent advances in condensed-matter physics may lead in the near future to experimental setups that will allow one to test the two hypotheses, namely genuine collapse vs. decoherence, hence make progress toward a solution to the quantum measurement problem, those philosophers and physicists who are advocating an information-theoretic approach to the foundations of quantum mechanics are still unwilling to acknowledge …

The complexity of noise: a philosophical outlook on quantum error correction (2010)
Amit Hagar
Synthesis Lectures on Quantum Computing, 2 (1), 1-83

In quantum computing, where algorithms exist that can solve computational problems more efficiently than any known classical algorithms, the elimination of errors that result from external disturbances or from imperfect gates has become the "holy grail", and a worldwide quest for a large-scale fault-tolerant, and computationally superior, quantum computer is currently taking place. Optimists rely on the premise that, under a certain threshold of errors, an arbitrarily long fault-tolerant quantum computation can be achieved with only moderate (i.e., at most polynomial) overhead in computational cost. Pessimists, on the other hand, object that there are in principle (as opposed to merely technological) reasons why such machines are still inexistent, and that no matter what gadgets are used, large-scale quantum computers will never be computationally superior to classical ones. Lacking a complete empirical …
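For orientation, the optimists' premise referred to above is usually stated as the threshold theorem; schematically (my paraphrase, not the book's text, with p the physical per-gate error rate, p_th the constant threshold, N the size of the ideal circuit, and ε the target accuracy):

```latex
% Schematic statement of the threshold theorem (paraphrase, not quoted text):
% below a constant per-gate error threshold, fault tolerance costs only a
% polylogarithmic blow-up in circuit size.
\[
  p < p_{\mathrm{th}}
  \;\Longrightarrow\;
  N_{\text{fault-tolerant}} \;=\; O\!\bigl(N\,\mathrm{polylog}(N/\epsilon)\bigr).
\]
```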

Thomas Reid and non-Euclidean geometry (2002)
Amit Hagar

In the chapter “The Geometry of Visibles” in his ‘Inquiry into the Human Mind’, Thomas Reid constructs a special space, develops a special geometry for that space, and offers a natural model for this geometry. In doing so, Reid “discovers” non-Euclidean Geometry sixty years before the mathematicians. This paper examines this “discovery” and the philosophical motivations underlying it. By reviewing Reid’s ideas on visible space and confronting him with Kant and Berkeley, I hope, moreover, to resolve an alleged impasse in Reid’s philosophy concerning the contradictory characteristics of Reid’s tangible and visible space.
