Professor Markus Müller joined Western University in 2015 as an Assistant Professor in the Departments of Applied Mathematics and Philosophy. He also holds a Canada Research Chair in the Foundations of Physics at Western. In addition, Dr. Müller is an associate faculty member at the Perimeter Institute for Theoretical Physics in Waterloo. His work addresses topics in mathematical physics and philosophy, especially quantum computation and quantum information theory.

Amy Wuest: Quantum computation and quantum information are central themes in your research. Can you explain quantum information and its philosophical implications?

Markus Müller: Everybody has at least a vague idea of what classical computation and information are: computers are the machines that operate on our office desks and in our smartphones; information is the stuff we send over telephone lines and store on our hard drives. Information theory as a mathematical discipline started in the 1940s, when Claude Shannon first proved his famous theorem telling us by how much we can compress information before sending it. Since then, computer science has grown into a vast field, obviously motivated by its technological applications.

However, there are two important twists to this, which relate information and computation to physics and philosophy. First, we have learned that “information is physical:” in several different fields of physics, the bits and bytes of computer science play a surprisingly important role in our understanding of what’s going on. The most famous examples of this can be found in thermodynamics. As Rolf Landauer showed in the 1960s, deleting one bit of information (for example, while erasing a hard drive) has an unavoidable energy cost. And this energy cost not only places bounds on technology; it can also help us resolve subtle paradoxes that have appeared in the theory of thermodynamics.
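To give a sense of scale, the bound behind this statement can be evaluated directly. The following is an illustrative back-of-the-envelope calculation of mine, not a figure from the interview: Landauer’s principle says that erasing one bit at temperature T dissipates at least k_B · T · ln 2 of energy.

```python
import math

# Landauer's principle: erasing one bit at temperature T costs at
# least k_B * T * ln(2) of energy (illustrative calculation).
k_B = 1.380649e-23  # Boltzmann constant, in J/K
T = 300.0           # roughly room temperature, in K

E_min = k_B * T * math.log(2)  # minimum energy per erased bit, joules
print(f"Landauer bound at {T} K: {E_min:.3e} J per bit")
```

The number is tiny (a few zeptojoules per bit), which is why the bound matters for fundamental questions and for the far future of computing technology rather than for today’s hard drives.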

More generally, I would argue that information theory is the unavoidable endpoint of trying to look at physics in a way that is as abstract and unbiased as possible. Intuitively, we tend to describe our world in terms of “objects” (like billiard balls or particles) that move around and collide. This way of viewing our universe is obviously hard-wired into our brains, since it works so well in our everyday world. But modern physics has severely challenged this picture. All that remains, in the end, is a picture where we can ask nature a question (by setting up an experiment, or by pointing a telescope at a distant star, for example) and see what nature answers. Removing all ontological excess baggage, physics is all about information and information processing. This does not mean that our universe is literally a computer; but it means that the notions of information theory are ultimately the right tools for approaching a better understanding of our world, since they allow us to set aside our human preconceptions about “how the world should be.”

There is a second twist, however. Our classical idea of information theory has itself been challenged by modern physics. Quantum theory has shown us that our world is not governed by the two alternatives of a “bit” (yes or no, 0 or 1), but by a “quantum bit”, or “qubit”, which allows superpositions of the two alternatives. Thus, taking the ideas above seriously (that we need to understand the world in, broadly, information-theoretic terms), we are forced to study quantum information theory.
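This difference can be made concrete with a minimal sketch (my illustration, not part of the interview): a qubit is described by two complex amplitudes over the classical alternatives 0 and 1, and measuring it yields each alternative with probability equal to the squared magnitude of its amplitude.

```python
import math

# A qubit as a pair of amplitudes (alpha, beta) over the classical
# alternatives 0 and 1, normalized so |alpha|^2 + |beta|^2 = 1.
alpha = 1 / math.sqrt(2)  # amplitude for the alternative "0"
beta = 1 / math.sqrt(2)   # amplitude for "1" -- an equal superposition

p0 = abs(alpha) ** 2  # probability of measuring 0
p1 = abs(beta) ** 2   # probability of measuring 1
print(p0, p1)         # approximately 0.5 each
```

A classical bit would have to commit to one of the two alternatives; the equal superposition above is a genuinely new kind of state that only resolves to 0 or 1 when measured.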

AW: In what way is quantum information different from classical information?

MM: Quantum information behaves surprisingly differently from classical information. This is often popularized by saying that “nobody has really understood quantum mechanics,” or that there is “spooky action at a distance.” But neither of those two statements is true.

There are at least two ways in which quantum information is different. First, it has been shown that computers that manipulate quantum information (that is, quantum computers) can solve some mathematical problems much faster than classical computers. This is one major reason why quantum information theory has become such an active field of research.

A second novelty of quantum information is that it allows forms of correlation that are classically impossible. For example, think of a pair of shoes. Put both shoes (the left and the right) into two identical cardboard boxes, and shuffle them around until you forget which is which. Mail one of the boxes to your father and the other one to your mother (assuming that they live far away). When they open their boxes, they will find that what they see is correlated: if your mother sees the left shoe, she knows immediately that your father sees the right shoe, and vice versa.

It turns out that if you put entangled quantum particles into boxes and mail them around, you can have this behavior too – but you can have even more general versions of it. That is, you can have forms of correlation that are simply impossible to get by sending around classical objects, like shoes. Classically, the decision about “which shoe is in which box” has been made in the very beginning, before sending the boxes – you may just have forgotten the result. In quantum theory, however, it is simply inconsistent to assume that the decision has been made at the moment when the particles have been put into the boxes. Somehow, the decision “comes into being” only later, when the two boxes are opened separately (and the particles are measured) at a distance. The mathematical proof of this statement is known as “Bell’s Theorem”.
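The quantum prediction behind Bell’s Theorem can be computed directly. The following is an illustrative sketch of mine, not material from the interview: for two entangled particles in the so-called singlet state, quantum mechanics predicts a correlation E(a, b) = −cos(a − b) between measurement angles a and b. Any “shoes in boxes” story (a local hidden-variable model) constrains the CHSH combination S of four such correlations to |S| ≤ 2, while quantum theory reaches 2√2.

```python
import math

def E(a, b):
    """Singlet-state correlation for measurement angles a, b (radians)."""
    return -math.cos(a - b)

# Standard CHSH measurement settings:
a1, a2 = 0.0, math.pi / 2              # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

# The CHSH combination; local hidden variables require |S| <= 2.
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2 * sqrt(2), about 2.828 -- beyond the classical bound
```

The fact that experiments actually observe values above 2 is what rules out the “decision was made when the boxes were packed” picture.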

AW: In your dissertation, you begin your research by exploring quantum Kolmogorov complexity, which “is an important measure of the information content of single binary strings,” as you explain. What is quantum Kolmogorov complexity and why is it important for understanding quantum computation?

MM: To understand quantum Kolmogorov complexity, we first need to have a look at its version in classical information theory.

The main idea of Kolmogorov complexity is to ask by how much a given message can be compressed. For example, think of your message as a long sequence of zeroes and ones (which is how pictures or texts are really stored on a computer hard drive). Suppose the message that you want to send to your friend over the telephone is the following:

1010101010101010101010… (total length of one million bits)

Then it would be quite stupid to literally read this message to your friend, just mumbling “one-zero-one-zero…” for several hours. Instead, you could just tell your friend that you have “five-hundred-thousand times the bits ‘10’.”

In fact, if you have a file on your hard drive that contains the message above, then there are tools (like “WinZIP”) that compress this message into something shorter, reducing the file size. This will save you time and money if you do it before sending the file. While the original file would consist of one million bits, the compressed file may only have 20 bits, depending on your method of compression.

The situation would be different if, for example, you tossed a coin a million times, and wrote down a “1” for “heads” and a “0” for “tails”. This would give you something like the following:

101100000… (total length of one million bits)

There would be no obvious way to describe this irregular message in simple words. That is, there is in general no way to compress this sequence of bits into a shorter sequence.

Kolmogorov complexity formalizes these observations mathematically: for a given bit string x, the Kolmogorov complexity K(x) is the length (in bits) of its shortest possible description. For example, for x=101010… as above, the Kolmogorov complexity K(x) would be close to 20, whereas for y=101100000… as generated by tossing a million coins, the complexity K(y) would typically be close to one million.
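K(x) itself is uncomputable, but any real compressor gives an upper bound on it, which makes the contrast between the two messages easy to see in practice. A minimal sketch (my illustration, using Python’s zlib as a stand-in for “WinZIP”):

```python
import os
import zlib

# The regular message "101010..." from the example, one million characters:
regular = b"10" * 500_000
# An "irregular" message: one million random coin-flip bits (125,000 bytes):
random_msg = os.urandom(125_000)

# A real compressor upper-bounds the information content of each message:
print(len(zlib.compress(regular, 9)))     # shrinks to a few thousand bytes
print(len(zlib.compress(random_msg, 9)))  # barely shrinks at all
```

The regular string collapses to a tiny fraction of its length, while the random one stays essentially incompressible, just as the Kolmogorov picture predicts.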

The philosophical, or conceptual, value of Kolmogorov complexity is that it gives us an objective measure of the information content of a single object (like a string of bits). This is related to the idea of “Ockham’s razor”, namely that simple hypotheses are to be preferred over complicated ones. If we want to use this idea in a rigorous way, we have to formulate the notion of “simplicity” mathematically. This is what Kolmogorov complexity does. “Quantum Kolmogorov complexity” is a generalization that counts quantum bits instead of classical bits.

AW: Even though this work involves many theoretical elements, you also discuss how your work might inform empirical investigations. Most often, these implications are in the fields of thermodynamics and the foundations of quantum mechanics. Can you explain how your work informs research in these two fields of study?

MM: Traditionally, thermodynamics (or statistical physics) is studied in cases where one has a very large number of particles. For example, think of a box that contains gas: the number of gas molecules is typically extremely large. This allows one to use simple statistical methods that become more and more reliable the larger the number of particles, since any kind of statistical flukes or exceptions will quickly average out.

On the other hand, both recent technological progress and fundamental curiosity should motivate us to ask what happens if the number of particles is very small – maybe we have only one particle, immersed in a heat bath. The best way to approach this problem is to set traditional statistical methods aside, and to use more fundamental information-theoretic approaches. This has led to the development of a new branch of statistical physics in the last few years, called “single-shot quantum thermodynamics.” One result, for example, that I proved with a student and a colleague last year is that stochastic independence can act like a “fuel”: for very small systems, “uncorrelation” can be used to extract work. This is quite surprising, because it shows us that the physics of very small systems can behave in the completely opposite way to its more traditionally studied “macroscopic” counterpart.

Information-theoretic thinking is also crucial for our understanding of the foundations of quantum mechanics. In 2011, my colleague Lluis Masanes and I published a paper that derived the abstract formalism of quantum theory from simple physical assumptions. Back then, it had been open for quite a while whether this was possible, and we were among a small group of researchers who first achieved this goal. To understand why this is important, simply open a standard textbook on quantum mechanics and have a look at how the principles of quantum mechanics are formulated: the books will tell you, for example, that quantum states are vectors in a complex Hilbert space, and that physical quantities correspond to self-adjoint operators, but they will not tell you why this is the case. Not only is this conceptually unsatisfactory, but it also has many disadvantages for the everyday life of a physicist. For example, many physicists have asked whether there might be modifications of quantum mechanics somewhere out there in nature. However, the standard postulates of quantum theory do not leave any room to construct and analyze such modifications, because it is completely unclear what should be modified, why, and how.

Our result, like those of our colleagues, starts with simple physical assumptions, and shows that the standard abstract formalism of quantum theory is the only mathematically possible way to satisfy them. Crucially, these assumptions, or postulates, are information-theoretic in spirit. In other words, quantum theory can not only be understood and interpreted in information-theoretic terms, but the full formalism of quantum theory can be reconstructed from simple information-theoretic principles. This is a very suggestive result.

AW: The Rotman Institute aims to facilitate interactions between scientists and philosophers. Given your dual specializations in philosophy and applied math, how will you contribute to this mission?

MM: Right now, I think that my role at the Institute involves mainly two tasks: learning, and building bridges. “Learning,” because my background is in theoretical physics and mathematics, not philosophy. Even though my main motivation has always been to work on very fundamental and philosophical questions, I have never been part of a philosophy department before, and I do not have a degree in philosophy.

Therefore, there are many things that I have to learn, and I am genuinely excited to learn them. I am reading a lot of philosophical material at the moment (on the philosophy of physics in general, but also on metaphysics, where I am excited about notions like “ontic structural realism”), and I am trying to become more and more familiar with the intellectual culture of the philosophy of physics community. So many cultural habits are different from physics: for example, talks in physics are usually given with a projector, do not involve large amounts of text, and have only brief discussions. Not so with talks in philosophy. Just last week I gave my first “comment” on a philosophy talk. In some sense, I am entering a fascinating new world, which is a blessing as well as a challenge.

My second main task at the moment is to build bridges between the Rotman Institute and other departments and institutes. I am working to establish connections and collaborative exchange between physicists, philosophers and mathematicians. One of my first activities was to organize a joint workshop on Information Theoretic Interpretations of Quantum Mechanics, together with Michael Cuffaro, Lucas Dunlap, and Wayne Myrvold (see below). This workshop brings philosophers and physicists together, including several colleagues from the Perimeter Institute for Theoretical Physics in Waterloo, to talk about the foundations of quantum mechanics. (Let me emphasize that Lucas and Michael did most of the work, and they did a great job.) Another activity that I have just started to organize is a Foundations of Physics Working Group. This is a weekly informal meeting of physicists, philosophers and mathematicians at Western. We meet to discuss topics of common interest, have introductory talks that can be understood by people from other departments, and are exploring opportunities for collaboration.

AW: Along with Dr. Michael Cuffaro, Dr. Lucas Dunlap, and Professor Wayne Myrvold you have co-organized the workshop Information Theoretic Interpretations of Quantum Mechanics, which will take place this June. What is the goal of this workshop and what can attendees expect to learn?

MM: Detailed information is on our website. To give a quick impression of the event, I think attendees can expect an exciting mixture of physicists and philosophers, and heated discussions about “what it all means.” What quantum mechanics could mean is closely related to what I said earlier: namely, that quantum mechanics is best interpreted in information-theoretic terms. Broadly speaking, this is also the position of Jeffrey Bub, who has recently published a book defending this thesis, which is a main inspiration for the workshop.

I think that participants can learn a great deal about our current understanding of quantum mechanics, and about the predominant topics in the discussions of how to interpret it. But in addition to that, it will also be interesting to see physicists and philosophers argue. It is always fascinating to watch different schools of thought, different kinds of prejudice and terminology, and also different personalities clash – and nevertheless (or should I say consequently?) end up with fruitful insights.