Quantum Darwinism, classical reality, and the randomness of quantum jumps
OCT 01, 2014
The core principles that underlie quantum weirdness also explain why only selected quantum states survive monitoring by the environment and, as a result, why we experience our world as classical.
The superposition principle of quantum mechanics decrees that every combination of quantum states is a legal state. That is at odds with our experience; we never see anything like what is depicted in figure 1. So why can’t a chef prepare chicken à la Schrödinger?
Figure 1. A great moment in the development of the quantum microwave oven: chicken à la Schrödinger. (Illustration by Ben Bromley.)
The answer is that interactions with the environment help select preferred states of the system (see Physics Today, October 1991, page 36). As it is impossible to follow every variable of the composite system–environment whole, one relies on a reduced density matrix,¹ a statistical description of the system alone. It is obtained by averaging out the environment; Born’s rule,² which states that the probability of finding a system with wavefunction ∣ψ〉 in a specific state ∣k〉 is the absolute square of 〈k∣ψ〉, justifies that averaging.
The interactions that determine preferred states favor the cooked and alive chickens and banish alternate states with superpositions of cooked and alive. Those preferred states, left untouched by the interaction with the environment, are called pointer states. They eventually end up as the eigenstates of the reduced density matrix. The corresponding eigenvalues give probabilities—for example, of finding the chicken cooked or alive. The environment-driven process that selects pointer states is called decoherence, for a reason that will be clear soon.
In this article I study the emergence of the classical by tracing the origin of preferred pointer states and deducing their probabilities from core quantum postulates, a starting point more fundamental than decoherence theory, which relies on Born’s rule. I also explore the role of the decohering environment as a medium observers use to acquire information. The redundancy of information transferred from the system to many fragments of the environment leads to the perception of objective classical reality.
The quantum credo
The core quantum postulates I’ll need are a strikingly simple and natural part of a longer list of axioms found in many textbooks.³ They underlie quantum weirdness, but they also help explain the emergence of the classical.
Much of the weirdness stems from the superposition principle implied by postulate 1: Quantum states correspond to vectors in a Hilbert space. Thus when ∣r〉 and ∣s〉 are legal quantum states, so is any ∣v〉 = α∣r〉 + β∣s〉. When ∣r〉 and ∣s〉 are orthogonal and normalized (that is, when 〈s∣r〉 = 0 and 〈r∣r〉 = 〈s∣s〉 = 1), then 〈v∣v〉 = ∣α∣² + ∣β∣².
Postulate 2 says that time evolution is unitary. According to the Schrödinger equation, a system prepared in a state ∣s0〉 will evolve, after a time t, to the state ∣st〉 = Ut∣s0〉, where the time evolution operator Ut is determined by the Hamiltonian H: Ut = e^(−iHt/ℏ). Unitary evolution preserves scalar products, 〈st∣rt〉 = 〈s0∣r0〉. It is also linear, so ∣vt〉 = αUt∣r〉 + βUt∣s〉.
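Both properties are easy to check numerically. The following NumPy sketch (a random Hermitian matrix stands in for a physical Hamiltonian, with ℏ = 1; both are assumptions for illustration) builds Ut by diagonalizing H and verifies that scalar products are preserved and that the evolution is linear:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                                   # illustrative Hilbert-space dimension
A = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
H = (A + A.conj().T) / 2                # a random Hermitian "Hamiltonian"

w, V = np.linalg.eigh(H)                # H = V diag(w) V^dagger
t = 0.7
U = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T   # U_t = exp(-iHt)

s0 = rng.standard_normal(d) + 1j * rng.standard_normal(d)
r0 = rng.standard_normal(d) + 1j * rng.standard_normal(d)
s0, r0 = s0 / np.linalg.norm(s0), r0 / np.linalg.norm(r0)
st, rt = U @ s0, U @ r0

# unitarity: <s_t|r_t> = <s_0|r_0>
assert np.isclose(np.vdot(st, rt), np.vdot(s0, r0))
# linearity: U(a|r> + b|s>) = a U|r> + b U|s>
a, b = 0.3 + 0.1j, 0.8 - 0.2j
assert np.allclose(U @ (a * r0 + b * s0), a * rt + b * st)
```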
What I call the composition postulate, postulate 0, deals with states of composite systems—for example, a system 𝒮 and its environment ℰ. It asserts that composite states can be expressed as superpositions such as ∑k,lγk,l∣sk〉∣εl〉, where ∣sk〉 and ∣εl〉 are bases in the system and environment Hilbert spaces. Entanglement enters via that composition postulate.
Postulates 0–2 guide calculations involving ingredients such as Hamiltonians. But such manipulations are just quantum math. To do physics, the math must be related to experiments.
The repeatability postulate, postulate 3, starts the task. It says that an immediately repeated measurement yields the same outcome. Classical repeatability is a given: Measurements reveal classical states, so repeatability follows from their objective existence. Observers cannot reveal an unknown quantum state, but repeatability lets them confirm the presence of known states. In fact, it’s hard to make the so-called quantum nondemolition measurements that abide by postulate 3. Yet repeatability is key for the very idea of a state as a predictive tool: The simplest prediction is that a state is what it is.
The core postulates 0–3 are my quantum credo. As we will see, they imply, or at least motivate, the troubling remainder of the textbook list.
The measurement amendments
The remaining textbook axioms involve measurement but, unlike the repeatability postulate 3, are controversial. Postulate 4, the collapse axiom, has two parts. According to part 4a, observables are Hermitian; as a consequence, only operators with orthogonal eigenstates are measurable. Axiom 4b says that the outcome of a measurement must correspond to an eigenstate of the measured Hermitian operator. A system in an arbitrary superposition of states will, when measured, collapse to an eigenstate of the measured observable.
Consensus is the hallmark of objective existence. An unknown classical state can be discovered by many and remain unchanged. By contrast, direct measurement of a quantum system resets its state to an item on the eigenstate menu. Thus the predictive power of quantum math is limited and consensus precluded. A pure quantum state doesn’t determine measurement outcomes with certainty; rather, it determines their probabilities via the final axiom, postulate 5: Born’s rule, pk = ∣〈k∣ψ〉∣², the key link between quantum math and physics.
The randomness inherent in postulates 4 and 5 clashes with the unitarity of postulate 2. The forefathers of quantum theory bypassed that conflict by insisting with Niels Bohr⁴ that a part of the universe—including measuring devices and observers—must be classical. The selection of allowed measurement outcomes was determined by the classical apparatus, and the randomness of quantum jumps arose due to “the disturbance involved in the act of measurement” (page 36).³ However, as I will discuss, the quantum credo leads to axioms 4a and 5 and even hints at 4b. And the perception of objective reality follows from the role of the environment as a communication channel that delivers information to us.
Repeatability and quantum jumps
Decoherence leads to environment-induced superselection of preferred states, and so accounts for effectively classical states and the menu of measurement outcomes.⁵ A key tool used in practice—the reduced density matrix—arises from averaging over the environmental degrees of freedom in accord with Born’s rule. As I will show, though, environment-induced superselection and decoherence follow directly from the quantum credo; Born’s rule is not necessary.
Consider a measurement-like interaction of a system 𝒮 with a quantum apparatus 𝒜. The state of 𝒜 changes, but to ensure repeatability, the state of 𝒮 does not:

∣u〉∣A0〉 → ∣u〉∣Au〉,   ∣v〉∣A0〉 → ∣v〉∣Av〉.   (1)

Here, the arrows represent the unitary evolution determined by the Hamiltonian H𝒮𝒜 describing the system–apparatus interaction. Inasmuch as ∣u〉 and ∣v〉 are untouched, a second apparatus with an analogous interaction will get the same outcomes.
Because the time evolution is unitary, the before and after scalar products of composite 𝒮𝒜 state vectors must be equal:

〈u∣v〉〈A0∣A0〉 = 〈u∣v〉〈Au∣Av〉.   (2)
Equation (2), though simple, has profound consequences. To analyze them, start with a misstep: In an attempt to simplify, divide by 〈u∣v〉. The result is 〈A0∣A0〉 = 〈Au∣Av〉, or 〈Au∣Av〉 = 1. That unit value implies ∣Au〉 = ∣Av〉. In other words, the apparatus cannot distinguish ∣u〉 from ∣v〉.
Insisting on the repeatability postulate seems to have led to an absurd result. Have I just ruled out that part of the quantum credo as being incompatible with postulates 0–2? Not at all. Only when 〈u∣v〉 ≠ 0 can one simplify equation (2).
Instead, the above demonstration proves axiom 4a: Hermitian operators—that is, those corresponding to measurable observables—have orthogonal outcomes. Indeed, any 〈Au∣Av〉 ≠ 1 implies 〈u∣v〉 = 0, so the quality of the measurement record does not matter. Moreover, the persistence of recorded states sets the stage for quantum jumps accompanying any information transfer, including decoherence (in which case the environment plays the role of 𝒜): When only a discrete set of states in the Hilbert space is stable, the evolution of the system will look like a jump into one of the states.
Orthogonal states that survive multiple confirmations of their identity are selected by their interaction with the apparatus or decohering environment. Their superpositions could persist in isolation but cannot be recorded. Only discrete, stable states can be followed. Although there is no literal collapse of the wavefunction, measurement records will suggest a quantum jump from an initial superposition to one of the stable states or from stable state to stable state. According to decoherence theory, the ability to withstand scrutiny of the environment defines pointer states. As I will discuss, the proliferation of records about those states throughout the environment is the essence of quantum Darwinism.
In microsystems, repeatability is, in fact, rare: Nondemolition measurements are difficult. In the macroworld, however, repeatability is essential for the emergence of objective reality. Macrostates such as records inscribed in an apparatus should persist through many readouts, even as the underlying microstates change. A demonstration such as the one given above shows that stable macrostates must also be orthogonal to accommodate repeatability.⁶
Born’s rule never entered into the above discussion. Scalar products of 0 and 1 signified orthogonality and equality; in-between values were not needed. Much of axiom 4 follows from the simple and natural core postulates 0–3.
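The role of orthogonality can be made concrete in a minimal NumPy sketch. Here a CNOT gate stands in for the measurement interaction generated by H𝒮𝒜 (an assumption for illustration): it records ∣0〉 and ∣1〉 repeatably, but a state that is not orthogonal to them gets disturbed, so the same interaction cannot faithfully record two nonorthogonal states:

```python
import numpy as np

# CNOT: the apparatus (target qubit) copies the system (control qubit)
# in the {|0>, |1>} basis
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def after_measurement(system_state):
    # couple the system to a fresh apparatus in |A0> = |0>, apply CNOT,
    # and return the reduced state of the system afterwards
    psi = np.kron(system_state, np.array([1, 0], complex))
    m = (CNOT @ psi).reshape(2, 2)
    return m @ m.conj().T

u = np.array([1, 0], complex)               # |u> = |0>
v = np.array([1, 1], complex) / np.sqrt(2)  # |v>, with <u|v> != 0

# |u> survives untouched: the measurement is repeatable
assert np.isclose(np.real(u.conj() @ after_measurement(u) @ u), 1.0)

# |v> does not: its survival probability drops below 1, so distinct
# records require <u|v> = 0, in accord with equation (2)
p_v = np.real(v.conj() @ after_measurement(v) @ v)
print("survival of |v>:", p_v)   # 0.5
```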
Entanglement-assisted invariance
Decoherence is the loss of phase coherence between preferred states. It occurs when 𝒮 starts in a superposition of pointer states, but in the decoherence context, 𝒮 is “measured” by the environment ℰ:

(α∣↑〉 + β∣↓〉)∣ε0〉 → α∣↑〉∣ε↑〉 + β∣↓〉∣ε↓〉 = ∣ψ𝒮ℰ〉.   (3)
The discussion centered around equation (2) implies that the unaltered states are orthogonal, 〈↑∣↓〉 = 0. Their superposition, upon interacting with the environment, turns into an entangled ∣ψ𝒮ℰ〉; neither 𝒮 nor ℰ retains an individual pure state.
Phases in a superposition matter. In a spin ½–like system, ∣→〉 = (∣↑〉 + ∣↓〉)/√2 is orthogonal to ∣←〉 = (∣↑〉 − ∣↓〉)/√2. The phase shift operator u𝒮φ = ∣↑〉〈↑∣ + e^(iφ)∣↓〉〈↓∣ leaves ∣↑〉 untouched and multiplies ∣↓〉 by e^(iφ); when φ = π, it converts ∣→〉 to ∣←〉. In experiments, such phase shifts translate into shifts of interference patterns.
For simplicity, assume perfect decoherence, 〈ε↑∣ε↓〉 = 0. In that case, the environment has a perfect record of pointer states. What information survives decoherence, and what is lost? I now show that the phases of α and β no longer matter—that is, φ has no effect on the local state of 𝒮. Measurements on the system will not detect a phase shift, as there is no interference pattern to shift.
The key observation is that the phase shift u𝒮φ acting on an entangled ∣ψ𝒮ℰ〉 can be undone by uℰ−φ = ∣ε↑〉〈ε↑∣ + e^(−iφ)∣ε↓〉〈ε↓∣, a countershift acting on a distant ℰ decoupled from the system:

uℰ−φ u𝒮φ ∣ψ𝒮ℰ〉 = ∣ψ𝒮ℰ〉.   (4)
As phases in ∣ψ𝒮ℰ〉 can be changed in a faraway environment decoupled from but entangled with the system, they can no longer influence the state of 𝒮. If they could, a measurement of 𝒮 would reveal that influence and enable superluminal communication.
The loss of phase coherence is decoherence. Superpositions decohere as the ∣↑〉 and ∣↓〉 states are recorded by ℰ. As phases no longer matter for 𝒮, phase information about 𝒮 is lost. As promised earlier, that information loss was established without reduced density matrices, the usual decoherence tool.
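A minimal NumPy sketch (one environment qubit; the amplitudes α = 0.6, β = 0.8 are arbitrary) makes both claims concrete: a phase shift on 𝒮 leaves the local state of 𝒮 unchanged, and a countershift acting only on the distant ℰ undoes it exactly:

```python
import numpy as np

def rho_S(psi):
    # reduced density matrix of the system: trace out the environment qubit
    m = psi.reshape(2, 2)        # index 0: system, index 1: environment
    return m @ m.conj().T

up, down = np.array([1, 0], complex), np.array([0, 1], complex)
alpha, beta = 0.6, 0.8
# decohered, entangled state: alpha|up>|e_up> + beta|down>|e_down>
psi = alpha * np.kron(up, up) + beta * np.kron(down, down)

phi = 1.234                                   # arbitrary phase
uS = np.diag([1, np.exp(1j * phi)])           # phase shift acting on S
uE = np.diag([1, np.exp(-1j * phi)])          # countershift acting on E

shifted = np.kron(uS, np.eye(2)) @ psi
# 1. the phase shift on S leaves the local state of S unchanged
assert np.allclose(rho_S(shifted), rho_S(psi))
# 2. a countershift on the environment alone restores the full state
restored = np.kron(np.eye(2), uE) @ shifted
assert np.allclose(restored, psi)
```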
The above view of decoherence appeals to symmetry, an entanglement-assisted invariance, or envariance, of 𝒮 under phase shifts of pointer-state coefficients.⁷ As 𝒮 entangles with ℰ, its local state becomes invariant under transformations that affected it before the entanglement.
In laboratory experiments, a system isolated from the environment is first measured by an apparatus 𝒜 so that system and apparatus entangle. That entangled state ∣ψ𝒮𝒜〉 then decoheres as 𝒜 interacts with ℰ:

∣ψ𝒮𝒜〉∣ε0〉 = (α∣↑〉∣A↑〉 + β∣↓〉∣A↓〉)∣ε0〉 → α∣↑〉∣A↑〉∣ε↑〉 + β∣↓〉∣A↓〉∣ε↓〉 = ∣Ψ𝒮𝒜ℰ〉.   (5)
The pointer states ∣A↑〉 and ∣A↓〉 of 𝒜, however, are unaffected by the decoherence interaction with ℰ. They retain perfect correlation with 𝒮 (or an observer, or other systems) in spite of ℰ, regardless of the value of 〈ε↑∣ε↓〉. Stability under decoherence is a prerequisite for effective classicality in our quantum universe: The familiar states of macroscopic objects have to survive monitoring by ℰ and retain correlations.
The decohered 𝒮𝒜 is described by a reduced density matrix obtained by averaging out the environment. When 〈ε↑∣ε↓〉 = 0, the pointer states of 𝒜 retain their correlations with the measurement outcomes:

ρ𝒮𝒜 = ∣α∣²∣↑〉〈↑∣∣A↑〉〈A↑∣ + ∣β∣²∣↓〉〈↓∣∣A↓〉〈A↓∣.   (6)
Both ↑ and ↓ are present. There is no collapse.
The averaging over environmental states is implemented by a mathematical operation called taking a trace—that is, ρ𝒮𝒜 = Trℰ ∣Ψ𝒮𝒜ℰ〉〈Ψ𝒮𝒜ℰ∣. However, both the interpretation of ρ𝒮𝒜 as a statistical mixture of its eigenstates and the use of averaging via the trace operation rely on Born’s rule, axiom 5. To avoid circularity, I have avoided invoking that postulate earlier. Below, I will need it, but I am now in a position to derive it from the quantum credo using envariance.
Born’s rule from entanglement
Pierre Simon Laplace’s starting point for developing probability theory was the principle of indifference—that is, when nothing favors any one outcome, all outcomes are equally likely.⁸ Thus the probability of blindly drawing a spade from a full deck of cards is ¼ because the deck has four suits, each with the same number of cards. Of course, that result doesn’t change if cards in the deck are swapped as illustrated in figure 2a, and that indifference to swaps was regarded as a kind of symmetry. In the classical case, the symmetry is due to subjective ignorance: After all, if the cards were turned over as in figure 2b, it would be evident whether or not the to-be-drawn card is a spade. Classically, there is no objective, physical basis for the symmetry and, hence, for objectively equal probabilities.
Figure 2. Probability from entanglement. The swapping of hidden cards (a) highlights the subjective ignorance of a player, who regards the pre- and postswap states as equivalent. The player’s indifference suggests a “symmetry” that Pierre Simon Laplace used to define probability. (b) However, the physical state of the cards really does change after a swap: Subjective ignorance is a shaky foundation for a theory of probability. (c) For quantum systems, equal probabilities follow from entanglement. A swap in the system 𝒮 is undone by a counterswap in a measuring apparatus 𝒜. As a result (more fully discussed in the text), the swap in 𝒮 cannot alter any predictions—including probabilities—that depend on local states. (Illustrations by Fernando Cucchietti.)
In quantum physics, one seeks the probability of a measurement outcome starting from known initial states of 𝒮 and 𝒜 and the interaction H𝒮𝒜, and thus from the pure entangled state that results from the interaction; there is no room for subjective ignorance. Envariance, in a slightly different guise from when it accounted for decoherence, is an objective symmetry that leads to probabilities of mutually exclusive outcomes such as the orthogonal states deduced earlier from the repeatability postulate.
Suppose that 𝒮 starts as ∣→〉 = (∣↑〉 + ∣↓〉)/√2, so interaction with 𝒜 yields (∣↑〉∣A↑〉 + ∣↓〉∣A↓〉)/√2. I call such states—with equal absolute values of coefficients—even states. For such states, all measurement outcomes are equally probable, as I now show. Figure 2c illustrates the key step in the argument.
The unitary swap ∣↑〉〈↓∣ + ∣↓〉〈↑∣ exchanges the states in 𝒮:

(∣↑〉∣A↑〉 + ∣↓〉∣A↓〉)/√2 → (∣↓〉∣A↑〉 + ∣↑〉∣A↓〉)/√2.   (7)
Before the swap, ∣↓〉 was as probable as ∣A↓〉, and ∣↑〉 was as probable as ∣A↑〉. After the swap, ∣↓〉 is as probable as ∣A↑〉, and ∣↑〉 is as probable as ∣A↓〉. But probabilities in 𝒜 are unchanged, as 𝒜 is untouched by the swap, so the probabilities p↑ and p↓ in 𝒮 must have been exchanged.
To prove equiprobability, we now swap records in 𝒜:

(∣↓〉∣A↑〉 + ∣↑〉∣A↓〉)/√2 → (∣↓〉∣A↓〉 + ∣↑〉∣A↑〉)/√2.   (8)
That swap restores the original preswap state. Hence all predictions about 𝒮, including probabilities, must be as they were in the original state. Evidently, the probabilities of ∣↑〉 and ∣↓〉 (and of ∣A↑〉 and ∣A↓〉 for that matter) are exchanged yet unchanged. Therefore, they must be equal to ½. For N envariantly equivalent alternatives, it is straightforward to show that the probabilities are all 1/N. The discussion of envariance in the decoherence context implies that those probabilities are unchanged when the coefficients of the alternatives are multiplied by arbitrary phases.
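The swap–counterswap argument is easy to check numerically. In this NumPy sketch, qubits stand in for 𝒮 and 𝒜: for the even state, a swap in 𝒮 followed by a counterswap in 𝒜 restores the state exactly (so p↑ = p↓ = ½), while for an uneven state it does not:

```python
import numpy as np

up, down = np.array([1, 0], complex), np.array([0, 1], complex)
swap = np.array([[0, 1], [1, 0]], complex)   # |up><down| + |down><up|

# "even" entangled state (|up>|A_up> + |down>|A_down>)/sqrt(2)
psi = (np.kron(up, up) + np.kron(down, down)) / np.sqrt(2)

# swap in S, then counterswap in A, restores the even state exactly ...
out = np.kron(np.eye(2), swap) @ (np.kron(swap, np.eye(2)) @ psi)
assert np.allclose(out, psi)

# ... so p_up and p_down are exchanged yet unchanged: both equal 1/2
rho = psi.reshape(2, 2) @ psi.reshape(2, 2).conj().T
assert np.allclose(np.diag(rho).real, [0.5, 0.5])

# for an uneven state the two swaps do NOT restore the original
phi = 0.6 * np.kron(up, up) + 0.8 * np.kron(down, down)
out2 = np.kron(np.eye(2), swap) @ (np.kron(swap, np.eye(2)) @ phi)
assert not np.allclose(out2, phi)
```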
Instead of subjective ignorance à la Laplace, I invoked an objective symmetry of entanglement, a quantum ingredient absent in Laplace’s classical setting. As with the uncertainty principle (knowing position precludes knowing momentum), the indeterminacy of outcomes was a consequence of knowing something else—the whole entangled state. The objective indeterminacy of 𝒮 or 𝒜 and the equiprobability of ∣↑〉 and ∣↓〉 follow.
For an uneven ∣ϕ𝒮𝒜〉 = α∣↑〉∣A↑〉 + β∣↓〉∣A↓〉, swaps on 𝒮 and 𝒜 yield β∣↑〉∣A↑〉 + α∣↓〉∣A↓〉. That’s not the preswap state, and, indeed, p↑ and p↓ are not equal. To see how Born’s rule arises for the uneven case, turn to the box on page 47.
Information interlude
Decoherence builds on John von Neumann’s analysis of measurement¹ but begins to recognize the role of the environment. Its usual implementation, however, relies on Born’s rule, axiom 5, to justify the physical significance of reduced density matrices. We now have a simple yet fundamental demonstration of Born’s rule. The next goal is to understand the emergence of objective classical reality in our quantum universe. As I will discuss below, environments do more than decohere; they act as communication channels through which we obtain our information.
Pointer states preserve correlations, in particular between a system and a measuring apparatus. The one-to-one correspondence of states of 𝒮 and 𝒜, which is evident in equations (5) and (6), does not rely on Born’s rule. However, quantifying the information 𝒜 has about 𝒮 relies on the interpretation of the reduced density matrices as statistical mixtures of their eigenstates with probabilities (in the case of equation (6)) given by p↑ = p𝒜↑ = ∣α∣², p↓ = p𝒜↓ = ∣β∣². Now that Born’s rule has been justified, the reduced density matrix may be used with confidence to calculate the entropy and information needed to study what I call quantum Darwinism.
The entropies of 𝒮, 𝒜, and the composite 𝒮𝒜 are given by the von Neumann expression H(ρ) = −Tr(ρ ln ρ). For the reduced density matrix of equation (6), all three entropies are, in fact, equal:

H𝒮 = H𝒜 = H𝒮𝒜 = −∣α∣² ln ∣α∣² − ∣β∣² ln ∣β∣².
That equality means 𝒮 and 𝒜 know each other’s preferred states perfectly. It’s as if one had two identical copies of the same book; each individual copy would reveal the information content of the two books. How much two systems know about each other is quantified by the so-called mutual information⁹

I(𝒮 : 𝒜) = H𝒮 + H𝒜 − H𝒮𝒜.   (9)

When 𝒮 and 𝒜 are totally uncorrelated, ρ𝒮𝒜 = ρ𝒮ρ𝒜, H𝒮𝒜 = H𝒮 + H𝒜, and I(𝒮 : 𝒜) = 0. For the perfectly correlated case corresponding to equation (6), I(𝒮 : 𝒜) = H𝒮 = H𝒜.
In a classical world, I(𝒮 : 𝒜) ≤ min(H𝒮, H𝒜). After all, the information common to two books cannot exceed the content of the smaller book. Thus the decohered reduced density matrix of equation (6) saturates the classical limit.
Quantum correlations can be stronger. Entanglement correlates every basis—for example, (∣↑↑〉 + ∣↓↓〉)/√2 = (∣→→〉 + ∣←←〉)/√2. Decoherence that favors the pointer states ∣↑〉 and ∣↓〉 yields ρ = (∣↑↑〉〈↑↑∣ + ∣↓↓〉〈↓↓∣)/2. Pointer states remain correlated, but the ∣→〉 and ∣←〉 states do not. The mutual information reflects that state of affairs: For a pure, entangled 𝒮𝒜 whole, α∣↑〉∣A↑〉 + β∣↓〉∣A↓〉, H𝒮𝒜 = 0, whereas H𝒮 = H𝒜 = −∣α∣² ln ∣α∣² − ∣β∣² ln ∣β∣², so I(𝒮 : 𝒜) = 2H𝒮.
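The entropy bookkeeping above can be verified directly. In this NumPy sketch (the amplitudes α = 0.6, β = 0.8 are arbitrary), the pure entangled 𝒮𝒜 state yields I(𝒮 : 𝒜) = 2H𝒮, while the decohered mixture saturates the classical limit I(𝒮 : 𝒜) = H𝒮:

```python
import numpy as np

def entropy(rho):
    # von Neumann entropy H(rho) = -Tr(rho ln rho)
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log(ev)).sum())

def ptrace(rho, keep):
    # partial trace of a two-qubit density matrix; keep = 0 (S) or 1 (A)
    r = rho.reshape(2, 2, 2, 2)          # indices (s, a, s', a')
    return (np.trace(r, axis1=1, axis2=3) if keep == 0
            else np.trace(r, axis1=0, axis2=2))

alpha, beta = 0.6, 0.8
psi = np.zeros(4, dtype=complex)
psi[0], psi[3] = alpha, beta             # alpha|up,A_up> + beta|down,A_down>

pure = np.outer(psi, psi.conj())         # entangled pure state: H_SA = 0
mixed = np.diag([alpha**2, 0, 0, beta**2]).astype(complex)  # after decoherence

info = {}
for label, rho in [("entangled", pure), ("decohered", mixed)]:
    HS, HA = entropy(ptrace(rho, 0)), entropy(ptrace(rho, 1))
    info[label] = HS + HA - entropy(rho)

HS = entropy(ptrace(pure, 0))
assert np.isclose(info["decohered"], HS)       # classical limit: I = H_S
assert np.isclose(info["entangled"], 2 * HS)   # quantum: I = 2 H_S
```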
Quantum Darwinism studies the role of the information about the system that proliferates and spreads throughout the environment in the emergence of the classical. Mutual information is its essential tool. When I(𝒮 : 𝒜) = H𝒮, an apparatus can fully reveal the state of 𝒮. In quantum Darwinism, a fragment ℱ of the environment plays the role of 𝒜. Its correlation with 𝒮 will often be effectively classical, as the rest of the environment (denoted ℰ\ℱ) assures decoherence.
Quantum Darwinism
We all monitor our world indirectly, eavesdropping on the environment. For instance, you are now intercepting a fraction of the photons scattered from this page. Anyone intercepting other fractions will see the same images. Quantum Darwinism recognizes that environments consist of many subsystems, as illustrated in figure 3, and that observers acquire information about a system by intercepting copies of its pointer states deposited in fragments of the environment.
Figure 3. Environmental fragments as witnesses. (a) The decoherence paradigm distinguishes between a system 𝒮 and its environment ℰ. (b) The environment, in turn, may be viewed as a collection of many subsystems ℰi. (c) Quantum Darwinism recognizes that subsystems combined into environmental fragments ℱj can act as measuring devices that store information about the system.
The environment-induced superselection associated with decoherence has already hinted at survival of the fittest: Environments select pointer states that survive and can aspire to classicality. Quantum Darwinism goes beyond mere survival to address proliferation—how, during the course of decoherence, copies of pointer states of 𝒮 or 𝒜 get imprinted on ℰ.¹⁰ For an environment comprising many subsystems (formally, an environment expressible as a tensor product of subsystem Hilbert spaces), the initial state (α∣↑〉 + β∣↓〉)∣ε0(1)ε0(2)ε0(3) …〉 evolves into

∣Υ𝒮ℰ〉 = α∣↑〉∣ε↑(1)ε↑(2)ε↑(3) …〉 + β∣↓〉∣ε↓(1)ε↓(2)ε↓(3) …〉.   (10)
The state ∣Υ𝒮ℰ〉 represents many records inscribed in environmental fragments. As a consequence, the state of 𝒮 can be found out by many observers—independently and without disturbing 𝒮. That redundancy is how evidence of objective existence arises in our quantum world.
An environment fragment ℱ acts as an apparatus with a possibly incomplete record of 𝒮. When ℰ\ℱ is traced out, 𝒮ℱ decoheres, and the reduced density matrix describing the joint state of 𝒮 and ℱ is

ρ𝒮ℱ = ∣α∣²∣↑〉〈↑∣∣ℱ↑〉〈ℱ↑∣ + ∣β∣²∣↓〉〈↓∣∣ℱ↓〉〈ℱ↓∣,
in close analogy with equation (6). When 〈ℱ↑∣ℱ↓〉 = 0, ℱ contains a perfect record of the preferred states of the system.
The number of copies of the data in ℰ about pointer states is the measure of objectivity; it determines how many times information about 𝒮 can be extracted from ℰ. The central question of quantum Darwinism is thus, What fraction of ℰ does one need to sample if the goal is to find out about 𝒮? Mutual information provides the answer. Let #ℰ denote the number of subsystems and #ℱ be the number of subsystems in a fragment ℱf that makes up a fraction f = #ℱ/#ℰ of ℰ. Then I(𝒮 : ℱf) = H𝒮 + Hℱf − H𝒮ℱf is the information about 𝒮 available from ℱf.
In principle, each individual subsystem might be enough to reveal the state of 𝒮. In that case, I(𝒮 : ℱf) would jump to H𝒮 at f = 1/#ℰ. Usually, however, larger fragments of ℰ are needed to find out enough about 𝒮. The red curve in figure 4 shows how, after an initial sharp rise, I(𝒮 : ℱf) only gradually approaches the classical plateau at H𝒮. As illustrated in the figure, the initial rise is completed at a fraction fδ, defined with the help of the information deficit δ observers tolerate:

I(𝒮 : ℱfδ) ≥ (1 − δ)H𝒮.
Figure 4. Information about a system contained in a fraction f of the environment. The red curve shows a typical result for the mutual information gained via decoherence. (The text gives a precise definition.) Its rapid rise means that a large fraction (1 − δ) of classically accessible information can be revealed by a small fraction (fδ) of the environment. The long classical plateau signifies that additional environmental fragments merely confirm what was already known. The rather different green curve shows the information in the environment for a randomly selected pure state in the system–environment composite.
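The plateau can be reproduced in a toy model (a NumPy sketch, not the photon-scattering model of the text): a system qubit whose pointer states are perfectly recorded by N = 6 environment qubits. The mutual information I(𝒮 : ℱf) then sits at the classical plateau H𝒮 for every fragment from one subsystem up to all but one, and jumps to 2H𝒮 only when the final subsystem is included:

```python
import numpy as np

def entropy(rho):
    # von Neumann entropy H(rho) = -Tr(rho ln rho)
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log(ev)).sum())

def reduced(psi, keep, n):
    # reduced density matrix over the qubits listed in `keep` (n qubits total)
    drop = [q for q in range(n) if q not in keep]
    m = np.transpose(psi.reshape((2,) * n), keep + drop)
    m = m.reshape(2 ** len(keep), 2 ** len(drop))
    return m @ m.conj().T

alpha, beta = 0.6, 0.8       # uneven superposition, |alpha|^2 + |beta|^2 = 1
N = 6                        # environment subsystems; qubit 0 is the system S
n = N + 1
psi = np.zeros(2 ** n, dtype=complex)
psi[0] = alpha               # branch |0>|00...0>: N perfect records of "up"
psi[-1] = beta               # branch |1>|11...1>: N perfect records of "down"

HS = entropy(reduced(psi, [0], n))
I = [0.0]                    # empty fragment carries no information
for k in range(1, N + 1):
    F = list(range(1, 1 + k))                      # fragment: first k subsystems
    I.append(HS + entropy(reduced(psi, F, n))
                - entropy(reduced(psi, [0] + F, n)))

print([round(x / HS, 3) for x in I])   # [0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 2.0]
```

With perfect records the rise to the plateau is immediate (fδ = 1/#ℰ); imperfect records, with 〈ε↑∣ε↓〉 ≠ 0, would smooth the initial rise into the gradual climb shown by the red curve.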
The inverse of fδ is the number of records in the environment—the redundancy, Rδ. It sets the upper limit on how many observers can find out the state of 𝒮 independently and indirectly. In several models that have been studied¹⁰ (and in particular, for the photon-scattering model of decoherence¹¹), Rδ is huge¹² and varies weakly (that is, logarithmically) with δ.
Decoherence can, under the right conditions, lead to “quantum spam” as Rδ imprints of pointer states are broadcast through the environment. Many observers can independently access those imprints, which ensures the objectivity of pointer states of 𝒮.
Repeatability is key. Collectively, the environmental fragments act like the apparatuses posited in connection with equations (1) and (2); they register multiple records of pointer states of 𝒮 without altering them. The no-cloning theorem restricts the ability to make copies, but copying is possible when the states to be copied are all orthogonal (see Physics Today, February 2009, page 76).
Repeatability thus begets discreteness. The time evolution responsible for decoherence yields a superposition of distinct branches, each with a stable state and many environmental imprints, per equation (10). So there is no literal collapse. However, as a result of decoherence by ℰ\ℱ, an observer monitoring the records imprinted on fragments of ℰ will see only one branch, not a superposition of branches. Such evidence will suggest a quantum jump from a superposition of states to a single outcome (or, under appropriate circumstances, from state to state), in accord with postulate 4b.
Environment as witness
Quantum Darwinism shows why it is so hard to undo decoherence. As illustrated in figure 4, a plot of mutual information for an initially pure 𝒮 and ℰ is antisymmetric about the point (f = ½, H𝒮).¹⁰ Hence, a counterpoint of the initial quick rise of the red curve at f ≤ fδ is a quick rise at f ≥ 1 − fδ as the last few subsystems of ℰ are included in the fragment ℱ that by now contains nearly all of ℰ. Such a rise must occur in an isolated 𝒮ℰ, because an initially pure 𝒮ℰ remains pure under unitary evolution.
For the system–environment whole, H𝒮ℰ = 0, so I(𝒮 : ℱf) at f = 1 must reach 2H𝒮. Thus a measurement of all of 𝒮ℰ could confirm a state’s purity despite the decoherence caused by ℰ\ℱ for all f ≤ 1 − fδ. (In principle, a measurement of ℰ alone reveals the state; the measurement of 𝒮 confirms that revelation.) However, such a confirmation would require intercepting and measuring all of 𝒮ℰ in a way that reveals the pure state without perturbing it. So undoing decoherence is possible in principle, but the required resources and foresight preclude it.
In quantum Darwinism, the decohering environment acts as an amplifier, inducing a branch structure that is distinct from randomly selected states in the Hilbert space of 𝒮ℰ. For those generic states, as the green plot in figure 4 shows, the mutual information has no plateau and so the environment registers no redundancy.¹³ The plot is still antisymmetric: I(𝒮 : ℱf) jumps at f = ½ to nearly 2H𝒮.
Not all environments are good witnesses. However, photons excel: They do not interact with air or with each other, and so they faithfully pass on information. A small fraction of a photon environment usually reveals all an observer needs to know. The scattering of sunlight quickly builds up redundancy. For example, when photons scatter off a 1-µm-diameter dielectric sphere in a superposition of states 1 µm apart, the redundancy Rδ (for δ = 0.1) increases by about 10⁸ every microsecond.¹² Air is also good at decohering, but its molecules interact and scramble acquired data. Objects of interest scatter both air molecules and photons, so both environments acquire information about position and favor similar localized pointer states.
Environments, like air, that decohere 𝒮 but scramble information because of interactions between subsystems eventually lead to a random state in 𝒮ℰ. Quantum Darwinism is possible only when information about 𝒮 is preserved in fragments of ℰ and so can be recovered by observers. Absolute perfection is not necessary. Partially mixed environments or imperfect measurements correspond to noisy communication channels that, despite their depleted capacity, can still deliver the message.¹⁴
Information and objective reality
John Wheeler, Charles Bennett, and others have previously considered the relation between information and existence.¹⁵ Quantum Darwinism adheres to the quantum credo and adds to that discussion by recognizing that a decohering environment can be a communication channel. But since observers intercept only fractions of ℰ, information about 𝒮 is only accessible when it is redundantly imprinted on ℰ. Put another way, an observer can get information only about pointer states that remain intact despite monitoring by ℰ: Using the environment as a communication channel comes at the price of censorship. Fractions of ℰ reveal branches one at a time and suggest quantum jumps.
The basic tenets of decoherence have been confirmed by experiment,¹⁶ and it may also be possible to test quantum Darwinism; envariance is already being tested.¹⁷
The list of textbook axioms has now been reduced, as the Hermitian nature of observables and Born’s rule follow from the quantum credo. Accounting for collapse goes beyond mathematics, as it involves perception. That is where quantum physics gets personal. Nevertheless, the indirect monitoring of quantum systems recognized by quantum Darwinism implies that after their first glimpse of data in ℰ, observers will get only confirmations and updates. So the first glimpse eliminates surprise—collapses it, if you will. Thereafter, as was the case in the classical world we once thought we inhabited, pointer states persist objectively, untouched by our curiosity and oblivious to our indirect monitoring.
Box. Born’s rule for uneven states
The main text considered superpositions of states involving a system 𝒮 and measuring apparatus 𝒜 and showed how the well-known Born probability rule follows for the specific case of superpositions whose coefficients have equal absolute values. Here I show how that special case leads to Born’s rule for states ∣ϕ𝒮𝒜〉 = α∣↑〉∣A↑〉 + β∣↓〉∣A↓〉 in which the coefficients are not equal in magnitude.
First, let ∣α∣²/∣β∣² = μ/ν, where μ and ν are natural numbers. The key trick is to fine-grain—that is, to change the basis in the Hilbert space of 𝒜 so that ∣A↑〉 = (1/√μ) ∑k=1…μ ∣ak〉 and ∣A↓〉 = (1/√ν) ∑k=μ+1…μ+ν ∣ak〉. Expressed in terms of that new basis,

∣ϕ𝒮𝒜〉 = (α/√μ) ∑k=1…μ ∣↑〉∣ak〉 + (β/√ν) ∑k=μ+1…μ+ν ∣↓〉∣ak〉.
Next, simplify to get rid of the fractions, and imagine an environment that decoheres 𝒜 in the new basis, so that the ∣ak〉 correlate with ∣ek〉 as if the ∣ak〉 were the preferred pointer states:

∣Φ𝒮𝒜ℰ〉 ∝ ∑k=1…μ ∣↑〉∣ak〉∣ek〉 + ∑k=μ+1…μ+ν ∣↓〉∣ak〉∣ek〉.
Now swaps of ∣↑〉∣ak〉 with ∣↓〉∣ak′〉 can be undone by counterswaps of the corresponding ∣ek〉 and ∣ek′〉, and thus all μ + ν alternatives are equally probable. Since μ of those correspond to measurements of ↑, Born’s rule follows:

p↑ = μ/(μ + ν) = ∣α∣².

Continuity establishes the result for cases in which ∣α∣² and ∣β∣² are not related by rational numbers. The frequencies of detection of ↑ and ↓ can be predicted by extending the derivation to the case of many measurements.⁷
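The counting argument is simple enough to check numerically (a sketch; μ = 9, ν = 16 are arbitrary choices): fine-graining the uneven state produces μ + ν even, equiprobable alternatives, of which μ yield ↑.

```python
import numpy as np

mu, nu = 9, 16                          # |alpha|^2 / |beta|^2 = mu/nu
alpha = np.sqrt(mu / (mu + nu))
beta = np.sqrt(nu / (mu + nu))

# fine-grained coefficients: |A_up> splits into mu states |a_k> with weight
# alpha/sqrt(mu); |A_down> splits into nu states with weight beta/sqrt(nu)
coeffs = np.concatenate([np.full(mu, alpha / np.sqrt(mu)),
                         np.full(nu, beta / np.sqrt(nu))])

# all mu + nu alternatives are even, hence (by envariance) equiprobable
assert np.allclose(np.abs(coeffs) ** 2, 1 / (mu + nu))

# mu of them correspond to the outcome "up", so Born's rule follows
p_up = mu / (mu + nu)
assert np.isclose(p_up, alpha ** 2)
print("p_up =", p_up, "= |alpha|^2")
```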
I thank Charles Bennett, Robin Blume-Kohout, Jim Hartle, Raymond Laflamme, Juan Pablo Paz, Hai-Tao Quan, Jess Riedel, Wolfgang Schleich, Max Tegmark, and Michael Zwolak for enjoyable and helpful discussions, and with appreciation I acknowledge support from the US Department of Energy and the Foundational Questions Institute.
References
1. J. von Neumann, Mathematical Foundations of Quantum Mechanics, R. T. Beyer, trans., Princeton U. Press (1955).
15. See, for example, J. A. Wheeler, in Complexity, Entropy, and the Physics of Information, W. H. Zurek, ed., Addison-Wesley (1990), p. 3; C. H. Bennett, in Quantum Computing: Back Action 2006, D. Goswami, ed., AIP Conf. Proc. Vol. 864, AIP (2006), p. 11.
Wojciech Zurek is a laboratory fellow at Los Alamos National Laboratory in Los Alamos, New Mexico, and an Albert Einstein Professor at Ulm University in Ulm, Germany. He is writing a book, tentatively titled Quantum Darwinism and Decoherence, that analyzes issues briefly considered in this article.
This content appeared in Physics Today, Volume 67, Number 10.