Archive for March, 2011

The Elusive Object

29 March 2011

Behind the curtain

The Reformed Realist
Some of Bernard d’Espagnat’s best and dearest friends might be realists.

Chapter nine of his On Physics and Philosophy, entitled “Various Realist Attempts,” describes with a perceptible tinge of sorrow how the conventional realist’s goal seems doomed to failure.

If not certainly doomed, these attempts are at least misguided, he feels, no matter how much he sympathizes with the impulse to believe in a knowable physical reality beyond the appearances.

These attempts have some difficult hurdles to jump. A successful theory should—

  1. Make the same (or almost the same) predictions as conventional quantum mechanics
  2. Respect the results of Aspect-type experiments and the Bell Theorem
  3. Show that the interpretation is more than just a calculating convenience
  4. Be more than just a reassuring linguistic reconfiguration, and
  5. Keep its conceptual building blocks pretty faithful to its roots in realism.

The last criterion isn’t absolutely necessary, but if the only way a realist theory can work is by defining common terms (such as particles) in curiously non-realist ways then the project seems a bit dubious.

Add to that the requirement to respect the Bell Theorem and (more or less) match conventional quantum theory’s predictions, which mandate nonlocality if you want physical realism, and these efforts look increasingly futile.
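For readers who haven't met the Bell Theorem's quantitative side, here is a minimal numerical sketch (the angles are the standard CHSH choices; nothing here comes from d'Espagnat's book itself). Local-realist theories bound the CHSH combination of correlations by 2, while quantum mechanics predicts up to 2√2:

```python
import numpy as np

# CHSH test: any local-realist theory obeys S <= 2, while the
# singlet-state quantum correlation E(a, b) = -cos(a - b) reaches
# 2*sqrt(2) at the standard angle settings below.
def E(a, b):
    # Quantum correlation for spin measurements at angles a and b
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2            # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)  # ~2.828, exceeding the local-realist bound of 2
```

Aspect-type experiments measure something very close to this quantum value, which is why respecting those results is such a high hurdle for realist reconstructions.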

In greater detail…

D’Espagnat’s Realism vs Near Realism
D’Espagnat very much sympathizes with realists, and says his own views don’t depart too radically from theirs. His disagreement, he explains, developed not on a priori grounds but after he pondered the evidence of physics.

Proof vs Sentiment
Physical realism is an unprovable metaphysical stance, one among many. But “nobody” believes the moon disappears when we don’t look at it, says d’Espagnat. Commonsense arguments even convinced Einstein.

Giving Up Physical Realism vs Locality
John Bell (of Bell’s Theorem fame) continued to believe in a physical reality even after his theorem and experimental data shook the foundations of physical realism.

He could have given up the idea of a physical reality knowable in principle, but instead he chose to believe this reality is nonlocal.

Description vs Synthesis
D’Espagnat makes up “Jack,” a physicist who’s a hardline physical realist. Jack believes science has succeeded magnificently on so many levels. Theories aren’t just some synthesis of observations. They are more-or-less accurate descriptions of reality (as d’Espagnat calls it, “reality-per-se”).

Senses vs Reality
Philosophers like Hume would counter that our knowledge of reality depends on our senses, yet we have no guarantee our sensations correspond with reality. Jack might call this argument overly broad as it applies to any piece of knowledge, including our ordinary experiences that we could hardly doubt.

Words vs Reality
The sceptic might then say that the results of experiments are communicated by words, but how do we know these words correspond to the building blocks of reality? Again Jack points to everyday experience and the concepts that instinctively seem to work: objects, their positions, their motions, and so on.

The hardline realist says an experiment described using these simple concepts surely must say something true about physical reality.

Strong vs Weak Objectivity
Jack the hardline realist might then lament all those physicists who claim to be realists but use standard quantum mechanics. Don’t they realize this theory is only “weakly objective”? In other words, it describes observations but doesn’t claim to describe reality itself.

Standard vs Broglie-Bohm Interpretations
D’Espagnat says Jack would be further perplexed because the Broglie-Bohm interpretation offers predictions identical to the standard interpretation’s (in the non-relativistic domain) and claims to be an explanation. It doesn’t just predict observations.

It also may offer a (partial) way out of the “and-or” problem with mixed quantum states. We’d like to show why the pointer dial doesn’t indicate multiple values at the same time.

Standard vs Broglie-Bohm Predictions
D’Espagnat notes that Broglie-Bohm’s predictions match the standard interpretation’s. The good news is that Broglie-Bohm’s predictions aren’t wrong. The bad news is the standard interpretation uses simpler mathematics and predicts so much more.

Superficial Realism vs Nonlocal Results
Though not a critical deficiency, it’s definitely odd that Broglie-Bohm starts off with concepts intuitively familiar to us such as corpuscles and trajectories but ends up predicting a nonlocal reality.

This doesn’t mean the theory is wrong, but it does mean the realist’s agenda is somewhat frustrated.

Real vs Abstract Particles
Broglie-Bohm replaces boson particles with abstract quantities (fields or their Fourier components). Photons are only “appearances,” somewhat undermining the realist model. The jury’s still out on how to deal with fermions.

Measured vs Secret Properties
Broglie-Bohm says momentum is really the product of mass and velocity even if quantum measurements show something else (see chapter seven). Also in this model detectors are sometimes “fooled,” acting as if a particle hit them even when it didn’t.

Finally, a “quantum potential,” which doesn’t diminish with distance, means “free” particles don’t really travel in straight lines.

So some aspects of reality remain experimentally out of reach, yielding only illusions, an odd position for a realist model to take.

Realism vs Observer Choices
Consider two entangled particles, one going left and one going right. The Broglie-Bohm model says in some set-ups you’ll consistently get one result if you measure the left-moving particle first, and a different result if you measure the right-moving particle first. Since the particles are entangled, the result of the second measurement matches that of the first.

The problem is that this doesn’t sound like it describes the world “as it really is” but rather just our observations. Our choices as observers seem to affect what’s “really” going on. This does not fit in very well with the realist agenda.

Relativity vs Observer Choices
It gets worse. Depending on who’s checking, the “time order” of these measurements may differ if they’re “spacelike separated” (that’s when you’d have to travel faster than light to get from one measurement event to the other). Since the instruments show the same results to every observer, are they simultaneously telling the truth and lying?

It appears you can choose a privileged space-time frame that somehow still matches the predictions of special relativity but is consistent with Broglie-Bohm too, but again we end up with all these illusory appearances and an explanation that can’t be verified (or at least distinguished from competing theories).

Bohm #1 vs Bohm #2
D’Espagnat (in a footnote) says difficulties with the Broglie-Bohm model led David Bohm to devise his “implicate order” theory, which does not rely on corpuscles. The problem is that the “implicate” order of what’s really happening is separated from the “explicate” order of appearances, and it’s hard to turn that distinction into an “ontologically interpretable” theory.

Standard vs Modal Interpretations
Borrowing modal logic’s use of intrinsic probabilities, Bas van Fraassen initiated a different approach to realist quantum mechanics that led to various related interpretations.

Wave Function vs Finer States
Standard quantum mechanics says the wave function is the best description of a quantum system. “Modal” interpretations say sometimes there are “finer” states governed by hidden variables (d’Espagnat prefers to call them “supplementary”).

Standard vs Intrinsic Probabilities
In “modal” interpretations the wave function describes the probability of various measurements but not necessarily what is “really” happening. The use of supplementary variables rescues these interpretations from the problem of proper mixtures and ensembles (see chapter eight). A system is in state A or state B even before a measurement, even if the quantum state is A + B.

Wave Function vs Value State
A system’s wave function describes observational probabilities. In a “modal” interpretation the system’s “value state” uses supplementary variables to describe what’s “really” happening.

Broglie-Bohm vs “Modal” Interpretations
“Modal” interpretations are indeterministic and Broglie-Bohm is deterministic, but they share the need for supplementary variables that are experimentally undetectable–and they produce predictions identical to the standard interpretation’s.

These realist approaches also seem to violate special relativity. Since their predictions are consistent with the standard interpretation’s they end up being nonlocal, which special relativity isn’t really equipped to handle.

Also, in some cases (say some authors) the “modal” interpretation implies the measurement dial will somehow show a value different from the predicted “observed” value. It’s as convoluted as the measurement issues in Broglie-Bohm (such as detectors’ getting false hits).

Unlike Broglie-Bohm the “modal” interpretations also get into difficulties about properties of a system and its subsystems. A subsystem can have a property even if the system itself doesn’t.

Language vs Ontology
D’Espagnat wonders if the “modal” interpretations are basically just offering a different language convention. The terms make it sound like something is “really” going on, but this alleged reality is inaccessible to observers, and “modal” interpretations make the same predictions as the standard interpretation of quantum mechanics.

Schrödinger vs Heisenberg Representations
Yet another approach makes use of the Heisenberg representation. Its equations are supposedly more realism-friendly than Schrödinger’s wave function.

Time-dependent vs Time-independent Equations
In both representations dynamical quantities (position and velocity, for instance) are represented by “self-adjoint operators.”

In the Schrödinger representation the self-adjoint operators are time independent; it’s the wave function that evolves, right up until a measurement is made. The wave function does double duty, describing states and then knowledge.

The Heisenberg representation does things differently. Its self-adjoint operators are time dependent–so maybe they describe “real” states that are evolving through time.

Heisenberg Representation vs Contingent States
The problem is that the self-adjoint operators in the Heisenberg representation, though designating dynamical quantities, refer to all possible values of those quantities. You have to specify initial values if you want the measurement to be a “mental registration” rather than a “creation” of those values.

Just as bad, the best way to specify those initial conditions is by using the wave function.

Heisenberg vs Schrödinger Operators
D’Espagnat says that in the end the self-adjoint operator has too modest a scope in the Heisenberg representation. It does not label contingent states.

In the Schrödinger representation there’s the opposite problem. The self-adjoint operator’s role there is too ambitious. It labels the initial state as it “really” is, which leads to the problems of the measurement collapse.

Feynman’s Reformulation vs Physical Realism
D’Espagnat says high-energy physicists mostly see physical realism as self-evident. Richard Feynman’s “fabricated ontology” greatly eases their calculations, and apparently eases many philosophical doubts too.

Probabilities with Detectors vs without Detectors
In standard quantum mechanics the probability amplitude indicates how likely one would find a particle (for instance) at a particular spot if there were a detector there.

Feynman’s leap was to interpret it as how likely a particle would “arrive” at a certain point–whether or not there was a detector there.

Being vs Calculating
So is this “arrival” (which means that it “is,” however briefly, at that point) an ontological claim or is it just a calculating convenience? D’Espagnat says Feynman knew quite well the problems of interpreting quantum mechanics but was “absolutely reluctant” to talk about them.

Since fringes in a double-slit experiment show up, clearly this way of speaking is just for predictive purposes. If a particle “really arrived” at one slit or the other there’d be no fringes on the detector screen. In fact, the older quantum field theory and the Feynman diagram approaches “are quite strictly equivalent.”

This means they both support the nonlocality hypothesis.

Standard vs Non-Boolean Logic
Quantum mechanics’ formalism uses Hilbert space. This infinite-dimensional abstract space leads some to suggest a non-Boolean logic would rescue objectivist realism.

Formalism vs Experimental Facts
However, d’Espagnat says that this reformulation has no more ontological significance than Feynman’s approach. Nonseparability and nonlocality remain as issues since these are experimental facts not dependent on the formalism. Using a kind of quantum logic can’t on its own describe microsystems in realist terms.

Standard vs Partial Logics
Griffiths, Gell-Mann and Hartle, and Omnès have tried using “partial logics” and “decohering histories.” D’Espagnat says that this approach (like the non-Boolean approach) reformulates quantum mechanics but doesn’t change its predictions. The experimental facts remain a barrier to objectivist realism.

Macroscopic Reality vs Microscopic Unreality
Because of experimental results (such as Aspect’s combined with the Bell inequalities) it’s clear that the microscopic arena is not going to yield to some “strongly objective” form of realism. The challenge then becomes figuring out how “real” macroscopic entities could possibly be made up of “unreal” microscopic constituents.

Existence vs Meaning
One approach is to deflect the question. Decoherence describes a mechanism by which macroscopic objects have a certain (physical-looking) appearance—but not existence as such. Maybe we can create Dummett-like criteria (see chapter seven) for determining just the meaning (“signification”) of statements about macroreality (but not microreality).

Entities vs Observability
If you’re going to make meaningful statements about macroscopic reality then it would help if you could define macroscopic entities. This is surprisingly difficult. One attempt uses statistical mechanics’ concept of “irreversibility” because human observational skills are limited.

D’Espagnat says this approach doesn’t necessarily sit well with a realist. After all, the general goal of realist approaches is to describe reality (to some degree of accuracy) through our own observations.

Schrödinger’s Cat vs Laplace’s Demon
Decoherence theory says that our inability to make precise measurements of complex systems creates the illusion of macroscopic reality. So what do we do about this limitation? We could imagine some version of Laplace’s demon who’s able to make precise measurements of all physical quantities in the universe.

We could then try to determine if he sees Schrödinger’s cat as simultaneously dead and alive—or just one or the other, as humans do because of their limited observational acuity. This would tell us what’s “really” going on.

But how powerful should this demon be? Let’s assume he can’t use an instrument made up of more atoms than the universe possesses. Some physicists then calculate that even Laplace’s demon couldn’t observe the complex quantum superpositions theoretically observable in macroscopic objects.

The “meaningful” conclusion is that these complex quantities are “nonexistent” and therefore the Schrödinger cat problem disappears.

Realism vs Human Decisions
But can a supposed reality depend on the capabilities of an observer (human or otherwise)? Even more fundamentally, mathematical representations of quantum ensembles (see chapter eight) are compatible with an infinite number of physical representations. Why is just one representation chosen?

In the end it seems this kind of realist argument ends up describing an empirical reality, not a meaningful approximation of an observer-independent reality.

Linear vs Nonlinear Terms
You can trace the “conceptual difficulties” of quantum mechanics back to the mathematical linearity of the formalism. Unsurprisingly, some realists might consider adding terms to make the mathematics nonlinear.

These new terms have almost no effect on observational predictions but allow a profound conceptual leap when it comes to macroscopic objects. Their centre-of-mass wave function will now collapse frequently and spontaneously, so there’s no more “measurement collapse.”

Relativity vs Nonlinear Realism
Nonlocality is still an issue, even though we’re talking about faster-than-light “influences” instead of signalling. The realist might retort that standard quantum mechanics runs into the same problem, but d’Espagnat says it’s the demand for realism that prevents relativity and quantum mechanics from being compatible.

Decoherence vs Nonlinear Realism
Decoherence theory and approaches based on nonlinear terms make essentially identical predictions. However, decoherence theory says macroscopic objects are just phenomena. We share this knowledge and call it “empirical reality.” Nonlinear realism believes these objects are “real.”

D’Espagnat wonders why we even need nonlinear terms considering that according to conventional (that is, linear) quantum mechanics any macroscopic object with quantum features quickly goes through decoherence and ends up showing classical features.

Appearance vs Reality
So you don’t need nonlinear terms unless you want macroscopic objects not just to “appear” the way they do but also “really” to be like that.

Verbalism vs Reality
D’Espagnat is unimpressed by these ontological manoeuvres. He rhetorically asks if this is “some kind of a poor man’s metaphysics” amounting to little more than “pure verbalism.”

Open Realism vs Commonsense Realism
Yet D’Espagnat is not prepared to abandon realism altogether. He believes in a “veiled reality” that can be gently prodded through an approach he calls “open realism.”

But for realism to be consistent with the results of quantum experiments the reality that’s allowed is far different from the “commonsense” reality of the man in the street, or even that of many hard-nosed physicists.

Measuring the Decoherence

4 March 2011

Realistically Speaking
Chapter eight of Bernard d’Espagnat’s On Physics and Philosophy is entitled, “Measurement and Decoherence, Universality Revisited.”

In some ways it was a very dense and difficult chapter to read (and summarize). However, in the end the main points seemed reasonably clear:

  1. Quantum universalism and our perceptions of macroscopic reality at first appear to clash
  2. A macroscopic object easily shifts between its numerous, closely spaced energy levels under the slightest influence from its environment
  3. Therefore it’s almost impossible to measure the exact quantum states of macroscopic objects
  4. Our lack of knowledge about large-scale systems in “decoherent” states leads to the apparent stability of the macroscopic world
  5. However, on the microscopic level a “realistic” interpretation of superpositions only works if a system includes unmeasurable components or we restrict what measurements we’ll make.

There’s a lot of material in this chapter so one could easily come up with some other highlights. In any event, here are my impressions of the chapter in greater detail…

Realist Statements vs Realist Philosophy
Instead of saying “I see a rock on the path” one could say “I know that if I looked at the path to check whether I would get the impression of seeing a rock there, I would actually get that impression.”

That would be cumbersome so we use “realistic” statements even if we don’t believe in hard-line realism. If we switch back to the microscopic realm realist-like statements might mislead.

Macroscopic Realism vs Quantum Universalism
If we assume quantum formalism is universal, then why don’t we see a rock in two places at the same time?

Macroscopic realism says macroscopic objects have mind-independent forms located in mind-independent places. So even before we look at it, a measuring device’s pointer will point to one and only one part of the dial.

A macroscopic state-vector therefore can’t be a quantum superposition A + B, and hence we can’t see a rock in two places at the same time.

Schrödinger Equation vs Macroscopic Realism
The problem is that the Schrödinger equation will often demand such a superposition. Realists respond by using something other than state-vectors to describe macroscopic objects.

D’Espagnat says that he showed (in 1976) that such attempts will fail, and a somewhat more general proof was found by Bassi and Ghirardi (in 2000).

Antirealism vs Macroscopic Realism
A different approach is to follow Plato and Kant. The senses are unreliable and deceive us. There’s no distinction between Locke’s reliable “primary” qualities and the less reliable “secondary” qualities.

The only certainties are the quantum rules that predict our observations. All else is uncertain.

Probability vs Determinism
However, we don’t experience the world as a sequence of probabilistic predictions. We picture objects with definite forms, and we can predict the behaviour of these objects using classical laws that are deterministic.

Textbook Realism vs Quantum Predictive Rules
Part of the problem is that textbooks talk about the mathematics (including symbols for wave forms) as if they represent physical states that “exist” whether or not we’re taking a measurement.

D’Espagnat notes the same old difficulties of realist interpretations will then reappear. He says symbols for the wave forms and other values should instead represent “epistemological realities.” They signify possible knowledge once the observer makes an observation.

In other words, the quantum rules predict observations, they don’t describe unobserved realities.

Absorbed vs Released Particles
In chapter four d’Espagnat assumed that a measured electron gets absorbed by the measuring instrument. In practice this rarely happens.

If the electron gets released, then the instrument and the electron form a “composite system.” Instrument and electron are “entangled” (in the quantum sense).

Composite States vs Measurements
If an electron is in a quantum superposition of two states, the instrument dial shows just one of those states (which you can confirm by using a second instrument to measure the first instrument).

If you test an “ensemble” of identical states all at once then some of your instruments will show one state while others will show the other state.

Note that the measurement points to the state of the electron after it’s measured, not before.

Measurements vs Quantum Collapse
Some physicists who won’t accept “weak objectivity” or mere “empirical reality” see the measurement process as “collapsing” a “real” wave function.

Quantum Collapse vs Quantum Universality
A quantum collapse is a “discontinuous” break from the evolution described by the (differential, hence continuous) Schrödinger equation.

If the quantum laws are universal, then what’s so special about a measuring instrument to produce this collapse?

Moveable Cuts vs Realism
Using the “von Neumann chain” idea, one can predict observations by placing a “cut” between observer and observed at various points. There’s nothing special about one particular instrument.

The cut may be placed between a measuring instrument and the particle, or between a second instrument (measuring the first instrument) and the first, or between a third instrument and the second, and so on.

Von Neumann showed that the results will be the same no matter where this cut is placed.

The problem is that the realist believes in a mind-independent reality, so presumably this cut should be in one and only one place. The collapse of a quantum system shouldn’t be at the whim of the observer (and his mind!).

Longing for Realism vs the Practice of Operationalism
D’Espagnat says a lot of physicists suffer from a kind of logical “shaky balance.” They want to believe in realism but in their working methods they use “operational” methods (which therefore don’t require a belief in realism).

Schrödinger’s Cat vs Quantum Superposition
Getting back to the composite system of instrument and electron, if the electron was prepared in a superposition of two states, then the composite system is represented by aA + bB. The small letters represent the “states” of the electron, and the big letters represent the states of the instrument.

But the measuring instruments will point to A or B on the dial, not both at the same time. Schrödinger imagined a cat that’s dead or alive depending on the results of the experiment.

We don’t see an instrument pointing to two parts of the dial simultaneously, nor can we imagine the cat is both dead and alive simultaneously.
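The composite state aA + bB can be made concrete with a toy calculation (the amplitudes and the two-level “pointer” are illustrative assumptions, not anything from the chapter). The joint statistics show only the correlated outcomes–never a pointer indicating both values at once:

```python
import numpy as np

# Toy composite system: electron states |0>, |1> entangled with
# pointer states |A>, |B> as a|0,A> + b|1,B>. The amplitudes a, b
# are arbitrary illustrative choices.
a, b = np.sqrt(0.3), np.sqrt(0.7)

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
psi = a * np.kron(ket0, ket0) + b * np.kron(ket1, ket1)  # a|0,A> + b|1,B>

# Joint outcome probabilities over (electron, pointer):
probs = (np.abs(psi) ** 2).reshape(2, 2)
print(probs)
# Only the correlated outcomes (0, A) and (1, B) occur, with
# probabilities |a|^2 = 0.3 and |b|^2 = 0.7; the pointer never
# indicates both dial positions simultaneously.
```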

Quantum Superposition vs Probabilities
The measuring instruments will show one result each time. Quantum rules predict the probability that a particular result will be seen, not that several results will be seen at the same time.

Probabilities vs Ensembles
To test probabilities we can create a really large ensemble of identical conditions and see what results we get. Imagine we create a whole lot of composite systems with an entangled electron and measuring instrument.

On each of those instrument dials we’ll measure one result or another, not both, and not something in between.

Identical States vs a “Proper” Mixture
Staying with the electron that was prepared as a superposition of states, we calculate a percentage probability that we’ll measure that electron as “being” in one specific “state” and another probability it’ll “be” in another “state.”

What if, instead of a large number of identical states and identical measuring instruments, we prepare some electrons in one state and the others in the other state, in proportions given by the predictions for the superposed state?

If we then just measure, say, position, we’ll get (approximately) the same results as predicted for the superposition of states. But if we try measuring something other than position our results may violate these predictions.

So unless we ignore everything but position, measurements on our ensemble of electrons in superposed states will differ from our proper mixture of electrons in pure quantum states.
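This superposition-versus-proper-mixture difference is easy to see in the density-matrix formalism. Here is a minimal sketch for a single two-state system (the 50/50 amplitudes are an illustrative choice): both ensembles agree if we only measure in one basis, but a complementary basis tells them apart:

```python
import numpy as np

# (i) An ensemble of identical superpositions (|0> + |1>)/sqrt(2),
# versus (ii) a "proper" 50/50 mixture of |0> and |1>.
ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)

rho_super = np.outer(plus, plus)                            # pure superposition
rho_mix = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

# Measuring in the |0>/|1> basis: identical 50/50 statistics.
print(np.diag(rho_super), np.diag(rho_mix))  # [0.5 0.5] for both

# Measuring in the complementary |+>/|-> basis separates them:
p_plus_super = plus @ rho_super @ plus  # 1.0: superposition is always |+>
p_plus_mix = plus @ rho_mix @ plus      # 0.5: the mixture stays 50/50
print(p_plus_super, p_plus_mix)
```

The off-diagonal terms of `rho_super` carry the difference; `rho_mix` has none, which is exactly what the second kind of measurement detects.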

Coherent vs Decoherent Measurements
Imagine we measure an entangled system of an electron (with states in superposition) and an atom. Then an ensemble of identical superposed states cannot be approximated by a “proper mixture” of separate pure states.

But if the atom and electron interact with a molecule that is too complex to measure, our measurements of the electron–atom system will be the same whether we measure an ensemble of identical states or a proper mixture.

The system has become “decoherent.”

Electron–Instrument vs Electron–Instrument–Environment Systems
It’s already hard enough to measure the “state” of an electron using an instrument. If we try to measure the “state” of the electron and the instrument in relation to the environment then we have a big problem.

Macroscopic vs Microscopic Energy Levels
A macroscopic object’s energy levels are very close to each other, so a very small disturbance from its environment (or its internal constituents) will shift its energy level.

Measurement Imprecision vs Quantum Precision
There is thus so much environmental influence on an instrument that we cannot measure the “state” of the instrument and electron as a system in the same way we were able to measure just the “state” of the electron.

That’s why we can’t perform an experiment similar to our earlier one that found differences between measurements on the ensemble of superposed states and the proper mixture of separate pure states.

Therefore an instrument pointer, which is a macroscopic object, will act like it’s in a single state, not a superposition.

Ensembles vs Double-slit Experiments
In the “Young slit experiment” we imagine a particle source, a barrier with two slits, and a detector screen (see chapter four). Normally the screen would show fringe-like patterns because of the quantum system’s wavelike nature.

However, if you add a dense gas to the area between the barrier and the detector screen then you’ll just see two “blobs,” therefore showing no evidence of wave-like interference.

The molecules in front of the screen are analogous to the molecules that are near an electron–atom system. The molecules form part of a system but are not themselves measured. In both cases we lose the effects of superposition.
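A rough numerical sketch of the two cases (the screen geometry and phases are invented for illustration): with coherence intact the two-slit amplitudes interfere; once the environment records the path, the cross term drops out and the fringes vanish:

```python
import numpy as np

# Far-field double-slit sketch: the two paths acquire a relative
# phase across the screen. With coherence the intensity is
# |psi1 + psi2|^2 (fringes); when the gas records which slit was
# taken, the cross term is lost and we get |psi1|^2 + |psi2|^2
# (two featureless blobs).
x = np.linspace(-1, 1, 9)  # screen positions (arbitrary units)
phase = 10 * x             # relative phase between the two paths

psi1 = np.ones_like(x) / np.sqrt(2)
psi2 = np.exp(1j * phase) / np.sqrt(2)

coherent = np.abs(psi1 + psi2) ** 2                # 1 + cos(phase): fringes
decohered = np.abs(psi1) ** 2 + np.abs(psi2) ** 2  # flat: no fringes

print(coherent.round(2))
print(decohered.round(2))  # constant 1.0 everywhere
```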

Independent vs Empirical Reality
Because the insertion of unmeasurable molecules prompts us to infer distinct beams with distinct states (corresponding to the “top” or “bottom” slit), this shows how decoherence creates the illusion of a macroscopic reality.

D’Espagnat acknowledges it’s a bit artificial to make this distinction since we know about the particle source. But it reminds us that decoherence is what provides the illusion of an independent reality, although it’s really just an “empirical” reality.

Entanglement vs Reduced States
If one system gets “entangled” with another (such as an electron with an atom) then each system loses its own distinct wave function. There’ll now be a wave function for the combined system.

But the quantum formalism allows some information about the original system to be recovered if we imagine a large ensemble of its replicas. The mathematics that represents this is called a “reduced state.”
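A small sketch of what a “reduced state” looks like in practice (the electron–atom pair is modelled here as two two-state systems in a maximally entangled state, an illustrative choice): tracing out the unmeasured partner yields a density matrix for the electron alone, with no off-diagonal coherence left:

```python
import numpy as np

# For an entangled pair in (|00> + |11>)/sqrt(2), neither subsystem
# has its own wave function, but the partial trace over the partner
# gives a "reduced state" reproducing all statistics of measurements
# on the electron alone.
ket00 = np.array([1.0, 0.0, 0.0, 0.0])
ket11 = np.array([0.0, 0.0, 0.0, 1.0])
psi = (ket00 + ket11) / np.sqrt(2)

rho = np.outer(psi, psi).reshape(2, 2, 2, 2)
rho_electron = np.trace(rho, axis1=1, axis2=3)  # trace out the atom

print(rho_electron)
# [[0.5 0. ]
#  [0.  0.5]] -- maximally mixed: no off-diagonal coherence survives
```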

Quantum Prediction vs Decoherence
Imagine an ensemble of sand grains or dust specks. They’re small but still macroscopic. The quantum formalism predicts these small objects would be enough to produce the macroscopic effects in the Young slit experiment.

And the quantum formalism also predicts that these objects will act macroscopically, supporting the role of decoherence in creating the illusion of a macroscopic reality.

Reduced State vs Localization
The matrix mathematics used to describe the reduced state suggests the reduced state can stand in for an infinite number of proper mixtures of pure quantum states, which threatens the idea of localization. Fortunately at least one of those proper mixtures is composed of quantum states that are localized.

Experimental Superposition vs Decoherence
In experiments by Brune et al. a “mesoscopic” object is put into a superposition of states. In the brief time before environmental interactions introduce decoherence, the object’s quantum properties can be observed.

The experiments therefore provide evidence both for decoherence and for the validity of quantum laws in objects larger than microscopic.

Quantum Universality vs Classical Laws
Brune’s experiments support quantum universality, but it would be good if we could also show how to derive the laws of classical physics from the rules of quantum prediction.

Classical Numbers vs Quantum Operators
In classical physics various properties of an object (such as a table’s length) are represented by numbers governed by classical mechanics. In quantum physics these properties are represented by (Heisenberg) operators and obey quantum equations.

Roland Omnès has proved that the observational predictions of both approaches coincide (in classical physics’ traditional domains).

Quantum Laws vs “Reifying by Thought”
Because classical physics and their predictive formulas are so reliable in the macroscopic realm we naturally infer that past objects and events have “caused” present ones, and present ones will “cause” future ones.

Counterfactuality vs Quantum Mechanics
Counterfactuality depends on locality, but Bell’s Theorem combined with the Aspect-type experiments shows that locality, and hence counterfactuality, is violated (relevant if we’re realists).

If we want to show classical and quantum predictions are the same in the macroscopic realm then we’re going to have to figure out how to “recover” the counterfactuality we imagine macroscopic reality possesses.

Is there action-at-a-distance with macroscopic darts? It turns out their orientation is a macroscopic variable that “washes away” microscopic variations.

In fact orientation is one of the “collective variables” that includes length, mass, and other classically measurable quantities. We’ve already noted that Omnès showed their values are consistent with quantum formalism.

Macroscopic Certainty vs Microscopic Uncertainty
Measuring a “complete set of compatible observables” will give you the state vector that “exists” after all the measurements were made, but that doesn’t help you figure out the state vector that “existed” before you made any measurements.

The idea of a measurement is usually that it measures something previously existing. By that standard you can’t figure out a state vector for sure no matter how many measurements you make.

By contrast, the mathematics behind a macroscopic ensemble’s “reduced state” will tell us which physical quantities may be measured without disturbing the system. We can therefore recover the “state” of a macroscopic member of that ensemble.

D’Espagnat says this ability helps shed light on our intuition that the properties of something must have been the same before we looked at it.

Realism vs Semirealism
D’Espagnat will discuss those who still cling to realism in the next chapter. However, he says there are “semirealist” approaches that manage to stay faithful to the quantum formalism.

A and B vs A or B
The “and–or problem” arises because when we measure a system of superposed states aA + bB we see it as either in state A or in state B, not in both states A and B at the same time. This shift from “and” to “or” is nowhere suggested in the equations. D’Espagnat suggests this is a conceptual not a mathematical issue.
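In symbols (a textbook restatement, not a quotation from the book), the formalism supplies a single superposed state together with the Born-rule probabilities, but no rule telling us when the “and” of superposition becomes the “or” of outcomes:

```latex
\[
|\psi\rangle = a\,|A\rangle + b\,|B\rangle,
\qquad
P(A) = |a|^2, \quad P(B) = |b|^2, \quad |a|^2 + |b|^2 = 1 .
\]
```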

One vs Many Realities
The mathematics of quantum formalism does not require that there be one and only one reality. Everett’s “relative state theory” interprets this formalism to suggest that the universe “branches off” when a superposed system is measured.

In a given branch only one of the superposed “states” is measured, but the overall multi-branch system is still represented by the same expression that combines superposition plus entanglement: aA + bB.
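The branching can be sketched in von Neumann’s measurement notation (my illustration, assuming an apparatus initially in a “ready” state):

```latex
\[
\bigl(a\,|A\rangle + b\,|B\rangle\bigr)\,|\text{ready}\rangle
\;\longrightarrow\;
a\,|A\rangle\,|\text{sees }A\rangle \;+\; b\,|B\rangle\,|\text{sees }B\rangle .
\]
```

Each branch records a single outcome, yet the total state retains the same superposed-and-entangled form aA + bB.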

Common Sense vs Formalism
Some physicists are attracted to Everett’s branching universes because it agrees with the quantum formalism. They believe that following the formalism first rather than common sense could bring in a revolution similar to relativity’s own repudiation of common sense.

Zurek vs Reality
Zurek showed that the “reduced state” of a macroscopic ensemble is stable under certain measurements. He goes further and defines “reality” as whatever is out there that remains stable under such measurements.

Quantum Universality vs Classical Foundations
Decoherence theory tips the balance away from thinking classical physics is somehow more foundational than quantum physics. Decoherence theory shows how the rules of classical physics may be derived from quantum rules.

Physics vs Chemistry, Biology, and Other Disciplines
Decoherence theory can’t let us predict the structure of other disciplines, though. The quantum formalism has to be simplified “by hand.” Quantum theory is still universal, but our human choices, our human ways of conceiving things, will crucially guide our perceptions.

The Antirealist’s Reality

1 March 2011

Ultimate reality

The Invisible Hand
Chapter seven of Bernard d’Espagnat’s On Physics and Philosophy is a kind of grab bag, entitled: “Antirealism and Physics; the Einstein-Podolsky-Rosen Problem; Methodological Operationalism.”

D’Espagnat’s points in this chapter seem to boil down to this:

  1. Physics (and science in general) is about predicting observations not describing some kind of reality
  2. Operationalism (which concentrates on methodology) increases the reliability of science as it counters critics who complain scientific theories (which they say should describe and explain reality) keep changing, and
  3. Although measurements (of “empirical” reality) depend on the observer, physical laws seem to be constrained in various ways (by the structure of an “ultimate” reality that’s scientifically indescribable).

This chapter feels a little scattered as d’Espagnat pre-emptively defends himself against a bevy of incoming realist missiles.

In the end, though, he’s an antirealist in terms of empirical reality, and a realist in his belief there’s an ultimate reality that’s (probably) beyond our direct knowledge but nonetheless influences the shape of our everyday reality.

Here’s some more detail…

Unconscious vs Conscious Antirealism
D’Espagnat says modern physicists (ever since Galileo) generally use an antirealist approach in their methods even if they don’t explicitly embrace antirealism as a philosophy.

Mind-independent Realism vs Pythagorean Ontology
Objectivist realism claims there’s a mind-independent reality whose contents resemble our observations.

A Pythagorean Ontology (capital “O”) claims there’s a mind-independent reality that is reachable through deeper mathematical truths.

Unlike either of these approaches, modern physics emphasizes instruments and measurements. It’s not very interested in saying what’s “really” out there in the “world,” whether physical or mathematical.

Meaningful Statements in Classical vs Quantum Physics
Physicists once applied such conditions intuitively; nowadays they can apply “meaningfulness conditions” to statements more formally.

Also, quantum systems are so peculiar that certain distinctions need to be made. Antirealist statements have to be expressed and tested in special ways.

Facts vs Contingent Statements
D’Espagnat is concerned here not with general “factual” statements such as “Protons bear an electric charge” but rather with statements about physical quantities. A value is assigned to the speed of a particular object, for instance.

True/False Statements vs Meaningless Statements
Based on Dummett’s approach a statement about an object’s speed would be meaningful only if we can measure (at least in principle) that physical quantity at some specified time and place.

Necessary vs Sufficient Grounds for Meaningfulness
D’Espagnat says Dummett’s criterion is necessary, but that doesn’t mean it’s sufficient. Other conditions may need to be fulfilled.

Imagining vs Measuring a Quantity
It’s possible that we can conceive of a physical quantity that has no meaning. However, if we can measure it then that quantity will definitely have meaning.

Classical vs Quantum Measurements
In classical physics it’s intuitive to think a measurement reflects the “true” values of an object, but in quantum systems the measurement of a particle (depending on your model) either creates or changes the values that you’re trying to measure.

In quantum physics we’re not simply “registering” some pre-existing value when we take a measurement. So the “truth value” criteria will need to include more than just measurability.

Disturbing vs Non-disturbing Measurements
In the spirit of antirealism D’Espagnat introduces a test: for a statement to have a truth value “it should be possible” (at least in theory) to measure the required physical quantity without disturbing the system.

The Einstein–Podolsky–Rosen trio claimed in 1935 that in some cases there are indirect ways to make non-disturbing measurements, admittedly only on correlated systems.

Correlated Darts vs Photons
If you throw a pair of correlated darts (see chapter three) they originally have some identical orientation. Measuring one dart’s value after they become separated will tell us the other dart’s value. As a bonus, the measurement won’t even change that other dart’s orientation.

If instead of darts you use correlated photons, and instead of measuring orientation you measure the polarization vector’s component at some angle, then you run into a problem.

Consistent vs Broken Correlations
If you measure one photon’s component at a certain angle then you can be sure if you measure the other photon’s component at the same angle you’ll get the same value (which will simply be “plus” or “minus”).

Because we are capable of making this measurement, our meaningfulness test says we can tell whether a statement about those values is true or false.

But quantum formalism says the system of these two photons can have just one value at a time. We can’t measure one photon’s component at a particular angle and then measure the other photon’s polarization component at a different angle.

Multiple Values vs Bell’s Inequalities
At least we can’t then claim the second photon has simultaneous values at two different angles. The first measurement destroys the original correlation.

Because Bell’s inequalities have been violated experimentally, we know that these multiple values don’t exist simultaneously.

And because our original meaningfulness test implied such a simultaneity we know that test is flawed.
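The violation can be made concrete with a short calculation. A hedged sketch (my illustration, not from the book): for entangled photons, quantum mechanics predicts the polarization correlation E(a, b) = cos 2(a − b), and for suitably chosen analyzer angles the CHSH combination of four such correlations exceeds the local-realist bound of 2.

```python
import math

def E(a, b):
    """Quantum polarization correlation for analyzer angles a, b (radians)."""
    return math.cos(2 * (a - b))

deg = math.pi / 180
a1, a2 = 0 * deg, 45 * deg        # one observer's two analyzer settings
b1, b2 = 22.5 * deg, 67.5 * deg   # the other observer's two settings

# CHSH combination: any local-realist model obeys |S| <= 2.
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(round(S, 3))  # 2.828, i.e. 2*sqrt(2) > 2
```

Aspect-type experiments measure just such combinations and find the quantum value, which is why the simultaneous “multiple values” assumed by local realism can’t all be there.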

Actual vs Possible Measurements
If we instead require that measurement results actually are available, rather than merely could be available, then we get a stricter test. By phrasing our requirements in the indicative rather than the conditional we end up with a sufficient condition, not just a necessary one.

Possible Measurements vs Observational Predictions
Dummett’s meaningfulness test is a very general antirealist approach. It doesn’t look at the factual data actually available in a microscopic situation. It just considers our ability to make measurements in principle.

D’Espagnat says the tighter requirements he’d impose go even further along the antirealist path, since they speak of observational predictions rather than measurements. This also takes us further down the path of instrumentalism.

Operationalism vs the Value of Science
D’Espagnat says if you understand operationalism properly then you’ll realize operationalism confirms the value of science and makes its statements more reliable.

Description vs Prediction
D’Espagnat says critics of science believe scientific knowledge is easily influenced by social and cultural factors, and frequently throws out old theories in favor of very different new ones.

Superficially this makes sense. Einstein’s curved space-time replaced Newton’s gravitational force. They’re radically different approaches.

But science isn’t trying to describe reality. It’s trying to make predictions about observations. Newton’s approach makes good predictions in its own domain, but in other domains Einstein’s predictions are the only ones that work out.

Sometimes the predictions and domains can be identical. Fresnel’s and Maxwell’s theories of light make the same predictions. D’Espagnat says the value of Fresnel’s theory was independent of whether the ether was really out there.

If you drop the naïve realism and its concern for description, then science as a method for synthesizing and predicting experience is not so inconsistent.

Now we can see steady progress as science gets better and better in its power of prediction.

Scientific Knowledge vs Practicality
D’Espagnat says science is mainly knowledge. Even if science is concerned with prediction rather than description, don’t confuse science with the various practical uses it’s put to (such as technology).

Descriptive vs Instrumentalist Knowledge
Science brings together an account of human experience that can be communicated: “If we do this, then we observe that.” Just because it’s not trying to describe “reality” doesn’t mean it’s not imparting some kind of knowledge.

Instrumentalist vs Theoretical Knowledge
These methods of making observational predictions are at the core of science. Coming up with a theory to define certain terms and describe certain entities can be useful, but that’s something added onto this predictive foundation.

Operationalism vs Instrumentalism
D’Espagnat doesn’t try to distinguish the two terms. He says the most important aspect of any theory that conforms to this approach is that it’s an instrument for making observational predictions. He says mathematical physics is a prime example.

Open Realism vs Endless Possibilities
In chapter five D’Espagnat talked of his preferred approach of “open realism.” Certainly our view of “reality” (specifically its physical laws) depends on us, including our ability to make observations. But there seem to be “constraints” on what kinds of theories are valid.

Describing vs Acknowledging Constraints
This “something else” that lies beyond our observations but somehow constrains them may not be directly accessible by us, but D’Espagnat says our inability to describe the constraints does not mean they don’t exist.

Ultimate vs Empirical Reality
An elusive, indescribable “ultimate reality” may still shape the physical laws that we describe. In turn, the laws we infer are shaped by the observations that contribute to our sense of “empirical reality.”

Explanations vs Theories
D’Espagnat quotes one critic of operationalism, Mario Bunge, who says that the main role of a theory is to provide an explanation. Therefore a theory must provide at least a “rough sketch” of reality as it is.

D’Espagnat replies that the explanation would actually lie in the ultimate reality that constrains our physical laws, but this ultimate reality is not scientifically describable. Therefore what Bunge desires is impossible.

Unless we grant that “miracles” happen all the time there appear to be constraints on our physical laws. But the ultimate reality producing these constraints can’t be scientifically described because of the problems with objectivist realism noted before.

Physics vs Physical Objects
D’Espagnat says that Bunge considers meaningless any value in physics attached to something that is not physical. If the value doesn’t refer to something “real” then it’s pointless.

D’Espagnat points out that many physical laws refer to values that are not attached to existing physical objects. Probability, for instance, is a concept that refers either to imaginary objects or to a thought not itself subject to physics.

Particles vs Waves
Also, wave functions are useful, in fact essential, for quantum physics. So are wave functions real? If so, then particles would have to be real too. If waves and particles exist simultaneously then we’d have to accept the de Broglie–Bohm model with all its problems (see chapter nine).

Also, in that model a ground-state electron in a hydrogen atom would seem to have zero momentum, because it’s not changing state (the quantum potential is balanced by the Coulomb force). But the Compton effect shows its momentum is non-zero. We have two different versions of momentum. If they were both “real” we’d get into pointless difficulties, says d’Espagnat.

Other possibilities: waves change into particles (but the collapse of the wave function has lots of problems attached to it) or only waves exist (but then nonseparability and measurements cause problems).

So D’Espagnat says Bunge’s objections seem pretty “dogmatic.”

Circular vs Practical Definitions
Another objection notes (correctly, d’Espagnat acknowledges) that operationalists place a lot of emphasis on precise definitions, but Bunge says some concepts will remain undefined (just like a dictionary uses some undefined words to define other words).

D’Espagnat replies that operationalism is a methodology, not an “a priori” philosophical system. We want efficiency. Dictionaries are useful despite their undefined terms. Some concepts we just seem to naturally know (whether they’re born with us or not).

These undefined concepts (though neither certain nor absolute) let us operate a measuring instrument, for instance, which then lets us define other concepts.

Sometimes concepts considered “primary” in the past get defined explicitly, such as Einstein’s replacement of “absolute time” with a time that’s partly relative to the observer.

Measurement vs Change
The act of measurement seems to change the quantum system. If, as Bunge’s approach would suggest, this change is “real” then we’d have the difficult problem of explaining this change.

But the quantum approach is “weakly objective” so it refers only to measurement. In the end theoretical entities are useful for helping to make predictions in modern physics. Just don’t regard them as self-contained and “real.”

Einsteinian Hope vs Descriptive Failure
Einstein and those of a similar optimistic bent believed reality would be increasingly describable. This view does not seem consistent with the reality that the quantum framework paints.