Transcript of the Philosophical Implications of Quantum Mechanics
22/6/04
This week I want to look at the main interpretations of Quantum
Mechanics and the philosophical issues raised by them. This will mainly
consist of conceptual and metaphysical problems, but due to popular
demand I’ll also say something about the ethical implications of the
interpretations (if there are any). But before I do this it will be
necessary for me to give a brief overview of the emergence of QM as
the dominant paradigm within Physics.
Historically QM emerges as a response to the suggestion by Max Planck
that energy came in packets or quanta, this being the only logical
conclusion possible following the observed emission of heat from black
bodies. Classically a black body was an idealised thermodynamic object
that was supposed to emit energy as continuous electromagnetic
radiation, but Planck found that in reality it only emitted energy
discontinuously, in certain small, quantifiably distinct emissions or
quanta. From this he concluded that energy was released in the form of
packets, or particles, in other words as photons (the smallest packet being
a single photon of variable energy). This insight led to a Nobel Prize for
Planck and to Niels Bohr's development of the modern theory of the atom,
complete with its corresponding electron energy levels. This formal
quantisation involved something called Planck's constant, a number that
proportionately related the energy and radiant frequencies of all possible
energy quanta.
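
For concreteness, the relation Planck found can be written in standard textbook form as

\[
E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s}
\]

where E is the energy of a single quantum and ν the frequency of the radiation, so that an emission of n quanta carries energy nhν and energy can only change in whole-number steps of hν.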
From this observation it was possible to conclude that, as Newton had
insisted, light itself consisted of particles, an identity that had always
been controversial due to its parallel wave-like properties, but now these
so called ‘wave descriptions’ could be reduced to particle descriptions via
Planck’s equation. De Broglie declared that the wave description was in
fact an illusion, produced by the photon’s rapid wave-like trajectory
under the restrictions of Planck’s equation. Similarly Schrödinger
calculated that the possible orbits of an electron around a nucleus (its
energy levels) could be understood in terms of the wave-like path of the
electron in its orbit (simply put, only orbits fitting a whole number of
waves were allowable, the same restriction applying to the path of a
photon between source and target). The particle nature of light was
further supported by Einstein’s claim that photoelectric effects (the
production of electricity from light) could only be explained by photons
of the appropriate energy punching electrons out of their orbits, all in
accord with Planck’s theory and Bohr’s model atom. For a while
everything seemed straightforward. Unfortunately light wasn't playing
ball, and it continued to demonstrate the properties of a wave, with some
of these (such as interference and tunnelling) only explicable in terms of a
wave theory rather than as particles moving on a wave-like trajectory.
Physics was in chaos. It seemed that all the evidence pointed equally to
light being both a wave and a particle. For a while the
dual existence of a guiding pilot wave and a guided particle was seriously
posited, even though this infringed the convention of parsimony (and
arguably any kind of logic). But soon this explanation was to be superseded.
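
The whole-number restriction mentioned a moment ago can be stated compactly using de Broglie's relation between a particle's momentum p and its associated wavelength: an orbit of radius r is allowable only when a whole number of wavelengths fits around it,

\[
\lambda = \frac{h}{p}, \qquad n\lambda = 2\pi r \quad (n = 1, 2, 3, \dots)
\]

which reproduces Bohr's discrete electron energy levels.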
The next puzzling discovery was Heisenberg’s Uncertainty Principle.
This was derived from the fact that while it was long known that certain
properties of subatomic particles could not always be measured with
accuracy, it came as a surprise that certain paired, and arguably mutually
exclusive, properties, such as momentum (a dynamic measure) and
position (a static measure), or paired up and down particle spin, were
closely linked in their uncertainty, and in a way oddly proportionate
through Planck's constant. This meant the more we knew about one the
less we knew about the other. But what this means is itself uncertain.
Initially it was thought that it was a physical problem reflecting the
fact that photons of various energies bouncing off the particle under
measurement affected its position (the Planck correlate being the
indication of this). However, despite still being given as an explanation
in some textbooks, this was soon found not to be the case. While this
explanation may intuitively hold for position and momentum, it cannot
hold for other pairs such as time and energy or the polarization of light.
As no other inference seemed possible the only conclusion was that the
uncertainty of the properties was not an epistemic or practical problem
but was actually a metaphysical or ontological uncertainty, that is the
properties didn’t definitely exist until measured, and this measurement
made the other property in the pair even more ontologically vague. To
many this seemed absurd, but when experimental science achieved the
technology to examine the quantum world more closely, and novel
experiments such as the triple polarizer were set up, it was found to be
almost certainly the case, though, as we shall shortly see, die-hard realists
continue to deny this possibility despite all the apparent evidence for it.
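
In its modern textbook form the Uncertainty Principle for the position-momentum pair reads

\[
\Delta x \,\Delta p \ge \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}
\]

with analogous relations for the other pairs, such as energy and time: the product of the two uncertainties can never fall below a fixed fraction of Planck's constant, which is the odd proportionality just mentioned.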
The classification of such paired properties included the position and
momentum of a particle; the horizontal and vertical polarisation of light;
the time and energy state of a single particle; and the up or down
spin of paired particles. Niels Bohr called these pairs
kinematic-dynamic complementarities, declaring that none of these
properties actually existed in any defined way until measured.
They were not properties of the object observed, he insisted, but rather
properties of the relationship between the measuring device and what was
being measured (whatever that meant). Bohr, and others, further claimed
that it followed from this, and the theoretical problem with the ontology
of light, that another mutually exclusive complementary relationship
existed, one between the wave and the particle nature of light. This was
thought to be closely related to the kinematic-dynamic complementarities,
in that momentum was a wave property and position a particle property,
making them both mutually exclusive. Bohr also argued that light
was not only both a wave and a particle phenomenon but that it could not
be both simultaneously, as that would result in a direct logical
contradiction, something demonstrated by the fact that momentum and
position could not be measured simultaneously in the same experiment. It
followed from this that all quanta, including electrons, also had a dual
nature. He thus explained the Uncertainty Principle itself in terms of the
dual nature of quantum objects. The actual situation was later shown to
be more complex however, for while the idea of a quantum not having a
definite state before measurement was proven by the triple polarizer
experiment, another recent experiment has allegedly demonstrated a
simultaneous wave and particle nature for a photon at a single stage of an
experiment. But I shall return to that and the problems it raises later.
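
A worked example shows why the triple polarizer result is so striking. By Malus's law a polarizer passes a fraction cos²θ of polarized light, where θ is the angle between the light's polarization and the polarizer's axis. Two crossed polarizers at 0° and 90° therefore block everything, yet inserting a third at 45° between them lets light through:

\[
I_{0^\circ \to 90^\circ} = I \cos^2 90^\circ = 0, \qquad
I_{0^\circ \to 45^\circ \to 90^\circ} = I \cos^2 45^\circ \cos^2 45^\circ = \frac{I}{4}
\]

Each measurement resets the photon's polarization rather than revealing a value that existed beforehand.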
Initially all this was seen by most scientists and philosophers as crazy
speculation within a fringe branch of physics, and probably wrong,
particularly as there was no real deterministic theory, with a solid
mathematical base, that could describe, and at least partially predict, all of
this in a satisfactory fashion. This was to change with the development of
Quantum Theory and its mathematical formulation. The maths itself is
somewhat complex, and I’m not a mathematician, but I’ll attempt to give
an outline of what is involved here.
The key to this was the discovery that associated with Heisenberg's
Uncertainty Principle was another factor: non-commutability.
Non-commutability means that a x b is not equal to b x a (contrary to
normal mathematics); specifically, within Heisenberg's equation
momentum multiplied by position gives a different result to position
multiplied by momentum. Why this is so remains a mystery, but it gave
him a clue in finding the mathematical basis for quantum mechanics. The
only form of mathematics Heisenberg knew to be non-commutative was
matrix mathematics, which involves the manipulation of arrays of numbers.
By representing the various states of a quantum system in terms of tables
or arrays, as multidimensional vector space matrices, Heisenberg found he
could model all the phenomena seen in quantum experiments and formulae,
including, most importantly, their non-commutability. There was
now a mathematical theory in place that could model and partially predict
quantum events, which Heisenberg called Matrix Mechanics.
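
As a minimal sketch of non-commutability (a toy example, not Heisenberg's actual calculation), two small arrays multiplied in different orders give different results. In Python:

```python
import numpy as np

# Two 2x2 matrices (in fact the Pauli spin matrices sigma_x and sigma_z),
# standing in for the arrays used to represent quantum observables.
A = np.array([[0, 1],
              [1, 0]])
B = np.array([[1, 0],
              [0, -1]])

print(A @ B)                         # [[ 0 -1]
                                     #  [ 1  0]]
print(B @ A)                         # [[ 0  1]
                                     #  [-1  0]]
print(np.array_equal(A @ B, B @ A))  # False: A x B != B x A
```

In Heisenberg's mechanics the position and momentum matrices fail to commute in exactly this way, satisfying xp − px = iħ.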
The problem was that in incorporating the Uncertainty Principle it
predicted several possible outcomes for an experiment in terms
of probability, which while in accord with Heisenberg’s discovery in
terms of complementary pairs was hard to square with reality as the
equation was supposed to be a description of what actually happened in
the world. Similarly Schrödinger developed his wave theory to describe
the totality of quantum phenomena and discovered something similar.
He found that while a definite state of a quantum system could be described in
terms of a wave function, what actually matched experimental results was
a superposition of two waves overlaid on each other, with only one of the
two waves representing the actual outcome according to certain
probability weightings. In other words two quantum states were
superimposed on each other until a measurement was made, meaning for
instance a given particle had both mutually incompatible up AND down
spin before it was measured to have down spin (an event with a 50%
probability), in which case it only had down spin. This was no surprise to
Niels Bohr, who had expected this kind of result, though he insisted it was
meaningless to talk of a system actually being in both states, and so
Schrödinger began to see his waves not as the physical energy or
matter waves postulated by de Broglie but as probability waves, each
mapping out the possible states of a quantum system. Soon after, the
mathematician Hilbert was able to demonstrate that Matrix Mechanics
and Wave Mechanics were in fact mathematically identical. Later von
Neumann produced an even more influential mathematical model for
quantum systems based on Operator algebra. An Operator (mapping a
vector or property) is something that causes a change in a mathematical
Function (here the Quantum State), and Operators, like matrices, are
non-commutative and so suitable as a mapping tool. This system was
similar to Matrix Mechanics, in that it mapped states in vector space, and
also mathematically equivalent to it. It described quantum systems in
terms of Operators with sets of potential Eigenvalues, each corresponding
to a possible Eigenstate, equivalent to the superposition of wave functions;
when a definite Eigenstate was produced from this set it was said to have
been projected out of the equation by an Operator. This model was
easier to handle and represented quantum states more naturally as dynamic
states evolving over time, and so became the dominant model of QM.
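
In the later Dirac notation (not used in the lecture, but standard) the spin example above can be sketched as

\[
|\psi\rangle = \alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
\]

where |α|² and |β|² are the probability weightings (both ½ in the 50/50 case), and an Operator \(\hat{A}\) satisfies \(\hat{A}|a\rangle = a|a\rangle\), the Eigenstate \(|a\rangle\) with Eigenvalue \(a\) being what measurement projects out.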
This form of mathematical modelling has proven to be the only scientific
model to exactly reflect the behaviour of its corresponding physical
reality (other classical models, such as Newton’s equations of motion,
being only approximate reflections of the reality they attempt to map). In
addition the Quantum Mechanical model has been used in a wide variety
of technological applications and proved totally reliable. Therefore, while
some claim not all of its predictions have been experimentally
demonstrated, the fact that it perfectly describes phenomena that have
been observed is taken as evidence for it being the most successful
scientific theory to date. This is unfortunate because it is totally
counterintuitive, describes bizarre illogical phenomena and cannot be
visualised. What was left to complete QM was its interpretation, and this
is where the problems begin.
Not everyone agreed that interpretation was necessary, however. Richard
Feynman counselled that QM was totally incomprehensible to everyday
reason and so physicists should give up trying to interpret and stick to
calculating; in other words, be scientists rather than philosophers. This
very incomprehensibility can be taken as further proof of a Kantian take
on science, that is that our ordinary theories and models are conceptual
tools rooted in our own cognitive abilities and mental categories rather
than mirrors of reality, and reality itself while corresponding in some way
to these models was not necessarily knowable to the human mind.
Mathematics being the only way to reach it. Despite this Feynman did
come up with a very interesting insight into the nature of the Quantum
Reality, though he described this in metaphorical terms rather than
theoretical ones. Basically he agreed with Bohr that quantum states were
in some sense unreal until measured, therefore when a particle was
confronted with two slits in a screen it actually went through both
simultaneously (Bohr differed slightly on this point arguing it went
through both and neither and each one separately, as such talk was
meaningless). It followed from this, Feynman claimed, that if a screen had
ten slits in it then the particle would follow all ten paths simultaneously
and if there was an infinite number of slits there would be an infinite
number of paths taken. In reality in a situation with no slits, a normal
distance between two points, there were effectively an infinite number of
paths, therefore a particle took every possible path between two points.
However most of these paths mutually cancelled out, the only one that
didn't being the most direct path. But oddly the particle on this direct path
could interact with itself on any of the other paths! What this meant was
unclear; Feynman simply said the particle 'smelled out' each possible path,
taking each before settling on whatever path was the most direct, so while
only one path was empirically measurable the other paths were equally
real.
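
Feynman's picture was later formalised as the path integral, in which the amplitude to travel from point a to point b sums a contribution from every possible path, each weighted by a phase,

\[
K(b,a) = \sum_{\text{all paths } x(t)} e^{\,i S[x(t)]/\hbar}
\]

where S is the classical action along the path. Paths far from the direct one have rapidly varying phases and cancel each other, while the direct, stationary-action path survives, which is the 'mutual cancellation' just described.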
It was Niels Bohr who created the dominant interpretation of QM
however, the one which became known as the Copenhagen Interpretation.
In its simplest form this states that the wave function with its
superpositions is the most useful model and should be seen as a
probability wave mapping out all the possible states and trajectories of a
particle or wave. Bohr developed his notion of Complementarity to flesh
this out further, arguing that it was impossible to measure complementary
concept pairs simultaneously (be they position and momentum, or wave
and particle) because these were logically incompatible concepts that
were both equally necessary to describe the reality of a quantum object,
even though neither actually mirrored it. They were, he said, technically
jointly necessary for a complete description but logically incompatible, or,
in short, mutually exclusive but jointly complete.
Some argue that this is a completely contradictory notion, but in fact
contradiction was the very thing that Bohr was trying to avoid in his
descriptions. He sought two self-contained descriptions or acts of
measurement that gave totally incompatible results but that were both
equally true. Contradiction didn’t arise because these two were never
mentioned in the same description. What Bohr seems to be getting at is a
Kantian point that these descriptions are mental constructions that
imperfectly match reality and that the notion of contradiction is not
something that exists in the real world but rather is a feature of our
consistent representation of it.
So while he saw non-contradiction, and so meaning, as an important
feature of our coherent representation of the world, he did not extend this
to reality itself. Naturally this was anathema to Realists, who believed
non-contradiction to be a feature of the world itself; Bohr in turn may
have regarded this as Platonistic. The issue is not clear-cut and perhaps is
something we can discuss later.
A side effect of this kind of thinking is the possibility of incompatible
theories both being true if they complement each other and
are supported by evidence. Thus Bohr was not only able to support Ernst
Mach’s views on a neutral monism beyond incompatible foundationalist
theories of mind and matter (that is idealism and materialism), but to also
support the theory of vitalism alongside more conventional theories of
mechanistic biology. A deep problem for Bohr today is the fact that
experiments have now detected light behaving as both a wave and a
particle simultaneously, an apparent descriptive contradiction in the
making, though the interpretation of this experiment is itself open to
question.
An even more controversial aspect of Bohr’s interpretation was of the
nature of measurement, which he insisted defined the actual parameters
of a system non-existent before the measurement. This was formulated
most strongly by Von Neumann who invented the concept of ‘wave
collapse’ or the ‘projection postulate’. What Von Neumann realised was
that the wave equation with its superposition of states described all the
possible outcomes as equally real, but experience told us that after
measurement only one state was actually real. He therefore argued that
the wave function didn't hold after measurement: it in some way
'collapsed', losing information and projecting the actual result into the
world. Physics was therefore incomplete; it needed measurement to make
it so.
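
Von Neumann's projection postulate can be sketched in a single line: if measurement of an observable yields the value a_k, the superposition is replaced by the corresponding eigenstate,

\[
|\psi\rangle = \sum_i c_i\,|a_i\rangle
\;\longrightarrow\;
|a_k\rangle \quad \text{with probability } |c_k|^2
\]

a discontinuous, information-losing jump quite unlike the smooth evolution that the wave equation otherwise describes.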
For this reason Bohr claimed that a complete physical description was
only possible when a measurement was made, therefore science was not
describing a set of objects with inherent properties but a set of relations
between measured objects and measuring systems and the properties
which emerged were properties of those relations not the objects under
consideration. Light was neither a wave nor a particle; these were concepts
produced by the way we detected objects and built up concepts from this
data; the real objectivity of light was something other. Thus Bohr
proposes a kind of inter-reduction in science in which bridging laws, in
the form of translation equations, can convert descriptions of light for
instance from wave theory talk to particle theory talk, without claiming
that either is fundamental or ontologically real, and admitting that under
certain circumstances translation may be impossible.
Some interpret Bohr as a mild idealist who was claiming that the act of
measurement gave reality to light, that the description was the reality;
however it seems to me that he was describing a phenomenal reality
rather than a noumenal one. Unfortunately Bohr lacked the philosophical
training to clearly formulate what metaphysical point he was really
making. But either way his views were anathema to Realists. More
problematic than this ideological prejudice is that Bohr inherits all the
problems associated with Kant’s philosophy, particularly the connection
between phenomena and noumena.
Even more heretical to realists was the spin on Bohr's ideas given to
them by the Princeton School around John Wheeler. Wheeler
correctly maintained that the term 'measurement' was far too vague to be
included in a scientific theory, and needed sharpening up.
He argued that measurement was inseparable from observation and that it
was therefore human consciousness that caused the probability wave to
collapse into a single reality. This meant that physics was incomplete
without consciousness of some kind.
Naturally this was unacceptable to most Realists as they argued that this
was absurd given that consciousness had evolved after the physical world
came into being, a physical reality that needed to be complete in order to
produce consciousness. Wheeler gets round this by invoking backwards
causation and claiming we create our own past through observation in the
present. This causes
certain problems for our notion of time but is not quite as daft as it
sounds. Another factor that supports it is the conclusion of recent
philosophers of mind that consciousness is a primitive essential feature of
the universe that is not reducible. How this would fit exactly into
Wheeler's interpretation of QM is uncertain however. Many people reject
Wheeler's interpretation as they think it leads to Idealism and from there
to the God hypothesis. But this is far from clear: for one thing the Observer
Effect does not have to create reality, it merely collapses a more complex
reality into a simpler one; and even if Idealism did underlie quantum
ontology, this would not necessarily lead to an external deity (who, after
all, if quantum reality is the norm, would have to be seriously disturbed by
human standards) but rather to the near deification of human consciousness.
Though this in itself is as unacceptable to the religious as it is to the
materialistic.
The Measurement Problem as it is known is not only solvable through
calling on consciousness however, and most physicists would reject this
interpretation. The real question is what it is about measurement that
causes wave collapse. It might be the observation component, but more
physicists believe it is something prior to this, found in the act of
measurement itself. But given that we have already rejected the
disturbance theory of measurement with regard to the Uncertainty
Principle, what could this be?
One theory suggests a disturbance of a more subtle kind, such as
decoherence theory. This is a complex notion but basically it says that
reality is in fact quantum mechanical and Schrödinger’s cat really is dead
and alive in some sense all the time. However when
we limit the information available about a quantum system it seems to
behave like a classical system. Thus because we cannot know the exact
quantum state of a cat (it being too complex) it appears to us in a more
simple way as a classical object that is either dead or alive. However a
simple quantum system like an electron pair is knowable in its true
quantum state of superposition, but when we make a measurement we
introduce it to a complex system of which it becomes a part and thus
reverts to a classical system. Thus the measurement appears to collapse
the probability wave. But the problem here is obvious: the wave isn't
really collapsed, so the world is still really in a superposition and the
classical world an illusion. This seems absurd, but the only way out of
this problem seems to be to claim our perception of reality changes it,
which leads us back to the problematic role of consciousness.
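
A rough sketch of the decoherence idea in symbols: entanglement with a complex environment suppresses the interference terms of a superposition, leaving what looks like a classical either/or mixture,

\[
\rho =
\begin{pmatrix} |\alpha|^2 & \alpha\beta^* \\ \alpha^*\beta & |\beta|^2 \end{pmatrix}
\;\longrightarrow\;
\begin{pmatrix} |\alpha|^2 & 0 \\ 0 & |\beta|^2 \end{pmatrix}
\]

The off-diagonal terms decay towards zero, but note that nothing in this process ever selects which outcome actually occurs, which is exactly the problem just raised.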
We can’t decide what aspect of measurement is responsible for the wave
collapse however, due to the limitations of QM its actually impossible to
find an observable difference that would distinguish between an observer
effect and a decoherent measurement effect
and so the difference is academic and we are left in a philosophical dead
end here. If some evidence of the role of consciousness in the world was
demonstratable (such as the alleged effect of consensus belief on the
crystallisation of glycerine documented in chemistry journals at the turn
of the century) we might be able to conclude in favour of the observer
effect, but another problem stands in the way of this, the problem of
scale.
It is commonly accepted that quantum uncertainty only operates at the
microscopic level and not the macroscopic level. But why this should be is
actually a mystery. The orthodox explanation was once the statistical law
of scale: while weird non-classical phenomena may happen at the
quantum level, such as the brief appearance of a photon (or even an
electron) from nowhere, due to the uncertainty relation between energy
and time, these were such rare and isolated occurrences that they were
swamped by the more common classical behaviour observable at the macro
level. However this is not now widely accepted; one reason for this is
disagreement over the interpretation of probability involved, or the nature
of the quantum state. It is quite possible to define the quantum state of an
insect given enough computing power, and it is not beyond the realms of
probability for that insect to exist in two places at once, since it has a
quantum state no different to that of a microsystem (it is just very
unlikely given classical statistics). This seems too leaky for some, but
there seems no easy way to limit the quantum indeterminacy of objects to
the micro level, and what’s more even if there were, given complexity
theory, it is quite possible to conceive of a situation (not unlike the
butterfly effect) in which a quantum fluctuation in a complex
macrosystem affects the entire structure (the loophole Roger Penrose uses
to import Quantum phenomena into the classical world). It is here that
decoherence theory becomes useful again. Many now appeal to this
theory to explain the cut off point between the quantum level and the
macro level, simply based on the amount of information needed to define
a quantum state. As we have seen we simply don’t have enough
information available to see a macro object as anything other than a
classical object, so for us that is exactly what it is. But here we return to
the same problem as before: either this means our perception changes the
nature of the reality of these objects or our perception is an illusion and
the objects really are in superposition all the time.
This last point leads many physicists and philosophers to reject the idea
of the wave function collapse out of hand, and to claim the function
remains true of the world at all times. The strong interpretation of this is
that the uncertain properties really do have a definite state, it is just that
we do not know it. A compelling argument for this is that science cannot
contain something that has not been observed under experimental
conditions, and while the wave function has invariably correlated with
observed phenomena, no one has ever seen a wave collapse, or even
understands what it might mean.
The simplest way of denying the wave collapse is through the postulation
of the ontological reality of the quantum world, the most conservative
form of which is Karl Popper's Propensity Theory. Popper
argues that the quantum world is relatively normal in that there really are
definite electrons with real properties which only pass through one slit in
a screen etc. The only oddity is the existence of a propensity field which
influences the behaviour of the particles, a phenomenon we see as
probability. Thus in an interference experiment each photon travels
through a different slit to its predecessor in a way that builds up an
interference pattern,
all under the influence of probability. Most philosophers regarded this
idea as crazy, as it incorporates a new force into physics, the
force of probability, something that seems an unnecessary addition to
say the least. But taking the theory at face value we can say it allows us to
make sense of QM by postulating an underlying reality not described in
the wave function and regarding the quantum equations as mere
probability-calculating tools rather than reflections of reality. Thus a
weak form of Copenhagenism can be maintained in that the wave
function accurately describes an abstract world of possibility. This kind
of theory is thus called a ‘hidden variable’ theory. Another kind of hidden
variable theory is that of David Bohm. Bohm’s ideas are highly complex
and not a little vague, but what he suggests is essentially a reworking of
de Broglie’s pilot wave theory into a field theory, for Bohm particles are
real entities in an ‘explicate order’, but which emerge from and are
guided by a universal field which constitutes an ‘implicate order’. Some
properties are explicate in that they belong to distinct particles, other
properties are implicate in that they belong to the universal field and not
to any isolated particle. Through this shared field every event is thus
implicated in every other event and a Realist holism emerges. Details of
what this precisely means are sketchy however.
All these hidden variable theories, however, have been disproved by the
violation of Bell's inequalities, as predicted by the famous theorem of
John Bell. This theorem was literally a test for hidden variables; we do
not need to go into the details here other than to say Bell set out a series
of conditions, technically referred to as inequalities, that applied if an
underlying reality in accord with normal concepts of probability also
applied. In other words if a real underlying reality existed and the wave
function really was a tool for calculating the probabilities of events within
the context of this reality, then the Bell inequalities applied and could not
be violated. A few philosophers argue against this view, but the vast
majority accept it. Unfortunately for Realists, Bell's inequalities
were violated in the famous Aspect experiment. This demonstrated the
apparent non-locality of QM, in that an entangled pair of particles were
shown to influence each other instantaneously regardless of their
distance. If one of the pair was measured to have
an up spin then, through quantum mechanical laws, the other particle
instantly acquired a down spin even if it was a light year or more away.
This for one violated Relativity Theory in demonstrating instantaneous
effect, or non-local causation, something impossible if an ordinary
underlying physical reality underlay quantum phenomena. Thus it was
concluded that there was no underlying physical reality in this sense.
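
In the CHSH form of Bell's theorem the test is a single number. For measurement settings a, a′ and b, b′ on the two particles, with E the correlation between outcomes, any local hidden variable account must satisfy

\[
|S| = |E(a,b) - E(a,b') + E(a',b) + E(a',b')| \le 2
\]

whereas quantum mechanics predicts, and the Aspect experiment measured, values up to \(2\sqrt{2} \approx 2.83\).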
This violation sank most hidden variable theories, but Bohm claimed
immunity through his claim that while position was a real property,
determining a particle as real, other properties such as spin were field
properties and not real in the everyday explicate sense. Thus
Relativity and Bell's inequalities were not violated. This is interesting, but
Bohm's theory is not explicit enough to elaborate on this point and so
smacks, for most people, of having one's cake and eating it.
The Realists did not give up here however. They used the results of the
Aspect experiment to add to the list of evidence of an incompatibility
between State Reduction theories (wave function collapse) and Relativity
theory, as the former happily accepted non-locality as a consequence of
their understanding of quantum mechanics, namely that state reduction
operated as if space-time did not exist. Rejecting both hidden variable
theories and state reduction
theories the Realists thus developed the most conservative approach to
QM yet devised, Relative State theory.
Relative State theory was first set out by Everett in his famous paper on
the topic. Unfortunately his definition of it was not precise enough to
determine what it actually entailed, and there are currently five different
interpretations of it: two so-called many-worlds interpretations, a
many-histories interpretation, a many-minds interpretation and a modal
interpretation.
The essense of Everett’s thesis is that the wave function accurately
describes the state of the world and never collapses, instead what happens
is that we only experience one of the possibilities described. Supporters
say this is the only interpretation that followers directly from experience
and the quantum maths itself and imports no other concepts, therefore by
Ockham’s razor must be true. However what this interpretation consists
of is problematic. The classic answer is attributed to de Witt, who simply
says it means that the universe splits into multiple worlds when a
measurement is made and the observer splits with it, so that we are
just one observer in one branch of this hypothetical multiverse measuring
just one possibility indicated in the wave function. This has inspired
many good science fiction stories, but what it entails is too much for most
philosophers to take seriously. To start with, and ignoring the obvious
extravagances of the thesis, what does it mean to say the universe splits?
No one has ever observed the universe splitting, so why should we
include it as a postulate if we are to exclude other unobserved postulates
like wave collapse?
Furthermore it totally disrupts our concept of time; in fact time does not
exist under it, either as a continuous dimension in our universe or as a
flow, all that remains being the continual division and branching of the
universe. But most serious of all, it infringes the very basis of science,
which is supposed to be about producing generalisations that correspond
to the observable universe as testable in a laboratory. In contrast the
many-worlders are claiming that the wave function applies for the most
part to other universes not in contact with ours, which cannot be tested in
the laboratory.
To counter these claims David Deutsch came up with a version of
many-worlds state relativism that allowed interaction between worlds,
arguing that this was demonstrable in quantum physics. He thus argued
that many-worlds theory could be experimentally distinguished from state
reduction theories through his famous computer-brain idea. What
happens here is that a computer-brain observes the quantum world and
records a single measurement; if state reduction holds this process is
irreversible, as one option now no longer exists, but if the universe has
split both options are available, in that the two universes can be made to
interact through interference so that the quantum state is rescrambled and
the branches recombine. The computer-brain thus returns to its prior state
but has the record of an earlier single state in its memory. The
many-worlds theory is thus confirmed. This is clever but is unfortunately
totally unsound philosophically. The very idea of two branches of the
universe, two essentially separate universes by definition, interacting is a
contradiction in terms. Furthermore, in what dimension are these
universes supposed to branch, given that only three dimensions of space
seem possible under existing physics?
To solve this problem many philosopher-physicists talk of a
many-histories hypothesis. In this version of state relativism there is not a
branching of worlds but a focusing on one history. This is similar to
Feynman’s ‘sum of trajectories’ hypothesis, in which a particle takes all
possible paths but only one becomes actual, but unfortunately is equally
unclear. The argument goes that all the possibilities in the wave function
represent equally real histories, but that we only experience one of these.
Why we experience only one varies between versions of this idea: some
say that only logically consistent histories are stable enough not to
mutually cancel; others that we only have the capacity to perceive one of
these histories. The latter
version is the most elucidated in that it calls on decoherence theory again
to show that it is our lack of information that causes us to perceive the
world classically and be ignorant of its quantum nature.
But this only recalls all the problems with decoherence theory: either it is
saying that the classical world is an illusion, which seems too weak, or
that our perceptions affect reality, which is too strong for most and
undermines the Realist agenda. Those that think the classical world is an
illusion have developed the many-minds interpretation. Here it is not the
universe that splits but the mind of the observer. Thus our local conscious
mind is only away of one possibility, where as all the others still exist but
are only accessible to unconscious parts of our mind to which we do not
have access. The problem with this is that it seems to work by shifting all
the philosophical problems away from quantum physics and into the still
unresolved theory of consciousness. A kind of sweeping under the carpet.
What a theory of consciousness would be like if the many-minds
interpretation held is incomprehensible to many.
For these reasons it seems all the candidates for relative state theory are
flawed. However one possibility remains: the modal interpretation of the
philosopher Bas van Fraassen. Here it is suggested that the possibilities
described by the wave function are modal states rather than real ones, in
that they describe possible worlds. What this means is that probability is
simply that which governs which world becomes the actual world. This
sounds plausible, though invokes worries about a return to Propensity
theory.
More seriously it is contrary to experimental evidence which
demonstrates the various possibilities in the wave function can interact
with each other. What this would mean is that the other possible worlds
are in some way real. If we want to avoid a collapse of this position into a
many-worlds theory, we thus have to assume two levels of reality: a
potential reality, a kind of ghost world, and an actual reality. The actual
reality is
the one we experience on measurement. The problem with this approach
is that it seems to be equivalent to a version of Bohr’s theory.
One last attempt to come up with an interpretation of QM is known as
Quantum Field theory. The motivation behind Quantum Field theory is
the reconciliation of QM with Relativity theory. But it also has other
secondary effects that resolve some of the problems of QM. There are
many versions of Quantum Field theory; one of the most interesting more
or less dispenses with particles altogether, and postulates that each point
in spacetime is a quantum system in its own right. Thus the ontology is of
a universal field and particles are explained as excitation states in the
point systems that migrate across the system as waves in the greater
field.
This is not only appealing in that it solves the wave-particle duality
problem, but in that it reconciles wave mechanics and relativity
equations. There is still a problem in that any wave collapse cannot be
reconciled with Relativity, but some seek to solve this by again denying
the existence of the wave collapse, thus adopting another conservative
account of the formalism of QM. This is an attractive option but it has a
lot of technical problems. Paramount amongst these is its complexity,
many versions of it are too complex to compute practically, and those that
are computable seem to imply the denial of 'infinities' through a
'renormalization' process and other mathematical 'fiddle factors'. Even if
these technical
problems can be overcome Quantum Field theory doesn’t solve all our
philosophical problems. While it removes wave-particle
Complementarity, the dynamical forms of Complementarity involved in
the propagation of energy within the field remain. That is, the wave
trajectories within the field (whatever it turns out to be) are still in
superposition, which is a particular problem if we deny wave collapse.
Therefore many of the philosophical problems associated with
superposition remain.
To end this week I want to say a little about the ethical problems raised
by our interpretation of QM. I will say more about this next week but I
think it will be useful to give a brief preview tonight. Basically the issue
is one of free will. Obviously the exact problem depends on our view on
moral philosophy itself, which would demand a separate series of lectures
in its own right; however, I will simplify things by stating that in my
opinion the only issue
in ethics that really matters is the exercise of individual free will, without
which there can be no morality at all. The problem with some versions of
QM (for instance the many-worlds hypothesis and Bohm’s theory) is that
they are strictly deterministic. They allow no room for free will; thus if
we adopt them we have to give up all conception of morality. While the
indeterministic versions do not necessarily help with free will as they are
based on random chance rather than choice, some varieties of QM are
distinctly helpful. In particular I am thinking of those interpretations,
such as Bohr’s, which involve a Kantian view of the world. It is well
known that Kant’s philosophy is amongst
the few philosophies that actually allow for free will to exist. Thus if we
want to retain our moral view of the world we need to
hope that the Kantian versions of QM turn out to be true.
I shall elaborate a little more on this next week, and comment on other
aspects of morality affected by QM, when I look at the larger picture of
Realism and QM and speculate on what actually may be going on in the
world.
A summary of interpretations (from Wikipedia)

Interpretation                  | Deterministic? | Waveform real? | One universe? | Avoids hidden variables? | Local? | Avoids collapsing wavefunctions?
Copenhagen (waveform not real)  | No             | No             | Yes           | Yes                      | No     | Yes
Copenhagen (waveform real)      | No             | Yes            | Yes           | Yes                      | No     | No
Consistent Histories            | No             | No             | Yes           | Yes                      | Yes    | Yes
Consciousness causes collapse   | No             | Yes            | Yes           | Yes                      | No     | No
Everett many-worlds             | Yes            | Yes            | No            | Yes*                     | Yes    | Yes
Bohm interpretation             | Yes            | No             | Yes           | No                       | No     | Yes
Also in 1927, Bohr stated that space-time coordinates and causality are
complementary.