Conclusion
I have made no more progress in the general theory of relativity. The
electric field still remains unconnected. Overdeterminism does not work.
Nor have I produced anything for the electron problem. Does the reason
have to do with my hardening brain mass, or is the redeeming idea really
so far away?
Einstein to Ehrenfest, 1920
Despite the spectacular successes of Einstein's theory of relativity, it is sometimes said
that tests of Bell's inequalities and similar quantum phenomena have demonstrated that
nature is, on a fundamental level, incompatible with the local realism on which relativity
is based. However, as discussed in Section 9.6, Bell's inequalities apply only to strictly
non-deterministic theories, so, as Bell himself noted, they do not preclude "local realism"
for a fully deterministic theory. The entire framework of classical relativity, with its
unified spacetime and partial ordering of events, is founded on a strictly deterministic
basis, so Bell's inequalities do not apply. Admittedly the phenomena of quantum
mechanics are incompatible with at least some aspect of the intuitive metrical idea of
locality, but this should not be surprising, because (as discussed in the preceding
sections) the metrical idea of locality is already inconsistent with the pseudo-metrical
structure of spacetime, which forms the basis of modern relativity.
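For reference (a standard statement of the result, not drawn from Section 9.6 itself), the most commonly tested of Bell's inequalities is the CHSH bound on the correlations E(a,b) between measurement outcomes at detector settings a, a' and b, b':

    |E(a,b) - E(a,b')| + |E(a',b) + E(a',b')| \le 2

whereas quantum mechanics predicts values as large as 2\sqrt{2} for suitably chosen settings. The claim above is that the derivation of this bound does not go through for a fully deterministic theory, so its experimental violation does not by itself rule out the kind of local realism on which relativity rests.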
It's tempting to conclude that while modern relativity initiated a revolution in our
thinking about the (pseudo-Riemannian) metrical structure of spacetime, with its singular
null rays and non-transitive equivalencies, the concomitant revolution in our thinking
about the topology of spacetime has lagged behind. Although we long ago decided that
the physically measurable intervals between the events of spacetime cannot be accurately
represented as the distances between the points of a Euclidean metric space, we continue
to assume that the local topology of the set of spacetime events is Euclidean. This
incongruous state of affairs may be due in part to the historical circumstance that
Einstein's special relativity was originally viewed as simply an elegant interpretation of
the existing Lorentz ether theory. According to Lorentz, spacetime really was a Euclidean
manifold with the metric and topology of E4, on top of which was superimposed a set of
functions representing the operational temporal and spatial components of intervals.
It was possible to conceive of this because the singularities in the mapping between the
"real" and "operational" components along null directions implied by the Minkowski line
element were not necessarily believed to be physical. As the validity of Lorentz
invariance was just being established "one order at a time", it wasn't clear whether it
would be valid to all orders. The situation was somewhat akin to the view of some people
today who believe that, although the field equations of general relativity predict a genuine
singularity at the center of a black hole, the laws somehow break down at some
point, or some other unknown effect takes over, and the singularity is
averted. Around 1905 people could think similar things about the implied singularity in
the Lorentz transformation between "real spacetime" and the operational electromagnetic
spacetime, i.e., they could imagine that the Lorentz invariance might break down at some
point short of the singularity. On this basis it made sense, at the time, to continue using the
topology of E4. Hence the original Euclidean topology of Lorentz's absolute spacetime
still lurks just beneath the surface of modern relativity.
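To make these light-like singularities concrete (a standard illustration, not part of the original argument), recall the Lorentz transformation and the Minkowski line element, written here in the (+,-,-,-) convention:

    x' = \gamma (x - vt), \qquad t' = \gamma (t - vx/c^2), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}

    (ds)^2 = c^2 (dt)^2 - (dx)^2 - (dy)^2 - (dz)^2

As v approaches c the factor \gamma diverges, and along null directions the interval between distinct events vanishes, so null separation fails to be transitive: two events can each be null-separated from a third without being null-separated from one another. This is the degeneracy referred to above as the "non-transitive equivalencies" of the pseudo-Riemannian structure.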
However, if we make the judgement that Lorentz invariance applies strictly to all orders
(as Poincare suggested and Einstein brashly asserted in 1905), and the light-like
singularities of the Lorentz transformation are genuine physical singularities, albeit in
some unfamiliar non-transitive sense, and if we thoroughly disavow Lorentz's underlying
"real spacetime" (which plays no role in the theory) and treat the "operational spacetime"
itself as the primary ontological entity, then there seems to be reason to question whether the
assumption of E4 topology is still suitable. This is particularly true if a topology more in
accord with Lorentz invariance would also help to clarify some of the puzzling
phenomena of quantum mechanics.
Of course, it's entirely possible that the theory of relativity is simply wrong on some
fundamental level where quantum mechanics "takes over". In fact, this is probably the
majority view among physicists today, who hope that eventually a theory uniting gravity
and quantum mechanics will be found which will explain precisely how and in what
circumstances the classical theory of relativity fails to accurately represent the operations
of nature, while at the same time explaining why it seems to work as well as it does.
However, it may be worthwhile to remember previous periods in the history of physics
when the principle of relativity was judged to be fundamentally inadequate to account for
observed phenomena. Recall Ptolemy's arguments against a moving Earth, or the 19th
century belief that electromagnetism necessitated a luminiferous ether, or the early-20th
century view that special relativity could never be reconciled with gravity. In each case a
truly satisfactory resolution of the difficulties was eventually achieved, not by discarding
relativity, but by re-interpreting and extending it, thereby gaining a fuller understanding
of its logical content and consequences.