VISUALIZING COSMOLOGICAL TIME
Andrew J. Hanson, Chi-Wing Fu, and Eric A. Wernert
Computer Science Department
Indiana University
Bloomington, IN 47405 USA
[email protected], [email protected], [email protected]
Abstract
Time is a critical aspect of visualization systems that invoke dynamic simulations and animation. Dealing with time at the scales required to conceptualize
astrophysical and cosmological data, however, introduces specialized problems
that require unique approaches. In this paper, we extend our previous investigations on interactive visualization across extremely large scale ranges of space to
incorporate dynamical processes with very large scale ranges of time. We focus
on several issues: time scales that are too short or too long to animate in real
time, those needing complex adjustment relative to the scale of space, time simulations that involve the constant finite velocity of light (special relativity) in an
essential way, and those that depend upon the dynamics of the coordinate system
of the universe itself (general relativity). We conclude that a basic strategy for
time scaling should ordinarily track the scaling of space chosen for a particular problem; e.g., if we are adjusting the interactive space scale, we scale time
in a similar way. At cosmological scales, this has the interesting consequence
that the time scale adjusts to the size of each era of the universe. If we make a
single tick of the viewer’s clock correspond to an enormous time when viewing
an enormous space, we see motions in the viewer’s time of increasingly larger, and
usually appropriate, scales. Adding interactive time-scale controls then permits
the user to switch the focus of attention among animations with distinctly different time features within a single spatial scale. Objects may have an entire time
hierarchy of alternate icons, with different representations for different time-step
scales, exactly analogous to the choice of spatial level-of-detail models.
Keywords:
Visualization, virtual reality, astronomy, time
1. Introduction
Exploring large scale data sets that have a time component requires special
attention to temporal scaling and representation issues. In our recent work
Hanson et al., 2000 on large spatial scales, we described a multilevel approach
to static data sets that were not necessarily large in the quantity of data, but
were large in the number of orders of magnitude of spatial scale required in the
Figure 1. The Earth with time-streaked satellite animations and static stars and galaxies.
representations. Implicit in our treatment was the question of causality: viewing astronomical objects at very large scales requires a number of assumptions
about when we are actually seeing such objects due to the constant finite speed
of light. No attempt was made to deal rigorously with animation and time evolution and their relationship to physical models accepted by astronomers and
cosmologists. In this paper, we propose a family of techniques for dealing with
animation and time visualization at huge scale ranges to fill these gaps.
Our understanding of the nature of the physical universe has improved in recent years, and cosmological concepts have undergone a rapid evolution (see,
e.g., Layzer, 1984; Bernstein, 1998; Carroll, 1999b). We have generally accepted that the large-scale relationships and homogeneities that we see can
only be explained by having the universe expand suddenly in an “inflationary”
jump to the true vacuum state in its earliest moments. We also have resolved
Olbers’ paradox — the fact that the night sky appears black when, in a uniform
universe, it would be brilliantly lit — by invoking the Hubble expansion, so
that all the galaxies are flying away from each other, leaving enough of a gap
to make the night sky dark. Furthermore, we can even attribute different rates
of this expansion to domination of different cosmological processes, with the
very recent observation that we may be currently entering a new period of exponential growth dominated by the Cosmological Constant — the vacuum state
of the universe — rather than radiation or matter Carroll, 1999a. The physical
processes of the very young universe are believed to be accurately described by
the “standard model” of elementary particle physics, with plasmas of quarks,
gluons, leptons, and photons gradually cooling down to form hydrogen, helium, and lithium nuclei in precise ratios. The only step in this process we
can measure directly is the final stage, where the temperature finally decreases
to the point where enough protons and electrons combine to form stable neutral atoms; light could then escape and travel freely into the future. In this
era, about 400,000 years after the Big Bang, the mean path length of radiation increased significantly, and the universe became gradually transparent to
the “flash” of electromagnetic radiation at 3000 K that we see today, after a
thousand-fold Hubble expansion, as the 2.7 K cosmic background radiation.
In this work, we focus on the question of visualizing the Universe relying
as much as possible on observational data. We can in principle treat the eras
prior to the onset of transparency to radiation, and microscopic scales down to
the Planck length of 10^-35 m and the Planck time (the travel time for a light ray
across the Planck length) based on theoretical models of currently unobservable phenomena; however, the great variety of problems involved in building
intuition about the spacetime phenomena we can actually observe experimentally provides sufficient challenges, as we shall see shortly.
In the following sections, we begin with a synopsis of the issues and techniques required to deal interactively with large scale ranges in static systems.
Next, we attempt a taxonomy of the issues one must address when adding
physical time-dependence by simulating dynamical systems at wide ranges of
spacetime scales. Techniques for maintaining precision, visualization strategies, and implementation methods are next, including constrained navigation
techniques and our scripting language for defining objects in multiply scaled
representations. Finally, we explore the unique challenges of cosmological
time itself and show a series of images with our visualization results based on
observational data.
2. Static Large-Scale Methods
The visualization of large-scale astronomical data sets using animations that
are predetermined, as opposed to interactive, has long been a subject of fascination. Several books and films have approached the subject, dating from Kees
Boeke’s monograph “Cosmic View: The Universe in Forty Jumps” Boeke,
1957 and the classic film “Powers of 10” Eames and Eames, 1977; Morrison
and Morrison, 1982 by Charles and Ray Eames to the recent Imax film “Cosmic Voyage” Silleck, 1996, which exploits many recent complex astrophysical
simulations.
In order to go beyond these films, we must provide the viewer with additional modes for interacting with large scale spacetime data sets. Earlier work
by the present authors, Hanson et al., 2000, attacked the problem of handling
the dimension of space in such a way that rescaling did not incur numerical
error, and so that very distant objects could be replaced by simple yet accurate representations. Specific techniques for dealing with the extremely large
volumes of information embodied in recent data (see, e.g., Williams et al.,
1996; Fan et al., 1999; Geller and Huchra, 1989; Cox, 1999) have also attracted
attention (see, e.g., Ostriker and Norman, 1997; Song and Norman, 1994).
The basic philosophy of our approach in Hanson et al., 2000, is to confront
both the numerical errors in large scale transformations and the problem of
efficient data representation for an interactive system; the Universe becomes
far more interesting if we can explore it ourselves without the constraints of a
fixed animation. Our approach to solving these problems includes:
- The definition of a scale-driven data language to support multiple representations.
- The incorporation of constrained navigation to allow some limits on the volumes of the data sets and their viewable regions.
- A pixel-size-driven switching mechanism between full 3D data renderings and approximate environment maps.
- Severe limits on the actual scaling performed on any data set. The key idea is to define all data at unit scale and to transition among representations using scale tags corresponding to logarithms of the true magnitudes.
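The last point, unit-scale data plus logarithmic scale tags, can be illustrated with a small sketch; the class and method names here are our own, not from the paper's actual data language:

```python
import math

class ScaledObject:
    """An object stored at unit scale plus a log10 scale tag."""
    def __init__(self, name, magnitude_m):
        self.name = name
        self.scale_tag = math.log10(magnitude_m)  # log of true size in meters

    def relative_scale(self, other):
        # Ratio of true sizes, computed via the tag difference so that
        # neither huge nor tiny raw magnitudes ever enter the arithmetic.
        return 10.0 ** (self.scale_tag - other.scale_tag)

earth = ScaledObject("Earth", 1.3e7)
galaxy = ScaledObject("Milky Way", 9.5e20)
print(galaxy.relative_scale(earth))  # ratio of the two true sizes, ~7.3e13
```

Transitions between representations then compare small tag values rather than the magnitudes themselves.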
3. Time Representation Issues
In attempting to extend the techniques for handling wide ranges of scales
in static environments to dynamic environments, we encounter both general
issues and scientific issues having to do with our principal proving ground, the
representation of astrophysical and cosmological environments.
The universal issues that we encounter, and which one would need to handle
in many normal situations, include:
Screen Time. We deal in this paper mainly with real-time interactive environments, either on the desktop or in a virtual reality simulation. The
human viewer has his or her own specific perceived time scale, typically quantized in units of screen refresh time; we must on occasion also
account for the human input time scale and human perceptual limitations. The relation between the simulated time and the viewer’s “screen
time” is then the critical issue for animations. The Universe is billions
of years old; major steps in biological evolution happen on scales of
millions of years; Pluto takes 249 years to orbit the Sun; light takes 9
minutes to reach us from the Sun; molecules undergo chemical reactions in microseconds; currents flow through computer chips in less than
a nanosecond. To supply intuitions that assist the visualization of dynamic processes over such a range of time scales, we must rescale the
intervals between time-dependent changes in the simulation so they are
appropriate to the viewer’s expectations and perceptual abilities.
Slow Objects. Just because an object is moving so slowly that it cannot
be perceived as moving on the real-time graphics screen does not mean
that we do not need to know it is moving. In fact, such objects may have
extremely significant motions in the context of the entire visualization,
and therefore we may need somehow to annotate their past and future
positions, however slowly they may move in screen time.
Fast Objects. Conversely, for a given choice of screen time, some contextually significant objects may nevertheless be moving far too fast to
display meaningfully in the visualization. They may not appear on the
screen at all, or perhaps may appear in a single time-sampled frame and
disappear, and yet it may be important to know that they passed through
the field of view and are on their way elsewhere.
Numerical Error. Time simulations over large scale ranges lead to serious problems with numerical accuracy. We must therefore adjust the
actual animation scale to match the user’s observable scale in such a
way that inappropriately large or small numbers do not occur. For example, in an initial test, we had difficulties when trying to animate the
Earth’s rotation with the animation scale corresponding to the rotation
speed of the entire Milky Way; we found that we were rotating the Earth
at such a rate that we hit the machine precision limit in one step. Thus
we typically need recursive time rescaling, similar to the recursive space
rescaling adopted in our earlier work on large-scale static spaces.
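To quantify the screen-time rescaling issue raised above, a few lines of code can tabulate the power-of-ten compression needed to play each process out in a ten-second animation; the durations are our approximate figures, not values from the paper:

```python
import math

# Characteristic durations mentioned in the text, in seconds (approximate):
scales = {
    "age of the Universe": 13.7e9 * 3.15e7,
    "Pluto's orbit":       249.0 * 3.15e7,
    "sunlight to Earth":   9.0 * 60.0,
    "chemical reaction":   1e-6,
    "chip switching time": 1e-9,
}

# Power of ten by which to rescale so each plays out in 10 viewer seconds:
for name, seconds in scales.items():
    print(f"{name}: 10^{math.log10(seconds / 10.0):+.1f}")
```

The exponents span more than 26 orders of magnitude, which is why a single fixed time scale cannot serve an interactive exploration.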
Some specific problems encountered in astrophysical and cosmological applications (which could, in principle, have analogs in other applications) include these:
Periodic Motion. Classic problems of signal processing sampling theory occur whenever an object is moving periodically, either rotating in
place, orbiting about a specific center, or even following a chaotic, nearly
repetitive, path. When the sample rate is too low, completely anomalous
stroboscopic effects appear and must be handled. We typically encounter
this problem for rotating and orbiting heavenly bodies.
The Speed of Light. In nature, the speed of light is finite. Even if the
theory of special relativity were simpler and did not intermingle space
and time in different reference frames, we would have to deal with finite
signal time propagation and Doppler effects resulting from the finiteness
of the speed of light. Our main application will in fact focus on the observation that correct visualizations of experimental data in astrophysics
depend at every juncture on the finite speed of light, long before the
effects of special relativity become important.
Special Relativity. When an observer is moving with high velocity, or if
we are comparing the observations of measurements in different frames
with velocity differences near the speed of light, we need to incorporate
the transformation laws of special relativity. We have done little that
specifically requires this in our present applications; however, the “Mr.
Tompkins” approach (Gamow et al., 1999), in which the scale of spacetime is adjusted to exhibit relativistic phenomena at screen-time scales,
could potentially be very interesting pedagogically in our visualization
environments.
Cosmological Time. When performing astrophysical simulations at extremely large time scales, there are spatial scaling phenomena due to
the experimentally-verified effects of Einstein’s theory of general relativity with a Robertson-Walker metric. Pedagogically, there are astrophysical conventions that compensate for these changes for the sake of
intuitive constancy in a visualization, and also conventions that directly
implement the scale changes to illustrate the general relativistic effects.
Supporting scientifically acceptable visualizations of cosmological time
effects is a central objective of this paper.
4. Techniques for Maintaining Precision
We found recursive scaling of spatial modeling and representations to be imperative in Hanson et al., 2000. When direct modeling techniques are used in
typical systems with OpenGL hardware support, the hardware matrix transformations begin to exhibit significant precision errors at magnitudes around the
square root of the maximum floating point number, and multiple-level rescaling methods are required to compensate. Even if much longer arithmetic word
lengths were supported, it would still be much better programming practice to
control hierarchies of data sets so that both their bare representations and their
approximate depiction choices at different scales are under explicit designer
control. This was accomplished in Hanson et al., 2000, using unit-scale representations for all data sets, combined with pre-rendered 2D environment maps
for viewpoints that could not detect 3D motion of portions of the data.
4.1 Scaled Time Representations
There are similar problems in the time domain; in fact, even closed-form
algebraic time-dependent motions can be very problematic when a single time
scale is used and the elapsed time for the simulation becomes very long. We
therefore extend our spatial scaling philosophy to the time domain:
1. Replace the time range of each dynamically-changing object by a scaled range that controls the possible numerical errors.

2. Assign scaled moving frames that remove translational motion to periodically moving objects, and have their periodic motion expressed in a “rest frame” using a standard time scale plus an additional unit, relating the local time unit to the logarithm of the time scale in global units.

3. Use a limited number of “time scale frames” so that simulations occur in a small time scale range when the object is moving at modest speeds in screen time, and, outside of this range, approximate the object’s motion using a hierarchy of representations.

4. Replace the object’s simulated motion by alternative representations that involve no actual time stepping when the object is out of simulatable range, e.g., a million orbits occurring in one screen-time step are represented as a ring, not a single time sample.
4.2 Adjusting the Time to Scales of Space
The second issue that is important to us addresses the question of adjusting
the range of space on the screen in an astrophysical display. As we move our
spatial viewpoint back, say, from 1 meter, with a 1 meter square field of view,
to 10^14 meters, with a 10^14-meter-square field of view, it makes no sense to
keep the scale of time at one screen second per true second. Nothing moves in
user time at the scale whose dynamics are now of interest to us; a ray of light
would take hours to traverse a single pixel. Thus we ramp up the screen-time
scale so that the observable motions in one observer’s second correspond to
something of interest; these scales may not change uniformly, and so our data
representation allows for customized transitions among time scales to be used
for simulations across the ranges of astrophysical spatial scales. Typically, this
means that the speed of light is scaled as well to be extremely fast; unless there
is some very special observer-velocity effect that is of interest, special relativistic effects can be neglected while preserving visualization features depending
on the finite propagation time or Doppler shift of an observed light ray.
A typical example would be a view of the earth, with a view area of 10^7
meters having a 1000-pixel diameter on the screen, and the globe rotating at a
rate of one full revolution in 10 observer-seconds. In real time, it would take a
ray of light 10^7 / (3 × 10^8) = 1/30 of a second, or one video-frame refresh time,
to cross the space represented on the screen; with one revolution of the earth
taking 10 seconds instead of the real-life 86,400 seconds, light would travel
more than 8,000 screen-widths per second. This is a typical motion rescaling,
so we can often approximate the speed of light as being nearly infinite for
visualization purposes. Conversely, we will systematically choose scales of
time beginning from the earth that are proportionally adjusted to the scale of
space, so the simulated speed of light in screen time is very fast relative to the
viewed diameter of space.
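The policy described above, with the time scale tracking the space scale, can be sketched as follows; the linear ramp and the 1 m anchor are our assumptions, since the paper allows customized, non-uniform transitions among scales:

```python
def sim_seconds_per_screen_second(space_scale_m, anchor_scale_m=1.0):
    """Simulated seconds per viewer second: a linear ramp in the spatial
    scale, anchored at real time for a 1 m field of view (an assumption).
    """
    return space_scale_m / anchor_scale_m

def light_crossing_screen_seconds(space_scale_m):
    """Viewer seconds for light to cross the field of view at this scale."""
    C = 3.0e8  # speed of light, m/s
    real_seconds = space_scale_m / C
    return real_seconds / sim_seconds_per_screen_second(space_scale_m)

# Under this ramp the light-crossing time is the same tiny fraction of a
# viewer second at every scale, so light is effectively instantaneous:
print(light_crossing_screen_seconds(1.0))
print(light_crossing_screen_seconds(1e14))
```

This reproduces the observation in the text: once time is scaled with space, special-relativistic effects can usually be neglected while finite-propagation effects remain representable.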
4.3 Time-Keyed LOD Models
The use of multiple Level-of-Detail (LOD) models keyed to the size of a
rendered graphics object relative to a display pixel has been frequently explored (see, e.g., Reddy, 1997; Maciel and Shirley, 1995; Astheimer and Pöche,
1994; Hitchner and McGreevy, 1993). When we are dealing with large scales
of time, the scale key analogous to the pixel is the smallest time step, the screen
refresh time. In space, if a pixel does not change, the object is far away and
the simplest possible icon is usually appropriate; in time, if a rendering does
not change in one refresh time, a static representation of the dynamics is suitable. When the object changes by a few pixels, we are in the realm of standard
animation, where a changing object will appear to “move” because it does
not wander too far from its last pixel position in one time step. By checking
the displacement in pixels or the changed features of a moving texture (e.g.,
a rotating planet), we can determine whether more complex motion depiction
strategies are required. Our model definitions contain a time-keyed LOD that
invokes differing visualization strategies and object representations according
to a hierarchy of scales in displacement-per-refresh-time: as we have pointed
out, these correspond to no displacement, displacement by, say, 10 pixels, 100
pixels, 1000 pixels, and so on. For static spinning bodies, avoiding motion-sampling artifacts requires a similar scaling for texture displacements of varying magnitudes.
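The displacement-per-refresh hierarchy just described might be dispatched like this; the thresholds and representation names are ours, chosen to match the scales mentioned in the text:

```python
def time_lod(displacement_px_per_refresh):
    """Pick a time-keyed LOD representation from the pixel displacement
    an object makes in one screen refresh. Illustrative sketch only.
    """
    d = displacement_px_per_refresh
    if d < 1:
        return "static icon"        # nothing changes within one refresh
    if d < 10:
        return "direct animation"   # ordinary frame-to-frame motion
    if d < 100:
        return "motion-blur trail"  # blend sub-samples into a streak
    if d < 1000:
        return "path curve"         # draw the trajectory itself
    return "orbit/ring icon"        # e.g., many orbits per refresh
```

Exactly as with spatial LOD, the key is that the selector consults a screen-relative quantity, not the object's absolute speed.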
4.4 Dynamic Constrained Navigation
The constrained navigation (CNav) methods treated in Hanson and Wernert, 1997, dealt implicitly with static environments such as terrain maps, molecules, and architecture. Dynamic environments such as time-dependent astronomy, molecular processes, weather, traffic patterns, and manufacturing require a more general approach. Since our application theme in this paper is
astrophysical simulations, we briefly note here some of the new constrained
navigation methods employed in our implementation.
The first modification is simply to add time-scale parameters to the field
of rendering parameters corresponding to each key-position on the constraint
manifold; just as we ramped up the spatial response in Hanson et al., 2000,
as we moved out to the outer reaches of the cosmos, we specify evolving, but
not necessarily uniformly increasing, time-intervals in order to intelligently
represent the motions of moons, planets, stars, and galaxies in the course of
the user’s travels.
The unique addition that is required by an exploratory system is that one
be able to override the default scale, and scale the time interval up or down to
facilitate examining objects with different intrinsic animation speed within a
single view frustum.
Finally, we allow the designer to attach comoving constrained navigation
manifolds to moving objects of interest (the Earth, the Sun, etc.) in order
to hold such objects fixed within the view. In the present system, transitions
from one such moving system of frame manifolds to another are handled by
direct warping; we plan to return in later work to examine families of more
sophisticated approaches.
5. Motion Visualization Methods
The main goal of this section is to study the representations of objects whose
motions we wish to visualize in a way that gives us useful intuition. The three
basic categories are animations with time scales close to the scale appropriate
to the screen time, those that are too slow for us to deduce the properties of their
motions, and those that move so fast that ordinary animation would produce
drastic sampling-error effects. Special attention will be paid to scales of motion
that are far out of step with the current screen time.
5.1 Screen-Time Objects
Figure 1 shows a simple example of the depictions of astronomical objects
whose motions span a range of scales that are still animatable at screen scales.
The Moon’s finite length “trail” serves as a visualizable velocity field, so even
in the still image, we can tell that the Moon was moving at a certain velocity relative to the overall scale. The Space Shuttle is the object represented
by the already circular blurred orbit trail; it is starting to exhibit sampling errors, and will have the icon turned off and completely replaced by the orbit
icon if we increase the rate of time evolution in this scene. The Earth itself
is making the transition to a motion-blur-anti-aliased texture as the sampling
rate per refresh is getting too close to the rotation period. The main point of
this multiple-component visualization is to emphasize that even objects that are
moving slowly enough to animate directly in screen-time intervals may have
diverse properties; differing velocities in particular need to be distinguished
so that the user can intelligently focus on interesting properties and exercise
appropriate time scale adjustment.
5.2 Slow Objects
In Figure 2, we show schematically how a moving object in the scene is
depicted if it has a significant past and future, but is static at the current time
scale. Fundamentally, we present a trace of the object’s predicted trajectory
Figure 2. An object that is too slow to show motion in the current scale can be annotated to show how many powers separate its motion from the current screen time scale.
Figure 3. Motion-blurred trails of objects moving, orbiting, or rotating too fast for animation in current screen time. An example implementation appears in Figure 1.
coupled with scale marks showing how many powers of 10 are needed to reach
a scale of visible motion starting with the current screen-time scale. The interactive interface will typically have one additional control that allows the user
to readjust the screen time. That is, the scene model itself together with its
navigation fields will have a “default” screen time attached to the scene corresponding to the viewpoint and spatial scale; since there may be many moving
objects in the scene that are of interest, but with vastly different time dependence, the user needs a way to adjust the time scale up and down to check the
intuitive behavior of each object. As this is done, of course, objects migrate
among the possible representation categories — a static object may start to
move in screen time, and vice versa.
5.3 Fast Objects
In Figure 3, we show a representation of an object that is moving too fast
to perceive continuity at the screen pixel level, one that is moving in a fast
periodic orbit, and one that is rotating too fast to see coherent surface texture
warping. Non-periodic paths are denoted by lines or curves with time scale
notation to indicate how to slow down the motion to render it simulatable at
screen scale. Periodic motions are represented as closed paths, again with a
time scale; periodically moving objects centered on a moving body may be
shown pedagogically as a single elliptical curve rather than the more correct
but less informative spiral. Interactive time scale adjustments can be made as
usual to match the screen time to a particular object.
The problem of reconstructing the path or appearance of a fast object and
representing it meaningfully is closely related to the problem of digitally sampling analog data; the Nyquist theorem says that the sampling rate of a signal
must be at least twice its highest frequency in order to accurately capture the periodic content of the signal. The legendary problem of motion blur in computer
graphics, along with the classic “wagon wheel” stroboscopic freezing of rotat-
ing wheels in films are essentially problems that violate the requirements of
accurate sampling; blending together many finely spaced samples to construct
a blurred image or comet trail is a classic approach to producing an intuitively
satisfying visual representation.
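The blend-many-samples remedy mentioned above can be sketched as follows; the toy unit-circle orbit and the function names are hypothetical, not taken from the system:

```python
import math

def blurred_trail(angle_fn, t, refresh_dt, n_samples=32):
    """Approximate motion blur: sample a fast periodic motion many times
    within one refresh interval and return the positions to blend.

    angle_fn maps time -> angle in radians on a unit-circle orbit.
    """
    pts = []
    for i in range(n_samples):
        a = angle_fn(t + refresh_dt * i / n_samples)
        pts.append((math.cos(a), math.sin(a)))
    return pts

# A satellite completing 5 orbits per refresh leaves a nearly closed ring:
pts = blurred_trail(lambda t: 2 * math.pi * 5 * t, 0.0, 1.0, 64)
```

Rendering the returned points with low opacity yields the comet-trail or ring icon; sampling only once per refresh would instead produce the stroboscopic "wagon wheel" artifact.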
6. Implementation and Design of the Multi-Scale Representation
The features described in this paper have been implemented principally in
Iris Performer on SGI-based CAVE™ and CAVE simulator platforms. Individual scene sequences involving cosmological time (details in the following
sections) were implemented as sequences of single frames for an animation
that was the source of Figures 5–9 (see Hanson and Fu, 2000).
To implement scalable representations in our visualization system, we defined a two-level hierarchical model structure in addition to Performer. The
upper level is the scripting language (.aml file). In the scripting language, there
are basically four kinds of commands: CNav, modeling, animation, and path
commands. CNav-related commands define a set of CNav manifolds for the user
to navigate with. Modeling-related commands load in object definition (.obj)
files and instantiate the models in the virtual environment. Animation-related
commands define the interaction parameters (e.g., orbits) relating astronomical
bodies. While parsing the .aml file, the system builds up a graph to describe
the dynamics: the objects are nodes, and the interactions are edges. Traversing
the graph updates the positions of moving astronomical bodies. Finally, the
path-related commands define the center of attention for the virtual environment at each different scale; this allows us to change the center of navigation
naturally as, for example, we move out from the Earth, to the Sun, to the center
of galaxy, and so on.
It is worth noting that, since the virtual environment is constructed from an
object-based graph, it is very easy to update the animation, migrate centers of
attention, and switch between the comoving and physical cosmological coordinate systems to be described in the next section.
7. Time in Cosmology
When we design visualizations at the scale of the visible universe, space and
time intermingle in a manner described by Einstein’s theory of general relativity. In this paper we focus solely on the visible universe and employ, as much
as possible, actual experimental observations. Astrophysicists and cosmologists commonly employ a radially symmetric form for Einstein’s equations, the
Robertson-Walker metric; following Carroll, 1999a, we can write this equation
in the form:

    ds^2 = a(t)^2 [ dr^2 / (1 - k r^2) + r^2 (dθ^2 + sin^2 θ dφ^2) ] - dt^2    (1)
to describe the qualitative features of the evolution of the Universe. When we
choose appropriate energy-momentum tensor assignments for the radiation,
matter, and vacuum-dominated eras such as those proposed in Carroll, 1999a,
we find that the Universe has been expanding, as Hubble observed, with an
effective radius a(t ) as shown in the Figure 4.
[Figure 4 plot: the expansion a(t) rises from the Big Bang through inflation (t ≈ 10^-51), the end of radiation domination (t ≈ 1.6×10^-6), recombination, the “Great Flash” (t ≈ 3×10^-5, a ≈ 0.001), and matter domination, into vacuum domination (from t ≈ 0.73), reaching a(t) = 1.0 today.]
Figure 4. The expansion of the Universe given in terms of its effective radius a(t) as a function of time t, in units of the age of the Universe.
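The qualitative shape of a(t) in Figure 4 can be reproduced with the textbook single-era power laws; this toy model is our simplification, and the era boundaries and matching constants are illustrative assumptions, not values from Carroll, 1999a:

```python
import math

def a_of_t(t, t_rm=3e-6, t_mv=0.75):
    """Piecewise sketch of the scale factor a(t), with t in units of the
    age of the Universe. Era boundaries t_rm (radiation -> matter) and
    t_mv (matter -> vacuum) are illustrative, not fitted values.
    """
    a_rm = t_rm ** (2.0 / 3.0)        # matter-law value at the boundary
    if t <= t_rm:                     # radiation domination: a ~ t^(1/2)
        return a_rm * (t / t_rm) ** 0.5
    if t <= t_mv:                     # matter domination: a ~ t^(2/3)
        return t ** (2.0 / 3.0)
    H = 2.0 / (3.0 * t_mv)            # expansion rate matched at t_mv
    return t_mv ** (2.0 / 3.0) * math.exp(H * (t - t_mv))  # vacuum era

# a(t) grows monotonically and comes out near 1 at the present, t = 1:
print(a_of_t(1.0))
```

The pieces are matched for continuity at the boundaries, which is enough to drive a visualization of the relative sizes of the eras.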
Another variable, the redshift z, where

    a(t) = 1 / (1 + z),    (2)
is traditionally used to describe observed distant objects, their light, and their
estimated ages. We see that z = 0 corresponds to the present era, the time of
the most recent observable light. At the other end of the Universe, around
z = 1100, when the Universe was three orders of magnitude smaller in terms
of linear distance than it is today, the first visible light, sometimes called figuratively “The Great Flash,” was set free to reach our present-day instruments.
Figure 5. Expanding the Cosmic Clock frustum to include the entire Universe.
Figure 6. Corresponding views of the Universe from the tip of the cones at left.
During this era, 300,000–400,000 years after the Big Bang, the temperature of
the Universe cooled sufficiently so that an increasing number of protons and
electrons could combine to make neutral hydrogen, and that in turn gradually
increased the mean path length of radiation so that the Universe became transparent to light. As the
redshift decreased from z = 1100 to z = 0, the temperature characterizing this
radiation dropped from a brilliantly visible cauldron of light at over 3000 K
to the present-day 2.7 K. This is the cosmic background radiation, the earliest
measurable remnant of the birth of the Universe.
8. The Cosmic Clock
Field of View Warping. Figure 5 shows conical volumes representing
viewable portions of the Universe, with the age of the observable light rays
increasing logarithmically with the vertical distance from the bottom of the
cone. The image sequence morphs this “Cosmic Clock” tool from a limited
field-of-view camera frustum at the left to include the entire Universe at the
right; a constant-time sphere in the Universe is a disk-shaped slice in the right-hand cone.
Volume Field Warping. Figure 6 gives an animation sequence corresponding to the view of the Universe from the tip of the cones in Figure 5. This
is accomplished simply by doing an angular warp of the range of spherical
coordinates of each object at constant radius.
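The angular warp at constant radius could look roughly like this. It is a hypothetical sketch: the actual morph of Figure 6 interpolates the warp factor over the animation, and the scene's real coordinate handling is not given in the paper:

```python
import math

def angular_warp(r, theta, phi, s):
    """Warp the angular range at constant radius: scale the polar angle
    theta by factor s while leaving the radius r and azimuth phi fixed,
    so a narrow field of view opens out toward the full sphere."""
    return r, theta * s, phi

def to_cartesian(r, theta, phi):
    """Standard spherical-to-Cartesian conversion for rendering."""
    return (r * math.sin(theta) * math.cos(phi),
            r * math.sin(theta) * math.sin(phi),
            r * math.cos(theta))
```

Since only the angles change, every object stays on its original constant-radius (constant-time) shell, which is what keeps the disk-shaped time slices of the cone intact under the warp.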
Earth Centered View. In Figure 7, we show how to visualize the light coming from distant stars from an Earth-bound viewpoint. The rough 3D positions
for each object in the database are divided into color-coded distance slices.
The color red is assigned to the most distant objects, white to the nearest. To
provide context, we juxtapose the Cosmic Clock, a volumetric cone whose contents represent the volume of the universe we can see with a camera pointed at
the night sky. Using comoving coordinates, which treat the Universe as though
it were always exactly the size it is today, the volume of visible space increases
continuously until we reach the origin of the cosmic background radiation,
about 13 billion years ago.
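A simple distance-to-color ramp of this kind might be implemented as below; the linear interpolation and RGB endpoints are illustrative, since the paper does not give its exact color mapping:

```python
def slice_color(distance, d_near, d_far):
    """Color-code a distance slice: white (1,1,1) for the nearest objects,
    fading to red (1,0,0) for the most distant.  Distances outside the
    [d_near, d_far] range are clamped."""
    t = min(max((distance - d_near) / (d_far - d_near), 0.0), 1.0)
    g = 1.0 - t  # green and blue fade toward zero with distance
    return (1.0, g, g)
```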
External View in Comoving Coordinates. In Figure 8, we show symbolic,
but physically impossible, views of the universe, as though we could watch a
spherical shell of light traveling back in time from our viewpoint on the Earth to
the earliest visible objects. As we reach each landmark (the Earth, the solar
system, the Milky Way, and the most distant galaxies), we mark the scale in
meters as each group of objects enters the visible spherical shell. The Cosmic
Clock cone on the side has been modified: as in Figure 5, it now includes
the entire Universe, warped from a set of spherical shells to disk-shaped shells
layered to make up the cone. We are still in comoving coordinates, and all
distances are rescaled to the current size of the Universe.
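The growing light shell and its landmark crossings can be sketched as follows. The landmark distances are rounded, illustrative values only, and the flat-space approximation r = ct is our simplification; in comoving coordinates the exact shell radius involves an integral over the expansion history:

```python
C = 2.998e8      # speed of light, m/s
YEAR = 3.156e7   # seconds per year

# Approximate distance scales to landmark groups, in meters (illustrative).
LANDMARKS = [
    ("Earth",                 1e7),
    ("solar system",          1e13),
    ("Milky Way",             1e21),
    ("most distant galaxies", 1e26),
]

def shell_radius(lookback_years):
    """Radius of the light shell after traveling back for the given time
    (flat-space approximation r = c*t; expansion corrections ignored)."""
    return C * lookback_years * YEAR

def visible_landmarks(lookback_years):
    """Landmark groups that have entered the visible spherical shell."""
    r = shell_radius(lookback_years)
    return [name for name, d in LANDMARKS if d <= r]
```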
External View in Physical Coordinates. In Figure 9, we show views of the expanding
sphere of light using the Robertson-Walker physical rescaling. To the left is a
warping of the cone in comoving coordinates to the true physical coordinates,
where the upper tip is at z = 1100, the origin of the cosmic background radiation, and is about 1/(1 + z) times the size of the Universe in our present
era. The bright light at the upper tip of the warped cone represents the 3000 K
temperature at this early time. The right-hand images match those in Figure 8,
but are rescaled to physical coordinates by a(t) in Figure 4.
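The comoving-to-physical rescaling by a(t) = 1/(1 + z) reduces to a one-line transform per vertex. A minimal sketch, with a hypothetical function name:

```python
def to_physical(comoving_xyz, z):
    """Rescale a comoving position to physical coordinates by the scale
    factor a = 1/(1 + z) of Eq. (2)."""
    a = 1.0 / (1.0 + z)
    return tuple(a * c for c in comoving_xyz)

# At z = 0 positions are unchanged; at z = 1100 every coordinate shrinks
# by a factor of 1101, pinching the cone toward its bright upper tip.
```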
9. Conclusions
We have addressed the problem of effective interactive navigation across
the huge ranges of time scales required to study the entire
physical universe. Our methods included the use of multiscale data representations and the Cosmic Clock technique for representing relative cosmic
time scales as well as the Hubble expansion.
Acknowledgments
This research was supported by NASA grant number NAG5-8163. This
research was also made possible in part by NSF infrastructure grant CDA 9303189 and the generosity of the Indiana University Advanced Visualization
Laboratory. We thank P.C. Frisch, S. Carroll, D. York, D. Eisenstein, and
E. Kolb for their assistance with astrophysics and cosmology issues.
References
Astheimer, P. and Pöche, M.-L. (1994). Level-of-detail generation and its application to virtual
reality. In Proceedings of the VRST ’94 Conference, pages 299–309.
Bernstein, J. (1998). An Introduction to Cosmology. Prentice-Hall.
Boeke, K. (1957). Cosmic View: The Universe in Forty Jumps. John Day.
Carroll, S. (1999a). The cosmological constant. Living Reviews in Relativity. http://www.livingreviews.org: a refereed electronic journal.
Carroll, S. (1999b). Cosmology for string theorists. http://pancake.uchicago.edu/~carroll/tasi99.ps.
Cox, A. N. (1999). Astrophysical Quantities. Springer-Verlag New York, Inc., New York.
Eames, C. and Eames, R. (1977). Powers of Ten. 9 1/2 minute film, made for IBM.
Fan, X., Strauss, M. A., et al. (1999). The discovery of a high-redshift quasar without
emission lines from Sloan Digital Sky Survey commissioning data. Astrophysical Journal,
526:L57–L60.
Gamow, G., Stannard, R., and Edwards, M. (1999). The New World of Mr. Tompkins: George
Gamow’s Classic Mr. Tompkins in Paperback. Cambridge University Press.
Geller, M. and Huchra, J. (1989). Mapping the universe. Science, 246:897–910.
Hanson, A. J., Fu, C.-W., and Wernert, E. A. (2000). Very large scale visualization methods for
astrophysical data. In de Leeuw, W. and van Liere, R., editors, Data Visualization 2000,
pages 115–124. Springer Verlag. Proceedings of the Joint EUROGRAPHICS and IEEE
TCVG Symposium on Visualization, May 29-31, 2000, Amsterdam, the Netherlands.
Hanson, A. J. and Fu, P. C. (2000). Cosmic clock. Siggraph Video Review, vol. 134, scene 5.
Hanson, A. J. and Wernert, E. (1997). Constrained 3D navigation with 2D controllers. In Proceedings of Visualization ’97, pages 175–182. IEEE Computer Society Press.
Hitchner, L. and McGreevy, M. (1993). Methods for user-based reduction of model complexity
for virtual planetary exploration. Proceedings of the SPIE – The Int. Soc. for Optical Eng.,
1913:622–636.
Layzer, D. (1984). Constructing the Universe. Scientific American Books.
Maciel, P. and Shirley, P. (1995). Visual navigation of large environments using textured clusters. In 1995 Symposium on Interactive 3D Graphics, pages 95–102.
Morrison, P. and Morrison, P. (1982). Powers of Ten. Scientific American Books.
Ostriker, J. P. and Norman, M. L. (1997). Cosmology of the early universe viewed through the
new infrastructure. Commun. ACM, 40(11):84–94.
Reddy, M. (1997). Perceptually Modulated Level of Detail for Virtual Environments. PhD thesis CST-134-97, University of Edinburgh, Edinburgh, Scotland.
Silleck, B. (1996). Cosmic voyage. 35 minute film, a presentation of the Smithsonian Institution’s National Air and Space Museum and the Motorola Foundation.
Song, D. and Norman, M. (1994). Looking in, looking out: Exploring multi-scale data with
virtual reality. Comput. Sci. Eng., 1(3):53–64.
Williams, R. E., Blacker, B., Dickinson, M., Dixon, W. V. D., Ferguson, H. C., Fruchter, A. S.,
Giavalisco, M., Gilliland, R. L., Heyer, I., Katsanis, R., Levay, Z., Lucas, R. A., Mcelroy,
D. B., Petro, L., Postman, M., Adorf, H.-M., and Hook, R. (1996). The Hubble Deep Field: Observations, data reduction, and galaxy photometry. Astronomical Journal, 112:1335–1389.