Quantum Computing: A 20 Year Prospectus
Stefan Berteau
Abstract
Quantum computing is a promising alternative computer architecture that is currently the subject of extensive study. It offers many unique advantages over classical computers, particularly silicon-based ones. Due to its unique nature and highly experimental status, however, quantum computing would most likely be relegated to specialized roles even if it were suddenly achieved today.
Contents
1. Introduction
2. The Future of Silicon-based Computing
2.1 Moore’s First Law
2.2 Lithographic Challenges
2.3 Physical Challenges
2.4 Conclusions Regarding the Future of Silicon-based Computing
3. Overview of Quantum Computing
3.1 Definition of a Quantum Computer
3.2 Similarities between QTMs and CTMs
3.3 Primary Differences between QTMs and CTMs
3.3.1 Coherent Superposition
3.3.2 Entanglement
3.4 Algorithm Creation
4. Advantages of Quantum Computing
4.1 Small Size of Qubits
4.2 Massive Parallelism
4.3 Computationally 'Hard' Problems
4.4 Uncomputable Problems
4.5 Conclusions Regarding the Advantages of Quantum Computing
5. Problems with Quantum Computing
5.1 Decoherence
5.2 Speed of Algorithms
5.3 Determination of Quantum Algorithms
5.4 Cost
5.5 Conclusions Regarding the Disadvantages of Quantum Computing
6. Suggested Specialized Roles of Quantum Computing
6.1 Signal Processing
6.2 Cryptography
6.3 Neural Networks
6.4 Quantum Graphics Acceleration
7. Conclusion
1. Introduction
As we begin the 21st century, many researchers have begun to look for ways to replace or augment
traditional silicon-chip-based computing. Not only do some alternatives offer the promise of
continued speed increases after the physical limits of silicon and light-based processes have been
reached, but several offer entirely new modes of computation. For the first time, computational
models are being examined that depart entirely from the way computers have been calculating
since Babbage laid out the plans for his Analytical Engine. In particular, quantum computing can
theoretically allow for the solution of computationally 'Hard' problems in polynomial time, as
well as potentially solving previously uncomputable problems.
Quantum computing, however, is not in a position to replace silicon-based computing anytime
within the near future, and certainly not within the next 20 years, which is the time period
covered by this prospectus. This paper will examine the future of silicon-based computing, as
well as the merits and drawbacks of quantum computing. It will show that even if practical
universal quantum computers are created within the next two decades, and silicon-based
computers reach their limits at the earliest possible predictions, quantum computing will not have
enough of a computing advantage to outweigh the cost of replacing silicon computers in any but
certain very specialized applications. Finally, this paper will consider alternatives for the future
of quantum computing, examining roles that it is likely to fill and suggesting new areas that may
someday benefit from its use.
2. The Future of Silicon-based Computing
Information, by its nature, is physical. While this may initially seem counterintuitive, the
requirement of a physical medium to represent and/or transmit information is unavoidable.
Everything from written text to the information stored in our brains utilizes physical storage, and
the storage is altered as the information is altered, whether it is hoisting a flag to declare loyalty
to a cause or nation, or changing the voltage in a circuit. Given that storage is necessary, and a
physical alteration must be made in order to effect a change in information, the fundamental laws
of thermodynamics suggest an inherent limit to the amount of computing power, under any
architecture, that can be achieved in a finite space.1 Fortunately, the limit imposed by
thermodynamics is not likely to constrain us any time soon. Unfortunately, however, the reason
that limit will not constrain us is because our current method of computing suffers from much
tighter limitations; limitations which must be circumvented or overcome through a continued,
all-out push for new techniques to keep making our chips smaller and faster.
2.1 Moore's First Law
One of the best predictors of the future of silicon-based computing is Moore's First Law.
Simply stated, it is a law of economics which predicts that the number of transistors on a chip
will quadruple every 3 years. This law sets the speed of chip development at an almost
breakneck pace.
Moore's law has held so far, however, only due to a constant stream of innovations from chip manufacturers.
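As a rough illustration of the pace this implies, quadrupling every three years compounds quickly. The short Python sketch below projects transistor counts forward two decades; the starting year and count are assumed round numbers for illustration only, not figures taken from the sources cited in this paper.

    # Illustrative Moore's-law projection: transistor counts quadrupling every 3 years.
    # The starting year and count are assumed round numbers, not figures from this paper.
    start_year, start_count = 2000, 40_000_000

    for years_elapsed in range(0, 22, 3):
        count = start_count * 4 ** (years_elapsed / 3)
        print(f"{start_year + years_elapsed}: ~{count:,.0f} transistors per chip")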
2.2 Lithographic Challenges
Silicon chips are produced through what is essentially a lithographic process. A chip is coated
in photoreactive material, exposed to a laser through what is essentially a 'negative' of a circuit,
and then developed. Problems, however, begin to appear when the circuit elements start to
approach the wavelength of visible light. As of 2001, chip-printing techniques were expected to
reach their physical limit of 0.1 micron within a few years. Researchers from a consortium of
industry and government labs, however, have successfully created a prototype system which uses
Extreme Ultraviolet (EUV) light, whose wavelength is short enough to allow chip manufacturers
to create elements of only 0.03 microns. This is only a very temporary
solution, however, as Moore's law is expected to surpass the limits of EUV before the year
2020.2
2.3 Physical Challenges
The technology used to produce silicon has always overcome obstacles, but some remarkably
large ones are rapidly approaching. According to Lucent Technologies, within the distant, yet
still foreseeable future, "Silicon dioxide may also need replacement as a gate dielectric
material...this critical feature of a transistor will be so thin that the quantum mechanical effect
called tunneling comes into play."3 Quantum tunneling is a phenomenon wherein individual
particles, such as electrons, can spontaneously cross barriers that according to Newtonian physics
they should not be able to pass through. At a scale where this could occur, just one or two
electrons tunneling through a transistor gate could cause it to misfire.4
Another problem that chip-manufacturers will encounter is the fact that at any size smaller than
50 nanometers, it becomes nearly impossible to control the behavior of a feature on the chip.
"On that scale, the mass-production processes would require atomic-scale control for precise
distribution of dopant elements - not layer by layer, but nearly atom by atom."5
2.4 Conclusions Regarding the Future of Silicon-based Computing
Moore's Law has driven silicon-based classical computing at a steady rate of progression since it
was first stated, but this pace cannot be kept forever while continuing to use silicon-based chips.
Whether in 20 years, when our current lithography technology reaches its limits, or later than
that, at some point the drive of Moore's law will create a demand for faster chips that silicon
will not be able to meet. At that point, the market will be open for another
computing architecture to begin replacing silicon. Unless the new architecture can be made
backwards compatible with the instruction sets of silicon chips, the replacement will take some
time.
3. Overview of Quantum Computing
This paper will now proceed to give a brief survey of the subject of quantum computing,
particularly on the most important similarities and differences between quantum computing and
traditional transistor-based computing.
3.1 Definition of a Quantum Computer
In 1985, Deutsch put forth the ideas behind the following definition of a quantum computer, which
is quoted from Steane:6
A quantum computer is a set of n qubits in which the following operations are
experimentally feasible:
1. Each qubit can be prepared in some known state |0>.
2. Each qubit can be measured in the basis {|0>,|1>}
3. A universal quantum gate (or set of gates) can be applied at will to any
fixed-size subset of qubits.
4. The qubits do not evolve other than via the above transformations
The term qubit refers to the quantum equivalent of a binary bit in classical computing: a particle
with some attribute which can be prepared in a known state, altered between that state and
another, and read to determine which state it is in.
3.2 Similarities between QTMs and CTMs
Since quantum and classical computers both deal with binary processing systems, there are
certain distinct similarities between the two. For one thing, logical gates control the
change in information, and strings of these gates allow for circuits, which can be utilized in
algorithms. Both a Quantum Turing Machine (QTM) and a Classical Turing Machine (CTM)
are universal computers, theoretically able to duplicate any discrete-state system, given sufficient
time and memory. Finally, as per the Copenhagen Interpretation of Quantum Theory,
Input/Output for both is in a classical form.7
3.3 Primary Differences between QTMs and CTMs
However much the concepts used to control qubits may resemble those of classical computing,
the fact remains that qubits are of a drastically different nature than transistors within a CTM.
The two primary differences, those that give the quantum computer its unusual characteristics,
are Superposition and Entanglement.8
3.3.1 Coherent Superposition
Coherent Superposition is the remarkable ability of a qubit to enter a state where it
simultaneously occupies the configuration |0> and the configuration |1>. This is an odd concept
to those who inhabit a world of classical physics, thus it is perhaps best explained using the idea
of a classical probabilistic computation tree. What follows is a tree where each node shows a
state of the Probabilistic Turing Machine (PTM). The initial, or highest, node represents the
initial configuration of the machine, and each level down is a state reachable in one step from its
parent node in the level above it. In addition, the same state can appear on the tree more than
once, to be reached by different paths.
On this tree, each connection from parent to child is associated with a probability that, the
machine being in the parent state, it will upon its next step enter that particular child state.
Thus, if it was in state B, it would have a 20% chance of entering state D. A necessary
constraint is that the total probabilities leaving one parent node cannot vary, but must always
sum to 1. That being said, two parent nodes in the same layer will have connection probabilities
that sum to 2, since the normalization applies only between a parent node and its children.
Thus, this computation tree is considered local.9
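A minimal sketch of this locality constraint follows; the states and probabilities are invented for illustration (only the 20% chance of moving from B to D comes from the example above). Each parent's outgoing probabilities must sum to 1, while probabilities belonging to different parents are normalized independently.

    # Probabilistic Turing Machine transitions: each parent state's outgoing
    # probabilities sum to 1; different parents are normalized independently ("local").
    ptm_transitions = {
        "A": {"B": 0.5, "C": 0.5},
        "B": {"D": 0.2, "E": 0.8},   # from state B, a 20% chance of entering state D
        "C": {"D": 0.7, "E": 0.3},
    }

    for parent, children in ptm_transitions.items():
        total = sum(children.values())
        assert abs(total - 1.0) < 1e-9, f"probabilities leaving {parent} must sum to 1"
        print(parent, "->", children, "| outgoing sum =", total)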
This next tree shows the states of a QTM. Again each node represents a state that can be reached
in one step from its parent node; this time, however, each connection is associated not with a
probability but with an amplitude.
The probability of reaching a given configuration is the square of the sum of the amplitudes of
all occurrences of that configuration in that particular layer. Because it is the square of the sum,
and not the sum of the squares, a layer containing two state D's, one with amplitude .5 and the
other with amplitude -.5, yields a probability of 0 of ever observing state D. This way in which
different branches of the tree influence each other is an example of interference, or superposition.
Unlike the PTM's state, the state of a QTM is not considered local. Superposition means that
until a child state is chosen, the state of the QTM is simultaneously the entire level of the tree,
not just a single node in that level. Thus, the total probabilities of an entire level must sum to
one, so this computation tree is considered to be non-local.10
Superpositions are denoted as follows:
We say that at any step i, the computation is in a superposition of all the configurations
|c1>,...,|ck> corresponding to nodes that appear in level i of the tree representing the
computation, each |cj> having amplitude aj. (Borrowing quantum mechanics notation,
we distinguish symbols representing configurations from those representing amplitudes
by placing |> brackets around configuration symbols.) An abbreviated notation for this
superposition is Σj aj|cj>.11
3.3.2 Entanglement
The other important property of quantum particles with regards to quantum computing is
entanglement. Ironically enough, entanglement in its most basic form is the idea that it is
possible for a system of particles to not have their own individual properties. Simply put, there
are certain states wherein two quantum particles possess properties that are each relative to the
properties of the other. If both are measured and they are not in coherent superposition, then the
experimental results make it look as though they are simply two particles, each with its own
spin or other property. If the particles do exist in coherent superposition, however, it quickly
becomes apparent that no matter which particle is measured, the other particle will always exist
in a state which can only be defined relative to the state of the first particle.12
The reason why entanglement is so important to quantum computers is that it allows for the
universal quantum gate that is called for in the definition to be applied to an arbitrary subset of
the total set of qubits in the computer. By entangling several qubits and thereby making their
values relative to one another (without fixing any individual qubit's value, since the qubits
remain in coherent superposition and therefore represent all possible values), a single operation
can be performed on multiple qubits simultaneously.
3.4 Algorithm Creation
There are two types of algorithms that a QTM uses. The first, and most common type, is a
simulated classical algorithm. In this case, the QTM takes advantage of its abilities as a
universal computer and simulates the actions of a classical computer. These algorithms allow
for simple operations such as addition, multiplication, etc.
The other type of algorithm is the true quantum algorithm. This type of algorithm does not
simulate the actions of a classical computer. In fact, a CTM would be potentially unable to
simulate what occurs during the execution of a quantum algorithm. These algorithms are
difficult to work out, and are tailored to very specific uses, such as computationally 'Hard'
problems. Not only must a suitable algorithm be found that can take advantage of the
superposition of qubits, but because of the quantum-mechanical nature of its operations, a
method must also be found to determine which subset of the generated set of all possible results
is the 'correct' one. This often involves running the algorithm multiple times which, despite
seeming slow, still offers a great advantage: the increase in time is linear, as opposed to the
exponential increase a non-quantum algorithm would experience.
4. Advantages of Quantum Computing
4.1 Small Size of Qubits
Not only are qubits already much smaller than the 50 nm scale that classical silicon computing
may never be able to pass, but the property of superposition allows them to simultaneously
represent all possible values. Thus, according to Steane13, "a quantum system can be said to
have n qubits if it has a Hilbert space of 2^n dimensions, and so has available 2^n mutually
orthogonal quantum states." What this means in practical terms is that while a classical
computer can use 4 bits to represent four 1-bit numbers, two 2-bit numbers, or one 4-bit number,
a quantum computer can place 4 qubits into a superposition of all 16 possible 4-bit values at
once. Thus, computational ability as well as data storage increases exponentially as qubits are
added to the system.
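A small NumPy sketch of this exponential growth: an n-qubit register is described by 2^n complex amplitudes, and the uniform superposition built below is just one illustrative state of such a register.

    import numpy as np

    n = 4
    dim = 2 ** n                                        # 16 mutually orthogonal basis states
    state = np.ones(dim, dtype=complex) / np.sqrt(dim)  # equal superposition of all 16 values

    print(dim)                                          # 16
    print(np.isclose(np.sum(np.abs(state) ** 2), 1.0))  # squared amplitudes sum to 1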
4.2 Massive Parallelism
The other important effect that superposition has on quantum computing is that while a set of n
qubits represents 2^n quantum states, an operation applied to those n qubits produces the
superposition of all possible results. In this manner, quantum computers are effectively SIMD
computers under Flynn's classification, only exponentially faster than their classical
counterparts.
4.3 Computationally 'Hard' Problems
A computationally 'Hard' problem is one for which, under classical computing architectures, the
time t required to calculate an answer increases exponentially with the size of the problem.
These problems have confounded computer scientists for decades, if not more than a
century. Even as far back as the Analytical Engine which was never built, virtually all
computers have functioned on the same underlying principles.14 Vast improvements have been
made in size, speed, and storage capacity, but all of these improvements have been linear.
Because of this, there will always come a point where the exponentially growing 'Hard' problem
will quite suddenly be far beyond your reach. Quantum algorithms, on the other hand, have
been found which allow a QTM to perform several computationally 'Hard' problems in
polynomial time. The classic example, and 'killer app' for quantum computers, is Shor's
factorization algorithm. Since factorization can be mathematically shown to be related to the
period of a function, the algorithm for factoring numbers becomes, at its core, a quantum
algorithm for finding the period of a function.
Suppose a function f(x) is periodic with period r, i.e. f(x) = f(x+r). Suppose further that
f(x) can be efficiently computed from x, and all we know initially is that N/2 < r < N for some
N.15
A classical computer is left calculating on the order of N/2 values for x, the same order of
magnitude as the number of divisions needed by a more 'brute-force' approach to factoring.16
That is where the computationally 'Hard' part comes in. In theory, however, the period can be
found quickly by a quantum computer implementing a quantum network and performing a final
Fourier transformation; this transformation can be viewed as a sort of interference between the
various superposed states in the x register, similar to the action of a diffraction grating.17
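The number-theoretic link between period finding and factoring can be sketched classically. In the code below, a brute-force search for the period r stands in for the quantum network and Fourier transform (the only part a QTM would actually speed up); the routine and its parameters are illustrative and are not drawn from the cited sources.

    from math import gcd

    def classical_period(a, N):
        """Brute-force period r of f(x) = a^x mod N; this is the step a QTM would do efficiently."""
        x, value = 1, a % N
        while value != 1:
            value = (value * a) % N
            x += 1
        return x

    def factor_from_period(a, N):
        """Use the period r to extract factors of N, as in Shor's algorithm."""
        r = classical_period(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)
        return None  # unlucky choice of base a; pick another and retry

    print(factor_from_period(2, 15))   # (3, 5): the period of 2^x mod 15 is 4, and gcd(2^2 -/+ 1, 15) gives the factors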
4.4 Uncomputable Problems
The other set of problems at which quantum computing excels is the traditionally uncomputable
problems. Consider Hilbert's tenth problem, for example, which Tien D. Kieu paraphrased as
Given any polynomial equation with any number of unknowns and with integer
coefficients: To devise a universal process according to which it can be determined by a finite
number of operations whether the equation has integer solutions.18
The solution of this problem requires the solution of the Turing halting problem: the problem of
telling in advance whether a Turing machine will ever halt on an arbitrary input, that is, whether
that input lies in the domain defined for an arbitrary partial recursive function. A quantum
function, however, can exist to determine whether a program p will halt on input i.19
Thus, for an example equation (x + 1)^3 + (y + 1)^3 - (z + 1)^3 + cxyz = 0 with c ∈ Z, a
Hamiltonian Hp corresponding to the above equation is constructed, and then the ground state |g>
of the Hamiltonian so constructed has the properties

Nj|g> = nj|g>,
Hp|g> = ((nx + 1)^3 + (ny + 1)^3 - (nz + 1)^3 + c nx ny nz)^2 |g> ≡ Eg|g>,

for some (nx, ny, nz).20
Then a projective measurement of the observables corresponding to the operators Nj is taken. If
there is one unique solution, it will simply yield some values (nx, ny, nz). If there are multiple
solutions, the ground state |g> will be a linear superposition of states of the form
|nx>|ny>|nz>.21
4.5 Conclusions Regarding the Advantages of Quantum Computing
Quantum computing has many advantages over classical computing; however, all of them but the
small size of qubits apply to a rather small set of problems. For any application which requires
the specific abilities granted by a quantum computer, the advantages are such that no other
computing system could possibly suffice. With regard to everyday use, however, if a classical
replacement for silicon-based chips came along at a scale that allowed Moore's law to continue
for a decent period after silicon stopped being a viable option, it would almost certainly offer as
many advantages as quantum computing for the average user.
5. Problems with Quantum Computing
Quantum computing derives its unique advantages from the fact that its smallest computational
entities simply do not work in the same manner as any classical system. This can offer some
equally unique challenges as well.
5.1 Decoherence
One major technical problem, which researchers have been trying to deal with since the first
attempts to build a working quantum computer, is Decoherence. Decoherence is the effect of
interaction between a quantum system in coherent superposition and its surrounding
environment. Just as measuring a quantum system locks it into a set condition, a collision with
any other particles can do the same thing. As Smith points out, any stored information,
anywhere, "will not remain in a time-invariant quantum state forever."
With particular
reference to quantum computing, he notes that "two quantum systems cannot be perfectly
isolated from one another since one cannot build insurmountable potential barriers."22
Zurek23 has estimated the decoherence time of a complex quantum superposition having spatial
extent x and mass m in terms of the temperature T of the surrounding vacuum and the classical
"relaxation time" for the system.
Thus, if, as Smith suggests, a working quantum computer at 1 Kelvin and within a 1 micron sphere
used 1000 molecules of small-protein mass, achieving m = 10^6 AMU, the estimate takes the
specific numerical form worked out by Smith.24
However, if individual atoms are used instead of protein molecules, as is the case in the
experimental ion traps, the numbers change. For example, the ion trap at Oxford uses calcium
ions25, with an atomic weight of 40 AMU, which would yield a total mass of 4x10^4 AMU. A
full calculation of the effect of this change in mass on the decoherence time is beyond the scope
of this paper, but it offers the potential to greatly increase the time until decoherence.
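To give a feel for the scaling, the sketch below assumes that Zurek's estimate takes the commonly quoted form t_D ≈ t_relax * (ħ / (x * sqrt(2 m kB T)))^2; that form, the unit relaxation time, and the prefactor are assumptions of this illustration and should be checked against Zurek23 and Smith24. The sketch only shows how shrinking the mass from 10^6 AMU to 4x10^4 AMU rescales the estimate.

    from math import sqrt

    hbar = 1.0545718e-34   # J*s
    kB   = 1.380649e-23    # J/K
    amu  = 1.66053907e-27  # kg

    def decoherence_time(t_relax, x, mass_amu, T):
        """Assumed form of Zurek's estimate: t_relax * (thermal de Broglie length / extent)^2."""
        lam = hbar / sqrt(2 * mass_amu * amu * kB * T)
        return t_relax * (lam / x) ** 2

    protein = decoherence_time(t_relax=1.0, x=1e-6, mass_amu=1e6, T=1.0)   # Smith's 10^6 AMU case
    ions    = decoherence_time(t_relax=1.0, x=1e-6, mass_amu=4e4, T=1.0)   # 1000 calcium ions
    print(ions / protein)   # 25.0 -- the lighter system decoheres 25 times more slowly, all else equal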
In addition, certain methods have been proposed which would lessen or negate some of the
effects of decoherence in a quantum system. It was initially assumed that the quantum
no-cloning theorem26 prevented error correcting in quantum systems, as any attempt to create
redundancy would destroy the initial copy. However, this obstacle has been overcome by Shor
in his "Scheme for reducing decoherence in quantum memory" In Shor's scheme, a code is
developed that copied one qubit into nine other qubits so that even if one of the nine qubits
decohered, the value could be recovered from another of the nine.27 In addition, Shor has worked
with Calderbank to produce even more efficient error-correcting codes. The workings of these
codes are beyond the scope of this paper, but their effect is theoretically to allow the encoding of
"k qubits into n qubits that correct t errors and have asymptotic rate 1-2H2 (2t/n) as n -> ¥"28
5.2 Speed of Algorithms
As is the case with most SIMD machines, the parallelism which is so efficiently achieved in
quantum computers is only useful in certain situations. Quantum computers could be incredibly
fast at specific tasks, and SIMD architecture is useful in one particular everyday application:
3D graphics in games and other software. However, the majority of algorithms in use on
personal computers today are specifically tailored for serial processing. Many of today's
algorithms could be improved, but doing so would require retraining our software developers as
well as reworking all our existing code bases. The time necessary to make
this change ensures that while quantum computers may enter the market within the next 20 years,
it will be a slow entry, and only in the areas where the particular properties of quantum
computers are needed enough to justify the time and expense of converting to them.
5.3 Determination of Quantum Algorithms
Quantum Algorithms are primarily aimed at solving problems that are classically 'Hard' or
uncomputable. While the use of quantum computers to solve these problems can have many
advantages, particularly in the predicted ability of quantum algorithms to revolutionize
encryption, there are a very limited number of these problems, estimated in the range of two to
ten.29 Furthermore, most of these problems have very little use in daily life, beyond certain very
specific applications. Thus, one of the most distinctive abilities of quantum computers, while
important in certain roles, would contribute very little if quantum processors were brought in as
replacements for silicon chips. This poses no actual drawback to users other than paying for
abilities they do not directly need; thus, it is more likely to slow the advent of quantum
computing than to threaten to stop it.
5.4 Cost
There is no set price for constructing a quantum computer, as currently there are no quantum
computers. When one is created, however, it is safe to assume that it will require cutting-edge
technology, which generally comes with a hefty price tag. While technology tends to drop in
price fairly rapidly after being introduced, the price would be a major limitation to the market for
quantum computers over at least the first few years of their existence, if not for longer.
5.5 Conclusions Regarding the Disadvantages of Quantum Computing
Decoherence is by far the largest impediment to the realization of quantum computing. The
relative speed of algorithms and the difficulty in generating new quantum algorithms are both
problems that will require attention, but neither one is of a level that it could potentially prevent
quantum computing from being achieved. Decoherence, on the other hand, threatens to do just
that. Shor and many other researchers are addressing this issue, however, and if a reliable
quantum error-correcting solution can be implemented, then quantum computing can become a
reality.
Once reliable quantum computing is achieved, however, the speed and difficulty of algorithms
will become more important. These factors will make development for quantum computers
more difficult, and when combined with what will likely be a high cost, will limit the willingness
of businesses to adopt a quantum platform.
6. Suggested Specialized Roles for Quantum Computing
While quantum computing may not be in a position to begin replacing classical computing any
time in the near future, the advantages it displays over classical architectures are too great to be
totally ignored. Therefore, it seems most likely that quantum computing will fall into
specialized roles where it can fully utilize its unique advantages. These roles will almost
certainly include signal processing and cryptography, but they may also extend into other
fields such as neural networks or even graphics acceleration.
6.1 Signal Processing
At the heart of Shor's algorithm lies the ability of a quantum computer to perform a discrete
Fourier transform.30 The basic premise behind a Fourier transform is the idea that any waveform
can be represented as the sum of several sinusoidal waveforms. Because the discrete Fourier
transform involves decomposing a waveform into its component sinusoids, it is central to many
aspects of signal analysis, and it has already been mathematically demonstrated that quantum
algorithms for the Fourier transform would run in linear time, whereas classically it represented
an exponential-time problem.31 In addition to Fourier transforms, algorithms have
also been put forward to utilize the advantages of quantum computers when computing discrete
Cosine transforms, discrete Sine transforms, and the discrete Hartley transform.32
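As a reminder of what the transform itself accomplishes, the classical NumPy sketch below decomposes an invented two-tone waveform into its component sinusoids; it says nothing about the quantum implementation, and the sample rate and frequencies are arbitrary choices for illustration.

    import numpy as np

    n, fs = 256, 256.0                        # number of samples and sample rate (Hz)
    t = np.arange(n) / fs
    signal = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

    spectrum = np.fft.rfft(signal)            # discrete Fourier transform of the real waveform
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
    print(sorted(peaks.tolist()))             # [10.0, 40.0] -- the two component frequencies recovered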
6.2 Cryptography
The application which has generated more interest in quantum computing than any other has
been quantum cryptography. Most forms of modern cryptography hinge on the fact that
brute-force decryption requires the factorization of very large numbers. This brute-force
factorization would take an unworkable amount of time using classical computers, but thanks to
Shor's algorithm,33 quantum computers could cut the required time from exponential to
polynomial, thus completely redefining the field and making a quantum computer a necessity for
any organization involved in cryptography.
6.3 Neural Networks
The science of neural networks lends itself very easily to massively parallel computing, as it
requires huge numbers of Threshold Logic Units (TLUs), each of which must evaluate all of its
inputs, multiply them by the input weights, and then compare the sum of all weighted inputs to
its threshold, determining whether or not it fires and passes its output on to other TLUs.34 In addition
to the many TLUs that must all be updated, quantum computing could potentially be used in the
initial design of neural networks, as the weightings of the connections effectively create a
hyperplane-like decision surface which can curve in many directions, separating responses to various inputs.
In a widely interconnected network, the number of connections grows rapidly as TLUs are
added, and the space of possible weight combinations grows exponentially. Locating the right
combination of weights among these exponentially increasing possibilities would be a task very
well suited to a quantum computer.
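A minimal TLU in code, following the update rule described above; the weights, threshold, and AND example are invented purely for illustration.

    def tlu_fires(inputs, weights, threshold):
        """Threshold Logic Unit: weight each input, sum, and compare the sum to the threshold."""
        weighted_sum = sum(x * w for x, w in zip(inputs, weights))
        return weighted_sum >= threshold

    # Example: a single TLU computing the logical AND of two binary inputs.
    weights, threshold = [1.0, 1.0], 1.5
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", tlu_fires([a, b], weights, threshold))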
6.4 Quantum Graphics Acceleration
The single most important problem area in the creation of 3D graphics is the solution of the
integral rendering equation, first set out by Kajiya.35 This equation is expressed as a 3-point
transport equation, detailing the transfer of light energy along a path from its projecting source,
through its reflection off a second surface, and finally to the receiving surface. There are many
ways to resolve this equation, but to date all the usable approaches have had to be
compromises, because a strict finite element analysis of the formula leads to exponential growth
of the generated matrix relative to the growth of the environment.36 These characteristics, the required evaluation of
arbitrary points in a formula, along with the exponential time to compute using classical
computing methods, suggest that if a quantum algorithm to resolve the rendering equation could
be found, it would allow for truly accurate illumination in 3D scenes within an acceptable time
limitation, which would be an amazing breakthrough for the many industries which rely on
computer-generated imagery.
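For reference, the rendering equation as Kajiya formulated it is usually written in a 3-point form along the lines of

I(x, x') = g(x, x') [ e(x, x') + Integral over all surface points x'' of p(x, x', x'') I(x', x'') dx'' ],

where I(x, x') is the intensity of light passing from point x' to point x, g(x, x') is a geometry (visibility) term, e(x, x') is the light emitted from x' toward x, and p(x, x', x'') is the fraction of light arriving at x' from x'' that is reflected toward x. The exact notation varies between presentations; the point relevant here is that I appears on both sides, under an integral over all surface points, which is what makes a strict finite element expansion grow so quickly.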
7. Conclusion
The disadvantages of quantum computers, coupled with the fact that their greatest advantages
apply only to specific types of problems, make the future of quantum computing as a
widespread, everyday architecture uncertain at best. For everyday use, silicon will continue to
hold for the time being. When it does die out, something other than quantum computing will
likely be its first replacement, except in the specialized areas discussed above and any others
that manage to leverage the theoretical advantages of quantum computing. However, if Moore's
law continues to hold and the quest for size reduction races on, then one day there will be no
choice but to turn to quantum computing, because below a certain size any device must obey
quantum mechanics.
Notes
1 Warren D. Smith, Fundamental physical limits on computation, available from ResearchIndex
<http://citeseer.nj.nec.com/smith95fundamental.html>, 29
2 Ron Kolb, EUV Lithography Making Possible Next Generation of Semiconductors, available
from Berkeley Lab sciencebeat
<http://www.lbl.gov/Science-Articles/Archive/euv_milestone.html>
3 Lucent Technologies, Barriers, turning points, or just more hurdles?
<http://www.lucent.com/minds/trends/trends_v4/04.html>
4 Ibid.
5 Ibid.
6 Andrew Steane, Quantum Computing, available from Los Alamos arXiv.org e-print archive
<http://xxx.lanl.gov/>
7 Werner Heisenberg, Physics and Philosophy (Amherst, NY: Prometheus Books, 1958, 1999),
44
8 R. Cleve and others, Quantum Algorithms Revisited, available from Los Alamos arXiv.org
e-print archive <http://xxx.lanl.gov/>
9 Daniel R. Simon, On the Power of Quantum Computation, _Proceedings of the 35th IEEE
Symposium on Foundations of Computer Science_, 1994, pages 124-134, p. 126
10 Ibid, page 127
11 Ibid, pages 128-129
12 Quantum Theory: weird and wonderful, PhysicsWorld, December 1999, available from
PhysicsWeb <http://physicsweb.org/article/world/12/12/19>
13 Steane
14 Ibid.
15 Ibid.
16 Ibid.
17 Ibid.
18 Tien D. Kieu, Quantum Algorithms for Hilbert's Tenth Problem, available from Los Alamos
arXiv.org e-print archive <http://xxx.lanl.gov/>
19 Ibid.
20 Ibid.
21 Ibid.
22 Smith, 22
23 Wojciech H. Zurek, Decoherence and the transition from quantum to classical. _Physics
Today_, October 1991, 36-44.
24 Smith, 31
25 The Andrew Steane and Derek Stacey Group Home Page, available from
<http://www.qubit.org/research/IonTrap/index.html>
26 W.K. Wootters and W.H. Zurek, A single quantum cannot be cloned, _Nature_ 299, 1982,
802.
27 A.R. Calderbank and Peter W. Shor, Good quantum error-correcting codes exist, available
from Los Alamos arXiv.org e-print archive <http://xxx.lanl.gov/>
28 Ibid.
29 Dyakonov
30 Andreas Klappenecker and Martin Rötteler, On the Irresistible Efficiency of Signal
Processing Methods in Quantum Computing, available from Los Alamos arXiv.org e-print
archive <http://xxx.lanl.gov/>
31 Simon, page 132
32 Klappenecker and Rötteler
33 Steane
34 Nils J. Nilsson, Artificial Intelligence: A New Synthesis (San Francisco: Morgan Kaufmann
Publishers, Inc, 1998), 44
35 J.T. Kajiya, The rendering equation, _Computer Graphics_ 20, no. 4, 1986, 143-150.
36 Larry Aupperle and Pat Hanrahan, A Hierarchical Illumination Algorithm for Surfaces with
Glossy Reflection, _Computer Graphics_, Annual Conference Series, volume 27, 1993,
155-162.