Quantum Shannon Theory
Patrick Hayden (McGill)
http://www.cs.mcgill.ca/~patrick/QLogic2005.ppt
17 July 2005, Q-Logic Meets Q-Info
Overview
Part I:
What is Shannon theory?
What does it have to do with quantum
mechanics?
Some quantum Shannon theory highlights
Part II:
Resource inequalities
A skeleton key
Information (Shannon) theory
A practical question:
How best to make use of a given communications resource?
A mathematico-epistemological question:
How to quantify uncertainty and information?
Shannon:
Solved the first by considering the second.
A mathematical theory of communication [1948]
Quantifying uncertainty
Entropy: H(X) = -Σx p(x) log2 p(x)
Proportional to entropy of statistical physics
Term suggested by von Neumann
(more on him soon)
Can arrive at definition axiomatically:
H(X,Y) = H(X) + H(Y) for independent X, Y, etc.
Operational point of view…
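To make the definition concrete, here is a minimal Python sketch (illustrative, not from the original slides) computing H(X) from a probability distribution:

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum_x p(x) log2 p(x); terms with p(x) = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))
```

The axiom on the previous slide can be checked numerically: for independent X and Y the entropy of the product distribution is the sum of the individual entropies.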
Compression
Source of independent copies of X:
X1, X2, …, Xn
If X is binary:
0000100111010100010101100101
About nP(X=0) 0's and nP(X=1) 1's
{0,1}^n: 2^n possible strings, but only ~2^{nH(X)} typical strings
Can compress n copies of X to
a binary string of length ~nH(X)
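The typical-set count can be checked directly (illustrative Python, not from the slides): the number of strings with the typical composition, counted by a binomial coefficient, grows like 2^{nH(p)}, exponentially slower than 2^n.

```python
import math

def log2_binom(n, k):
    """log2 of the binomial coefficient C(n, k), via log-gamma."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1)
            - math.lgamma(n - k + 1)) / math.log(2)

n, p = 1000, 0.11
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
# Strings with the typical composition (about np ones) number ~2^{nH(p)},
# exponentially fewer than the 2^n binary strings overall.
print(log2_binom(n, round(n * p)), n * H, n)
```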
Quantifying information
(Venn diagram: circles H(X) and H(Y), overlap I(X;Y), exclusive regions H(X|Y) and H(Y|X), union H(X,Y))
H(X|Y): uncertainty in X when the value of Y is known
H(X|Y) = H(X,Y) - H(Y) = E_Y H(X|Y=y)
Information is that which reduces uncertainty
I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y)
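These identities are easy to evaluate from a joint distribution. A small Python sketch (illustrative, not from the slides), using a binary symmetric channel with flip probability 0.1 as the example:

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution {(x,y): p}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return H(px.values()) + H(py.values()) - H(pxy.values())

# Y is a noisy copy of X seen through a BSC with flip probability 0.1
pxy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(mutual_information(pxy))   # 1 - H(0.1) ≈ 0.531 bits
```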
Sending information
through noisy channels
Statistical model of a noisy channel: p(y|x)
m → Encoding → channel → Decoding → m'
Shannon's noisy coding theorem: In the limit of many uses, the optimal
rate at which Alice can send messages reliably to Bob through the channel is
given by the formula
C = max over input distributions p(X) of I(X;Y)
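For the binary symmetric channel the maximizing input distribution is uniform and the capacity reduces to the well-known closed form 1 - H(f). A quick Python check (illustrative, not from the slides):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# For a BSC with flip probability f, the uniform input maximizes I(X;Y),
# so Shannon's formula reduces to C = 1 - H(f).
def bsc_capacity(f):
    return 1 - h2(f)

print(bsc_capacity(0.11))  # ≈ 0.5 bits per channel use
```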
Shannon theory provides
Practically speaking:
A holy grail for error-correcting codes
Conceptually speaking:
An operationally-motivated way of thinking about correlations
What's missing (for a quantum mechanic)?
Features arising from the linear structure:
entanglement and non-orthogonality
Quantum Shannon Theory
provides
General theory of interconvertibility
between different types of
communications resources: qubits,
cbits, ebits, cobits, sbits…
Relies on a major simplifying assumption:
Computation is free
And a minor simplifying assumption:
Noise and data have regular structure
Quantifying uncertainty
Let ρ = Σx p(x) |x⟩⟨x| be a density operator
von Neumann entropy:
H(ρ) = -tr[ρ log ρ]
Equal to the Shannon entropy of the eigenvalues of ρ
Analog of a joint random variable:
ρAB describes a composite system AB
H(A) = H(ρA) = H(trB ρAB)
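Since H(ρ) is just the Shannon entropy of the spectrum, it is a one-liner numerically. A Python/numpy sketch (illustrative, not from the slides):

```python
import numpy as np

def von_neumann_entropy(rho):
    """H(rho) = -tr[rho log2 rho] = Shannon entropy of the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # 0 log 0 = 0 by convention
    return float(-np.sum(evals * np.log2(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # a pure state: H = 0
maximally_mixed = np.eye(2) / 2             # I/2: H = 1 bit
print(von_neumann_entropy(pure), von_neumann_entropy(maximally_mixed))
```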
Compression
Source of independent copies of ρAB:
(ρAB)⊗n on systems A1B1, A2B2, …, AnBn
No statistical assumptions:
Just quantum mechanics!
dim(effective support of ρB⊗n) ~ 2^{nH(B)} (aka the typical subspace)
Can compress n copies of B to
a system of ~nH(B) qubits while
preserving correlations with A
[Schumacher, Petz]
Quantifying information
H(A|B): uncertainty in A when the value of B is known?
H(A|B) = H(AB) - H(B)
Example: |Φ⟩AB = (|0⟩A|0⟩B + |1⟩A|1⟩B)/√2
ρB = I/2, so H(AB) = 0 and H(B) = 1
H(A|B) = 0 - 1 = -1
Conditional entropy can be negative!
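The slide's example can be verified numerically (illustrative Python/numpy, not from the slides): for the maximally entangled state, H(AB) = 0 while H(B) = 1, so H(A|B) = -1.

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

# Maximally entangled state |Phi> = (|00> + |11>)/sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(phi, phi)                                      # pure: H(AB) = 0
rho_b = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=0, axis2=2)   # tr_A -> I/2

h_cond = vn_entropy(rho_ab) - vn_entropy(rho_b)   # H(A|B) = 0 - 1 = -1
print(h_cond)
```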
Quantifying information
(Venn diagram: circles H(A) and H(B), overlap I(A;B), exclusive regions H(A|B) and H(B|A), union H(AB))
H(A|B): uncertainty in A when the value of B is known?
H(A|B) = H(AB) - H(B)
Information is that which reduces uncertainty
I(A;B) = H(A) - H(A|B) = H(A) + H(B) - H(AB) ≥ 0
Data processing inequality
(Strong subadditivity)
Alice holds A and Bob holds B of a shared state ρAB with mutual information I(A;B)
Bob applies a local operation U to his system over time, yielding B'
I(A;B) ≥ I(A;B'): local processing never increases correlations
Sending classical information
through noisy channels
Physical model of a noisy channel: N
(Trace-preserving, completely positive map)
m → Encoding (into a state) → N → Decoding (measurement) → m'
HSW noisy coding theorem: In the limit of many uses, the optimal
rate at which Alice can send messages reliably to Bob through N is
given by the (regularization of the) formula
C(N) = max over ensembles {p(x), ρx} of I(X;B)
where I(X;B) is evaluated on the classical-quantum state Σx p(x) |x⟩⟨x|X ⊗ N(ρx)
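The quantity I(X;B) on such a classical-quantum state is the Holevo quantity χ = H(Σx p(x)ρx) - Σx p(x)H(ρx). A numpy sketch (illustrative, not from the slides), using two non-orthogonal pure signal states:

```python
import numpy as np

def vn(rho):
    """von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def holevo(probs, states):
    """chi = H(sum_x p_x rho_x) - sum_x p_x H(rho_x)."""
    avg = sum(p * r for p, r in zip(probs, states))
    return vn(avg) - sum(p * vn(r) for p, r in zip(probs, states))

# Two pure but non-orthogonal signal states |0> and |+>: chi < 1 bit,
# reflecting that non-orthogonal states cannot be perfectly distinguished.
zero = np.array([[1.0, 0.0], [0.0, 0.0]])
plus = np.full((2, 2), 0.5)
print(holevo([0.5, 0.5], [zero, plus]))
```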
Sending classical information
through noisy channels
m → Encoding (into a state) → N → Decoding (measurement) → m'
Random codewords X1, X2, …, Xn
The output space B^n has effective dimension 2^{nH(B)};
each codeword's output occupies ~2^{nH(B|A)} dimensions,
so ~2^{nH(B)}/2^{nH(B|A)} = 2^{nI(A;B)} messages can be packed distinguishably
Sending quantum information
through noisy channels
Physical model of a noisy channel: N
(Trace-preserving, completely positive map)
|ψ⟩ ∈ C^d → Encoding (TPCP map) → N → Decoding (TPCP map) → ρ'
LSD noisy coding theorem: In the limit of many uses, the optimal
rate at which Alice can reliably send qubits to Bob ((1/n) log d) through N
is given by the (regularization of the) formula
Q(N) = max over inputs ρ of the coherent information H(B) - H(E) = -H(A|B)
A (negated) conditional entropy!
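The coherent information is concrete to compute. A numpy sketch (illustrative, not from the slides): send half of a Bell pair through a phase-flip channel with error probability p; then H(B) = 1 and H(AB) = H(p), so I_c = 1 - H(p), which matches the known capacity of the dephasing channel.

```python
import numpy as np

def vn(rho):
    """von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

# Send half of a Bell pair through a phase-flip channel with error prob p.
# Coherent information I_c = H(B) - H(AB) = -H(A|B).
def coherent_info_dephasing(p):
    phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    phi_minus = np.array([1.0, 0.0, 0.0, -1.0]) / np.sqrt(2)
    rho_ab = ((1 - p) * np.outer(phi_plus, phi_plus)
              + p * np.outer(phi_minus, phi_minus))
    rho_b = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=0, axis2=2)
    return vn(rho_b) - vn(rho_ab)

print(coherent_info_dephasing(0.0))   # 1.0: a noiseless qubit channel
print(coherent_info_dephasing(0.5))   # 0.0: complete dephasing
```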
Entanglement and privacy:
More than an analogy
Alice sends x = x1 x2 … xn through the broadcast channel p(y,z|x);
Bob receives y = y1 y2 … yn and the eavesdropper Eve receives z = z1 z2 … zn
How to send a private message from Alice to Bob?
From the set of all x, pick 2^{n(I(X;Y)-ε)} random codewords x,
grouped into sets of size 2^{n(I(X;Z)+ε)} that look alike to Eve
Can send private messages at rate I(X;Y) - I(X;Z)
[AC93]
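The rate I(X;Y) - I(X;Z) is easy to evaluate in a degraded wiretap example (illustrative Python, not from the slides): Bob sees X through a binary symmetric channel with flip probability fb, Eve through a noisier one with flip probability fe.

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# With uniform input over a BSC pair,
# I(X;Y) - I(X;Z) = (1 - H(fb)) - (1 - H(fe)) = H(fe) - H(fb).
def private_rate(fb, fe):
    return h2(fe) - h2(fb)

print(private_rate(0.05, 0.20))  # positive: privacy is possible
```

The rate is positive exactly when Eve's channel is noisier than Bob's, matching the intuition behind the codeword-grouping construction above.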
Entanglement and privacy:
More than an analogy
|xiA’
UA’->BE n
|iBE = U n|xi
How to send a private message from Alice to Bob?
Sets of size 2n(I(X:E)+)
All x
Random 2n(I(X:A)-) x
Can send private messages at rate I(X:A)-I(X:E)
D03
Entanglement and privacy:
More than an analogy
Alice prepares Σx px^{1/2}|x⟩A|x⟩A' and sends A' through U_{A'→BE}⊗n,
yielding Σx px^{1/2}|x⟩A|φx⟩BE
How to send a private message from Alice to Bob?
From the set of all x, pick 2^{n(I(X:A)-ε)} random x,
grouped into sets of size 2^{n(I(X:E)+ε)}; note H(E) = H(AB) [SW97]
Can send private messages at rate I(X:A) - I(X:E) = H(A) - H(E) [D03]
Notions of distinguishability
Basic requirement: quantum channels do not increase “distinguishability”
Fidelity:
F(ρ,σ) = {tr[(ρ^{1/2} σ ρ^{1/2})^{1/2}]}^2
F = 0 for perfectly distinguishable states, F = 1 for identical states
F(ρ,σ) = max |⟨φρ|φσ⟩|^2 over purifications |φρ⟩ of ρ and |φσ⟩ of σ
F(N(ρ), N(σ)) ≥ F(ρ,σ)
Trace distance:
T(ρ,σ) = ||ρ - σ||1
T = 2 for perfectly distinguishable states, T = 0 for identical states
T(ρ,σ) = 2 max |p(k=0|ρ) - p(k=0|σ)|
where the max is over POVMs {Mk}
T(ρ,σ) ≥ T(N(ρ), N(σ))
Statements made today hold for both measures
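Both measures are straightforward to compute from the definitions. A numpy sketch (illustrative, not from the slides), checked on orthogonal and partially overlapping qubit states:

```python
import numpy as np

def mat_sqrt(m):
    """Square root of a positive semidefinite matrix via eigendecomposition."""
    ev, vec = np.linalg.eigh(m)
    return vec @ np.diag(np.sqrt(np.clip(ev, 0, None))) @ vec.conj().T

def fidelity(rho, sigma):
    """F(rho, sigma) = (tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
    s = mat_sqrt(rho)
    return float(np.real(np.trace(mat_sqrt(s @ sigma @ s))) ** 2)

def trace_distance(rho, sigma):
    """T(rho, sigma) = ||rho - sigma||_1, the sum of absolute eigenvalues."""
    return float(np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

zero = np.diag([1.0, 0.0])
one = np.diag([0.0, 1.0])
mixed = np.eye(2) / 2
print(fidelity(zero, one), trace_distance(zero, one))      # 0.0, 2.0
print(fidelity(zero, mixed), trace_distance(zero, mixed))  # 0.5, 1.0
```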
Conclusions: Part I
Information theory can be generalized to
analyze quantum information processing
Yields a rich theory of surprising conceptual
simplicity
Operational approach to thinking about
quantum mechanics:
Compression, data transmission, superdense
coding, subspace transmission, teleportation
Some references:
Part I: Standard textbooks:
* Cover & Thomas, Elements of information theory.
* Nielsen & Chuang, Quantum computation and quantum information.
(and references therein)
Part II: Papers available at arxiv.org:
* Devetak, The private classical capacity and quantum capacity of a
quantum channel, quant-ph/0304127.
* Devetak, Harrow & Winter, A family of quantum protocols,
quant-ph/0308044.
* Horodecki, Oppenheim & Winter, Quantum information can be
negative, quant-ph/0505062.