Gentle tomography and efficient universal data compression
Charlie Bennett, Aram Harrow, Seth Lloyd
Caltech IQI, Jan 14, 2003

Classical data compression
• Asymptotically optimal: n copies of {p_i} are compressed to n(H({p_i}) + δ_n) bits with error ε_n, where δ_n, ε_n → 0 as n → ∞.
• Computationally efficient: the running time is polynomial in n.
• Universal: the algorithm works for any i.i.d. source.

Example: Huffman coding
Map letter i to a prefix-free codeword of length −log p_i.
[Figure: binary code tree; branches labeled 0 and 1, leaves A, B, C, D.]

    Letter (i)   Probability (p_i)   Codeword
    A            1/2                 0
    B            1/4                 10
    C            1/8                 110
    D            1/8                 111

The expected number of bits per symbol is −∑_i p_i log p_i = H({p_i}), and the standard deviation of the total length is O(√n).
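
As a sanity check, here is a minimal Python sketch (my own illustration, not from the talk) that recomputes this example; the expected codeword length matches H({p_i}) = 1.75 bits exactly, since every p_i here is a power of 2.

    import math

    # Probabilities and prefix-free codewords from the table above.
    code = {"A": (1/2, "0"), "B": (1/4, "10"), "C": (1/8, "110"), "D": (1/8, "111")}

    expected_len = sum(p * len(w) for p, w in code.values())
    entropy = -sum(p * math.log2(p) for p, _ in code.values())
    print(expected_len, entropy)  # both 1.75 bits per letter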

Quantum data compression
• A quantum source ρ = ∑_j q_j |ψ_j⟩⟨ψ_j| can be diagonalized as ρ = ∑_i p_i |i⟩⟨i| and treated as a classical source.
• The main difference between quantum and classical Huffman coding is that measuring the length of the output damages the state.
• Also, Schumacher compression assumes we know the basis in which ρ is diagonal; it is therefore optimal and efficient, but not universal.
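
A small numerical sketch (my own, using numpy) of the first bullet: diagonalize a density matrix, read off the classical source {p_i}, and compute S(ρ).

    import numpy as np

    # A single-qubit density matrix: Hermitian, positive semidefinite, trace 1.
    rho = np.array([[0.7, 0.2], [0.2, 0.3]])

    # Diagonalize: rho = sum_i p_i |i><i|, with p_i the eigenvalues.
    p, basis = np.linalg.eigh(rho)
    S = -sum(pi * np.log2(pi) for pi in p if pi > 0)  # von Neumann entropy
    print(p, S)  # eigenvalues ~ [0.217, 0.783], S(rho) ~ 0.754 bits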

Universal quantum data compression
• JHHH showed that for any R there exists a subspace H_{n,R} of dimension 2^{nR+o(n)} such that if S(ρ) < R, then most of the support of ρ^⊗n lies in H_{n,R}. (quant-ph/9805017)
• HM showed that you don't need to know the entropy; however, they present no efficient implementation. (quant-ph/0209124)
• JP give an efficient, universal algorithm, but we like ours better. (quant-ph/0210196)

Goal: efficient, universal quantum data compression
• Modeled after classical Huffman coding.
• Pass 1: read the data, determine the type class (empirical distribution), and decide on a code.
• Pass 2: encode.

Step 1: Gentle tomography
• Problem: quantum state tomography damages the states it measures.
• Solution: use weak measurements. For example, in NMR the average value of σ_x can be approximately measured for 10^20 spins without causing decoherence.

Gentle tomography algorithm
• Reduce tomography to estimating d² observables of the form tr(ρσ_k).
• For each estimate:
– Randomly divide the interval 0, …, n into n^{1/4} bins with random boundaries.
– Measure only which bin n·tr(ρσ_k) falls into, as in the simulation below.
[Figure: the interval from 0 to n with n^{1/4} random bin boundaries; n·tr(ρσ_k) is known to within uncertainty O(n^{1/2}), while each bin has width O(n^{3/4}).]
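
A toy classical simulation (my own sketch) of why random bin boundaries help: the estimate's O(n^{1/2}) spread straddles a bin boundary, and hence risks an informative, damaging outcome, with probability only about n^{1/2}/n^{3/4} = n^{−1/4}.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10**6
    value = 0.37 * n           # stand-in for n * tr(rho sigma_k)
    spread = np.sqrt(n)        # O(n^(1/2)) uncertainty in the estimate
    width = n**0.75            # n^(1/4) bins over [0, n], each of width n^(3/4)

    # Over random bin offsets, how often does [value - spread, value + spread]
    # cross a bin boundary?
    offsets = rng.uniform(0, width, size=10**5)
    lo = np.floor((value - spread - offsets) / width)
    hi = np.floor((value + spread - offsets) / width)
    print(np.mean(lo != hi))   # ~ 2 * spread / width = 2 n^(-1/4) ~ 0.063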

Gentle tomography lemma
• If a projective measurement {M_j} has a high probability of yielding a particular outcome k, then obtaining k leaves the state nearly unchanged. (Ahlswede, Winter)
• Thus, if we are very likely to estimate the state correctly, the estimation causes very little damage.
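
A quick numerical check of the lemma (my own sketch): project a pure state onto a subspace that carries all but ε of its weight and compare fidelity before and after.

    import numpy as np

    eps = 0.01
    # |psi> has probability 1 - eps of giving the outcome "project onto |0>".
    psi = np.array([np.sqrt(1 - eps), np.sqrt(eps)])

    P0 = np.diag([1.0, 0.0])           # projector onto the likely outcome
    post = P0 @ psi
    post /= np.linalg.norm(post)       # renormalize the post-measurement state

    print(abs(np.vdot(psi, post))**2)  # fidelity 1 - eps: barely disturbed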

Implementation
Example of a circuit to gently estimate ⟨1|ρ|1⟩:
[Circuit: a counter register of log n qubits initialized to |0⟩^⊗log n; each qubit of ρ^⊗n increments (+1) or decrements (−1) the counter, and the counter is then weakly measured.]
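
In classical terms (my own sketch of the statistics, not the circuit itself), the counter concentrates around n(2p − 1) with spread O(√n), where p = ⟨1|ρ|1⟩, so a coarse readout estimates p to within O(n^{−1/2}).

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10**6
    p = 0.37                       # <1|rho|1>, the quantity to estimate

    # Each qubit of rho^(tensor n) bumps the counter by +1 (|1>) or -1 (|0>).
    bits = rng.random(n) < p
    counter = np.sum(2 * bits - 1)

    p_hat = (counter / n + 1) / 2
    print(p_hat, abs(p_hat - p))   # error O(n^(-1/2)) ~ 0.0005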

Result
• An information-disturbance trade-off curve: error × disturbance > n^{−1/2} (up to logarithmic factors).
• (Can we prove this is optimal?)
• In particular, we can set both the error and the damage equal to n^{−1/4} log n.

Step 2: Compression
• Given an estimate σ with ‖ρ − σ‖ < ε, how do we compress ρ^⊗n?

Huffman coding with an imperfect state estimate
• Suppose we encode ρ^⊗n according to σ = ∑_i q_i |i⟩⟨i|.
• Codeword i has length −log q_i and occurs with probability ⟨i|ρ|i⟩.
• Thus the expected length is ∑_i (−log q_i) ⟨i|ρ|i⟩ = −tr(ρ log σ) = S(ρ) + S(ρ‖σ).
• Unfortunately, S(ρ‖σ) is not a continuous function of σ.
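
A numerical check of the identity in the third bullet (my own sketch; σ is taken diagonal in the computational basis for simplicity):

    import numpy as np

    rho = np.array([[0.7, 0.2], [0.2, 0.3]])
    q = np.array([0.6, 0.4])        # eigenvalues of a diagonal estimate sigma

    # Expected length: sum_i (-log2 q_i) <i|rho|i> in sigma's eigenbasis.
    exp_len = np.sum(-np.log2(q) * np.diag(rho))

    # Compare with S(rho) + S(rho||sigma) = -tr(rho log sigma).
    p = np.linalg.eigvalsh(rho)
    S_rho = -np.sum(p * np.log2(p))
    print(exp_len, S_rho, exp_len - S_rho)  # exp_len = S(rho) + S(rho||sigma)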

Dealing with imperfect estimates
• Replace σ with (1 − δ)σ + δI/d.
• This makes the rate no worse than S(ρ) + ε log(1/εδ).
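
A sketch (hypothetical numbers, my own) of why the smoothing helps: mixing in δI/d bounds every eigenvalue of σ away from zero, so no codeword length −log q_i, and hence no rate penalty, can blow up when the estimate misses a small eigenvalue of ρ.

    import numpy as np

    d, delta = 2, 0.01
    sigma = np.diag([1.0, 0.0])     # estimate that wrongly assigns q_2 = 0
    smoothed = (1 - delta) * sigma + delta * np.eye(d) / d

    rho = np.diag([0.99, 0.01])     # true state has a small second eigenvalue
    exp_len = np.sum(-np.log2(np.diag(smoothed)) * np.diag(rho))
    print(exp_len)  # ~ 0.08 bits; with the raw sigma it would be infinite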

Result
• A polynomial-time algorithm that compresses ρ^⊗n into n(S(ρ) + O(n^{−s} log² n)) qubits with error O(n^{s−1/2} log n), for a tunable parameter s.

Other methods
• Schumacher coding and the JHHH method both achieve S(ρ) + O(k n^{−1/2}) qubits/signal with error exp(−k²).
• The HM method has roughly the same rate-disturbance trade-off curve, though its overflow probability is lower.
• The JP method achieves error n^{−e} and rate S(ρ) + n^{−r} when e = 1/2 − r(1 + d²). For example, compressing qubits with constant error gives rate S(ρ) + O(n^{−1/10}).

Future directions
• Stationary ergodic sources: a visible quantum coding theory exists for these, but the only algorithm is classical Lempel-Ziv.
• Proving converses:
– No exponentially vanishing error.
– No on-the-fly compression.
– Information/disturbance and rate/disturbance trade-offs.

A quantum Clebsch-Gordan transform (Bacon & Chuang)
• H^⊗n = ⊕_{λ⊢n} A_λ ⊗ B_λ, where λ is a partition of n into at most d parts, A_λ is an irreducible representation of SU(d), and B_λ is an irrep of S_n.
• Wanted: an efficient quantum circuit to map |i₁…i_n⟩ → ∑_λ |λ⟩|a_λ⟩|b_λ⟩.
• Useful for universal compression, state estimation, hypothesis testing, and entanglement concentration/dilution/distillation.
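
A concrete bookkeeping check of the decomposition for qubits (d = 2; my own sketch): partitions of n into at most two parts are λ = (n − k, k), dim A_λ = n − 2k + 1 is the SU(2) irrep dimension, and dim B_λ = C(n, k) − C(n, k − 1) by the hook length formula; the blocks must exactly fill (C²)^⊗n.

    from math import comb

    n, d = 8, 2
    total = 0
    for k in range(n // 2 + 1):   # partitions (n - k, k) of n, at most 2 parts
        dim_A = n - 2 * k + 1     # SU(2) irrep of spin (n - 2k)/2
        dim_B = comb(n, k) - (comb(n, k - 1) if k > 0 else 0)  # S_n irrep
        total += dim_A * dim_B
    print(total, d**n)            # both 256: dimensions match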