A Reductionist view of
Network Information Theory
Michael Langberg
SUNY Buffalo
Network Information Theory
• The field of network communication is a very rich and intriguing field of study.
• There has been great progress over the last decades, on several communication scenarios.
• Several problems remain open.
• Studies may at times share analytical techniques; however, to some extent, each new problem engenders its own new theory.
• Goal: a unifying theory that may explain the commonalities and differences between problems and solutions.
[Figure: a network with sources s1-s4 and terminals t1-t4.]
Towards a unifying theory
• Individual studies focusing on specific problems have been extremely productive.
• Different perspective: a “conditional” study of network communication problems.
• Focus on connections: compare different communication problems through the lens of reductions.
• We can connect problems without explicitly knowing either of their solutions.
[Figure: two networks N1 and N2, each with sources s1-s4 and terminals t1-t4.]
Overview
• Reductions.
• Preliminaries: Network Coding.
• Simplifying the NC model.
• Is NC hard?
• Reliable and Secure communication.
• Can NC help solve other problems as well?
Reductions
• Definition.
• Example 1.
• Example 2.
• Example 3.
• Index Coding / Network Coding.
• Index Coding / Interference Alignment.
• Multiple Unicast vs. Multiple Multicast NC.
• Network Equivalence.
• Secure Communication vs. MU NC.
• Reliable Communication vs. MU NC.
• 2 Unicast vs. K Unicast NC.
• Index Coding / Distributed storage.
• …
This talk: reductive studies
• Reductions can show that a problem is easy.
• Reductions can show that a problem is hard.
• Reductions allow propagation of proof techniques.
• The study of reductions raises new questions.
• The study of reductive arguments identifies central problems.
• Reductions provide a framework for generating a taxonomy.
• Reductions have the potential to unify and steer future studies.
Noiseless networks: network coding
• Directed network N.
• Source vertices S.
• Terminal vertices T.
• Set of requirements:
  • Transfer information from Si to Tj.
• Objective:
  • Design an information flow that satisfies the requirements.
[Figure: a network with sources S1, S2 and terminals T1, T2, T3.]
Communication
Communication at rate R = (R1,…,Rk) is achievable over instance (N,{(si,ti)}i) with block length n if there exist random variables {Si},{Xe} such that:
• Rate: each source Si is an independent and uniform R.V. with H(Si) = Ri·n (each Si transmits one of 2^(Ri·n) messages).
• Edge capacity: for each edge e of capacity ce, Xe is an R.V. taking values in [2^(ce·n)].
• Functionality: for each edge e we have a function fe from the incoming R.V.'s Xe1,…,Xe,in(e) to Xe (i.e., Xe = fe(Xe1,…,Xe,in(e))).
• Decoding: for each terminal Ti we define a decoding function yielding Si.
• Communication is successful with probability 1−ε over {Si}i.
R = (R1,…,Rk) is “(ε,n)-feasible” if communication is achievable.
R = (R1,…,Rk) is feasible if for all ε > 0 there exists n such that R is (ε,n)-feasible.
Capacity: closure of all feasible R. (A minimal sketch of this model on a toy network follows below.)
[Figure: a network with sources S1-S4 and terminals T1-T4; an internal edge carries Xe = fe(X1, X2, X3).]
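To make the definitions above concrete, here is a minimal sketch (a hypothetical toy instance in Python, not from the talk): one source of rate R = 2 sends its message to a terminal over two parallel unit-capacity edges, with explicit edge functions fe and a terminal decoding function; the zero-error check over all messages shows the rate is (0,1)-feasible.

```python
# Minimal sketch (hypothetical toy instance): the network-coding model above.
# Source s sends one of 2^(R*n) messages to terminal t over two parallel
# unit-capacity edges e1, e2, so rate R = 2 is achievable with n = 1.
n = 1          # block length
R = 2          # source rate: message alphabet of size 2^(R*n)
c_e = 1        # each edge has capacity 1, i.e. alphabet size 2^(c_e*n)

def f_e1(s):   # edge function f_e1: sends the low bit of the message
    return s & 1

def f_e2(s):   # edge function f_e2: sends the high bit of the message
    return (s >> 1) & 1

def decode(x1, x2):   # terminal decoding function: reassemble the message
    return (x2 << 1) | x1

# Zero-error check over all messages: R = (2,) is (0, 1)-feasible here.
assert all(decode(f_e1(s), f_e2(s)) == s for s in range(2 ** (R * n)))
print("rate", R, "is (0,%d)-feasible on the toy network" % n)
```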
Examples
• Example 1.
• Example 2.
Index Coding
[Birk,Bar-Yossef et al.]
• IC is a special case of NC.
• A set S of sources.
• A set T of terminals.
• Each terminal has some subset of the sources (as side information) and wants some subset of the sources.
• The broadcast link has capacity cB.
• All other links have unlimited capacity.
• Objective: satisfy all terminals using broadcast rate cB (a toy sketch follows below).
[Figure: sources s1-s4 feed a broadcast link B of capacity cB; terminals t1-t4.]
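A minimal sketch of the index-coding setup (an assumed toy instance, not from the talk): three one-bit sources, where terminal ti wants si and has the other two sources as side information. A single broadcast symbol, the XOR of all sources, satisfies every terminal, so broadcast rate 1 suffices here.

```python
# Toy index-coding sketch (assumed instance): terminal t_i wants s_i and has
# the other two sources as side information.  One broadcast symbol
# b = s1 XOR s2 XOR s3 (broadcast rate c_B = 1) satisfies all terminals.
from itertools import product
from functools import reduce

wants     = {0: 0, 1: 1, 2: 2}                 # t_i wants s_i
side_info = {0: [1, 2], 1: [0, 2], 2: [0, 1]}  # t_i knows the other sources

for s in product([0, 1], repeat=3):
    b = reduce(lambda x, y: x ^ y, s)          # the single broadcast symbol
    for t, want in wants.items():
        # terminal t XORs its side information out of the broadcast
        decoded = reduce(lambda x, y: x ^ y, [s[j] for j in side_info[t]], b)
        assert decoded == s[want]
print("all terminals satisfied with broadcast rate 1")
```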
M-Multicast to M-Unicast
• [Dougherty Zeger]
• [Wong Langberg Effros]
• [Kamath Tse Wang]
Multiple Unicast Index Coding
• Third step: reduce to Multiple Unicast.
  • Network Coding [Dougherty Zeger].
  • Linear Index Coding [Maleki Cadambe Jafar].
  • General (noisy) networks including IC [Wong Langberg Effros].
[Figure: chain of reductions NC → Index Coding → MU Index Coding; the last step is for zero error [Langberg Effros].]
Simplifying topology
[Figure: an NC instance with sources s1, s2 and terminals t1, t2, and an equivalent IC instance with sources s1-s6 and terminals t1-t6.]
Theorem: for any NC and R one can construct IC and R’ such that for any n: NC is (R,n)-feasible iff IC is (R’,n)-feasible.
• Step 1: present the reduction from NC to IC.
• Step 2: equivalence for linear and general encoding/decoding.
[ElRouayheb Sprintson Georghiades], [Effros ElRouayheb Langberg].
The reduction
[Figure: the NC network, with NC sources, NC edges (X1, X2, X3 entering an edge carrying Xe), and NC terminals, mapped to an IC instance whose sources and terminals are labeled by the NC sources, NC edges, and NC terminals.]
• Index Coding instance:
  • Sources corresponding to the NC sources and to the NC edges.
  • Terminals corresponding to the NC terminals, to the NC edges, and one special terminal.
  • For edge e: terminal te in IC wants IC source Xe and has as side information all IC sources incoming to e in NC.
• IC encodes the topology of NC in its terminals! (A bookkeeping sketch of this construction follows below.)
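A schematic sketch of the bookkeeping described in the bullets above (assumed data layout, not the talk's construction in full): IC sources are the NC sources and NC edges; each NC edge and each NC terminal gets an IC terminal whose demand and side information mirror the NC topology; the special terminal's demand is not spelled out on the slide and is omitted.

```python
# Sketch of the reduction's bookkeeping (assumed data layout): build the
# Index Coding instance from a small Network Coding instance given as a DAG.
nc_edges     = [("s1", "v"), ("s2", "v"), ("v", "t1"), ("v", "t2")]
nc_sources   = ["s1", "s2"]
nc_terminals = {"t1": "s2", "t2": "s1"}   # which NC source each terminal wants

# IC sources: one per NC source and one per NC edge.
ic_sources = nc_sources + [f"X{u}->{v}" for (u, v) in nc_edges]

ic_terminals = []
# 1) one IC terminal per NC edge e = (u, v): wants X_e, has as side
#    information all IC sources incoming to u in NC.
for (u, v) in nc_edges:
    side = ([u] if u in nc_sources else []) + \
           [f"X{a}->{b}" for (a, b) in nc_edges if b == u]
    ic_terminals.append({"wants": f"X{u}->{v}", "has": side})
# 2) one IC terminal per NC terminal t: wants its NC source, has the IC
#    sources for the NC edges entering t.
for t, src in nc_terminals.items():
    side = [f"X{a}->{b}" for (a, b) in nc_edges if b == t]
    ic_terminals.append({"wants": src, "has": side})
# (The construction also uses a special terminal, omitted in this sketch.)

for term in ic_terminals:
    print(term)
```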
Proof technique by reduction:
• Instance of the hard problem → Network Coding instance.
• Solution to the hard problem ↔ solution to the NC problem.
Scalar Linear Coding [Lehman Lehman]:
• Given a 3-SAT instance φ, construct a network coding instance (G,R) such that:
  • Associate 2 sources with each variable, corresponding to TRUE and FALSE.
  • Associate a single terminal with each clause.
  • With each clause, e.g. (xj ∨ xk ∨ xl), associate a subgraph and terminal requirements [Lehman Lehman]. (A bookkeeping skeleton follows below.)
• The reduction works: φ is satisfiable iff (G,R) is feasible.
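The slide only names the high-level gadget mapping, so the sketch below is a hypothetical skeleton of that bookkeeping only: two sources per variable (TRUE/FALSE) and one terminal per clause. The per-clause subgraphs and terminal requirements that make the [Lehman Lehman] reduction work are not specified on the slide and are deliberately omitted.

```python
# Skeleton of the variable/clause bookkeeping described above (hypothetical;
# the clause subgraphs that complete the reduction are omitted).
phi = [("x1", "x2", "x3"), ("x1", "x3", "x4")]   # a toy 3-SAT instance

variables = sorted({lit for clause in phi for lit in clause})
# two sources per variable, corresponding to the TRUE and FALSE assignments
sources = {v: (f"{v}_TRUE", f"{v}_FALSE") for v in variables}
# a single terminal per clause; each would be attached to a clause subgraph
terminals = [f"t_clause_{i}" for i in range(len(phi))]

print(sources)
print(terminals)
```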
What about approximately finding capacity?
• Up to now: finding a scalar-linear NC that obtains capacity is NP-hard.
• Question: is it easy to find a scalar-linear NC that enables communication at rate 50% of the capacity?
• NO! It is “hard” to find a scalar-linear NC that enables communication within any constant factor of capacity (even 0.001%).
• Main idea: use Index Coding and the connection to clique cover [LS]. (A greedy clique-cover sketch follows below.)
• The previous two constructions do not extend when trying to find an NC that approximately meets capacity.
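The clique-cover connection can be made concrete. The sketch below (an assumed toy instance, not the [LS] construction) uses the standard observation that covering the side-information graph by cliques and broadcasting one XOR per clique gives an index code whose length is the number of cliques.

```python
# Sketch (assumed toy instance): a clique cover of the side-information graph
# yields an index code with one broadcast XOR per clique, so the broadcast
# length is at most the clique-cover number.
from itertools import product
from functools import reduce

# terminal i wants message i and has side_info[i]; i ~ j iff each knows the
# other's message, so a clique can share a single XOR transmission.
side_info = {0: {1}, 1: {0}, 2: {3}, 3: {2}}
k = len(side_info)
adj = lambda i, j: j in side_info[i] and i in side_info[j]

cliques = []                       # greedy clique cover of the knowledge graph
for i in range(k):
    for c in cliques:
        if all(adj(i, j) for j in c):
            c.append(i)
            break
    else:
        cliques.append([i])

for msgs in product([0, 1], repeat=k):
    broadcast = [reduce(lambda a, b: a ^ b, (msgs[i] for i in c)) for c in cliques]
    for ci, c in enumerate(cliques):
        for i in c:                # terminal i XORs out the clique-mates it knows
            others = reduce(lambda a, b: a ^ b, (msgs[j] for j in c if j != i), 0)
            assert broadcast[ci] ^ others == msgs[i]
print(f"{len(cliques)} broadcast symbols suffice for {k} terminals")
```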
Secure NC
This work
• Error correction in Network Coding: up to now, well understood!
• Objective: coding against a jammer controlling links.
• Look at a simple open problem:
  • Single source, single terminal.
  • Acyclic networks.
  • All edges have unit capacities.
  • Adversary controls a single link.
  • Some edges cannot be jammed.
  • What is the communication rate?
[Figure: a network from source s to terminal t.]
Determining secure capacity is as hard as determining the MU network coding capacity.
Related example
• A similar setting was studied for wiretap adversaries [HuangHo LangbergKliewer; Chan Grant].
  • Well understood: multicast; uniform links; a single source generating randomness.
  • Not well understood: multiple nodes generate randomness.
• Consider a simple setting:
  • Single source/terminal; acyclic; uniform edge capacities; 1 wiretapped edge; any node can generate randomness.
[Figure: a network from source s to terminal t.]
Results
• Study: acyclic networks, single source, single terminal, adversary controls a single link, edges have unit capacities; some edges cannot be jammed.
• Show: computing capacity is as hard as computing the capacity of Multiple Unicast Network Coding.
• Proof: by reduction.
What next?
• Computing error-correcting capacity is as hard as computing the capacity of MU Network Coding.
• Present proof ideas for zero-error communication.
• Subtleties for standard communication (asymptotic error, asymptotic rate).
Zero error case
• Computing capacity is as hard as computing the capacity of Multiple Unicast Network Coding.
• Input: MU NC problem N.
• Q: is the rate tuple (1,1,…,1) achievable with 0 error?
• Reduction: construct a new network N’.
• Can jam any single link except links leaving s and entering t.
• Thm: (1,1,…,1) is achievable on N iff rate k is achievable on N’.
[Figure: the constructed network N’.]
Zero error case
• Computing capacity is as hard as computing the capacity of Multiple Unicast Network Coding.
• Can jam any single link except links leaving s and entering t.
• Thm: (1,1,…,1) is achievable on N iff rate k is achievable on N’.
• Assume (1,1,…,1) on N.
• The source sends its information on the links ai.
• One error may occur.
• Bi decodes based on majority (see the sketch below).
• A single error will not corrupt.
• Rate k is possible on N’.
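The "decode by majority" step can be illustrated in isolation. A minimal sketch (assumed 3-fold repetition over parallel unit links, not the exact N' construction): the source repeats each symbol on three links, the adversary may jam at most one of them, and the relay recovers the symbol by majority vote, so a single jammed link never corrupts the data.

```python
# Minimal sketch of the majority-decoding step (assumed 3-fold repetition):
# the source repeats a_i on parallel unit links, an adversary may corrupt at
# most one link, and B_i decodes by majority vote.
from collections import Counter

def transmit(symbol, jam_index=None, jam_value=None):
    links = [symbol, symbol, symbol]          # repeat on 3 parallel unit links
    if jam_index is not None:
        links[jam_index] = jam_value          # adversary controls one link
    return links

def majority_decode(links):
    return Counter(links).most_common(1)[0][0]

for symbol in range(4):
    for jam_index in [None, 0, 1, 2]:
        received = transmit(symbol, jam_index, jam_value=(symbol + 1) % 4)
        assert majority_decode(received) == symbol   # one error never corrupts
print("a single jammed link never corrupts the majority decision")
```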
Zero error case
• Computing capacity is as hard as computing the capacity of Multiple Unicast Network Coding.
• Can jam any single link except links leaving s and entering t.
• Thm: (1,1,…,1) is achievable on N iff rate k is achievable on N’.
• Assume rate k is achievable on N’. Want to show (1,1,…,1) on N.
• Operating at full rate (cut set): 1-1 correspondence between the message M; a1…ak; b1…bk.
• Claim (error correction): for M1≠M2, if bi(M1)≠bi(M2) then zi’(M1)≠zi’(M2).
Zero error case
• Computing capacity is as hard as computing the capacity of Multiple Unicast Network Coding.
• Assume rate k is achievable on N’. Want to show (1,1,…,1) on N.
• Operating at full rate (cut set): 1-1 correspondence between the message M; a1…ak; b1…bk.
• Claim (error correction): for M1≠M2, if bi(M1)≠bi(M2) then zi’(M1)≠zi’(M2).
• Assume otherwise: zi’(M1)=zi’(M2). Consider 2 settings:
  • M1 transmitted + error on x1 (corresponds to M1).
  • M2 transmitted + error on y1 (corresponds to M2).
  • The cut value is equal! B1 cannot distinguish between M1 and M2.
  • The terminal cannot distinguish between M1 and M2.
• Hence a 1-1 correspondence between bi and z’i.
Zero error case
• Computing capacity is as hard as computing the capacity of Multiple Unicast Network Coding.
• Assume rate k is achievable on N’. Want to show (1,1,…,1) on N.
• Operating at full rate: 1-1 correspondence between the message M; a1…ak; b1…bk.
• 1-1 correspondence between bi and z’i.
• Same technique: 1-1 correspondence between ai, xi, yi, zi.
• Also a 1-1 correspondence between bi and xi.
• All in all: 1-1 correspondence between zi, xi, bi, z’i.
• This implies the connection between zi and zi’: Multiple Unicast.
Network equivalence
• First explicit reductive paradigm for network communication [Koetter Effros Médard].
[Figure: a “complex” network N bounded between “simple” networks Nin and Nout.]
• “Simple” network: replace individual independent memoryless components by corresponding noiseless components (i.e., Network Coding).
Example: upper bound
• Replace independent memoryless (noisy) components by upper-bounding noiseless components.
[Figure: the “complex” network N and the “simple” network Nout; the noisy component is replaced by a Network Coding component.]
• Prove: any rate tuple R in the capacity region of the original network is also in that of the upper-bounding network.
What is known?
Preserving component-wise communication:
• Point-to-point channels [Koetter Effros Médard]: if a component is a noisy point-to-point channel, then it can be replaced with a “bit pipe” of corresponding capacity (a small sketch follows below).
[Figure: network N and its noiseless counterpart Nout.]
Network Emulation:
• May sound intuitive, but it is definitely not trivial!
• Must prove that any coding scheme that allows communication on N can be converted to one for Nout: end-to-end Network Emulation.
• Must take into account that the link may appear in the middle of the network and its output could be used in “crazy” ways.
• Reliable communication over N does not imply reliable communication over all components of N.
• Nevertheless: for point-to-point channels, the emulation holds.
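For the point-to-point case, the replacement component is just a noiseless link whose rate equals the channel's capacity. A small sketch (assuming a binary symmetric channel as the noisy component; this is an illustration, not the emulation proof itself): compute C = 1 − H(p) and use a bit pipe of that capacity as the noiseless stand-in.

```python
# Sketch (assumed BSC component; illustrative only): a noisy point-to-point
# channel is replaced by a noiseless "bit pipe" whose capacity matches the
# channel capacity C = 1 - H(p).
from math import log2

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    return 1.0 - binary_entropy(p)

p = 0.11                                    # crossover probability of the BSC
bit_pipe_capacity = bsc_capacity(p)         # rate of the noiseless stand-in
print(f"BSC(p={p}) -> bit pipe of capacity {bit_pipe_capacity:.3f} bits/use")
```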
What is known?
• Multiple source/terminal channels: what if the component is, e.g., a broadcast channel? [Koetter Effros Médard]
• In this case (and others) it is known that preserving component-wise communication does not suffice for network emulation.
• Major question: which properties are needed from the bounding component to allow network emulation?
[Figure: a broadcast channel X → (Y1, Y2) and its candidate noiseless replacement X → (Y1, Y2).]
Examples
The edge removal problem [HoEffrosJalali]
What is the guarantee on the loss in rate when experiencing a link failure?
• Assume rate (R1,…,Rk) is achievable on network N.
• Consider the network N\e obtained by removing an edge e of capacity δ.
• What can be said regarding the achievable rate on the new network?
[Figure: network N with sources S1-S4, terminals T1-T4, and a designated edge e; the same network N\e with e removed.]
Edge removal
[Figure: a network with sources S1-S4, terminals T1-T4, and an edge e.]
What is the loss in rate when removing a δ-capacity edge?
• There exist simple instances in which removing an edge of capacity δ will decrease each rate by an additive δ.
• E.g.: the butterfly with a bottleneck consisting of 1/δ edges of capacity δ (see the sketch below).
[Figure: the butterfly network with sources S1, S2, a bottleneck carrying S1+S2, and terminals T1, T2. With the full bottleneck, R=(1,1) is achievable; after removing one δ-capacity bottleneck edge, R=(1−δ,1−δ) is achievable.]
• What is the “price of edge removal” in general?
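The butterfly example can be written out directly. A minimal sketch (one-bit sources and the standard XOR relay; the 1/δ-edge bottleneck variant is obtained by splitting the middle edge into δ-capacity pieces): with the coded bottleneck, R = (1,1) is achievable and each terminal decodes by XOR-ing out the symbol on its direct side edge.

```python
# Minimal sketch of the butterfly example (one-bit sources, standard XOR
# relay).  Sources S1, S2 each send one bit; the unit-capacity bottleneck
# carries S1 XOR S2; terminal T1 wants S1 and also receives S2 directly
# (and vice versa), so R = (1, 1) is achievable.
from itertools import product

for s1, s2 in product([0, 1], repeat=2):
    bottleneck = s1 ^ s2          # the coded symbol S1 + S2 on the middle edge
    t1_decodes = bottleneck ^ s2  # T1 has S2 on its direct edge
    t2_decodes = bottleneck ^ s1  # T2 has S1 on its direct edge
    assert (t1_decodes, t2_decodes) == (s1, s2)
print("R = (1, 1) achievable on the butterfly with a unit bottleneck")
```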
Price of “edge removal”
In several special instances, the removal of a δ-capacity edge causes at most an additive δ decrease in rate [HoEffrosJalali]:
• Multicast: ≤ δ decrease in rate.
• Collocated sources: ≤ δ decrease in rate.
• Linear codes: ≤ δ decrease in rate.
• Is this true for all NC instances?
• Is the decrease in rate continuous as a function of δ?
• Seemingly simple problem, but currently open.
[Figure: a network N with sources S1,…,S4 and terminals T1-T4.]
Edge removal in noisy networks
• In the case of noisy networks, the edge removal statement does not hold.
• Adversarial noise (jamming):
  • Point-to-point communication: x → (adversarial error e) → y = x + e.
  • Adding a side channel of negligible capacity allows sending a hash of the message x from X to Y, turning list decoding into unique decoding [Guruswami] [Langberg] (see the sketch below).
  • Significant difference in rate when the edge is removed.
• Memoryless noise:
  • Multiple access channel: X1, X2 → p(y|x1x2) → Y, with a “cooperation facilitator”.
  • Adding edges with negligible capacity allows a significant increase in communication rate [Noorzad Effros Langberg Ho].
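The jamming example can be illustrated with a toy hash check. A sketch with hypothetical parameters (the actual [Guruswami]/[Langberg] schemes use carefully chosen hash families so that collisions among the list candidates are improbable): the receiver ends up with a short list of candidate messages, the negligible-rate side channel carries a short hash of the transmitted message, and the receiver keeps the candidate whose hash matches.

```python
# Toy sketch of "list decoding -> unique decoding via a low-rate hash"
# (hypothetical parameters).  The main channel leaves the decoder with a short
# list of candidates; the negligible-capacity side channel carries a short
# hash of the transmitted message x, and the decoder keeps the matching one.
import hashlib

def short_hash(msg: str, nbytes: int = 2) -> bytes:
    return hashlib.sha256(msg.encode()).digest()[:nbytes]

x = "the transmitted message"
candidate_list = [x, "a jammed alternative", "another spurious candidate"]

side_channel = short_hash(x)                 # a few bytes: negligible rate
survivors = [m for m in candidate_list if short_hash(m) == side_channel]
print("decoder output:", survivors)          # with high prob. only x remains
```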
What is the price of “edge removal”?
• Network coding: not known, even for the relaxed statement.
• Challenge: designing a code for N given one for N\{e}.
• Nevertheless, one may study the implications if the statement is true … or false … even for the asymptotic version.
• Will show implications for:
  • Reliability in network communication.
  • The assumed topology of the underlying network.
  • The assumed demand structure in communication.
  • The advantages of cooperation in network communication.
1. Reliability: Zero vs. ε error
[Figure: a network N with sources S1-S4 and terminals T1-T4.]
• Assume rate (R1,…,Rk) is achievable on network N with some small probability of error ε > 0.
• What can be said regarding the achievable rate when insisting on zero error?
• What is the cost in rate when assuring zero-error communication as opposed to ε-error communication?
Reliability: Zero vs. ε error
Can one obtain a higher communication rate when allowing an ε-error, as opposed to zero error?
• In general communication models, when source information is dependent, the answer is YES! [SlepianWolf] [Witsenhausen]
[Figure: a channel with inputs X1, X2 and output Y.]
• What about the Network Coding scenario, in which source information is independent and the network is noiseless?
• Is there an advantage of ε error over zero error for general NC?
Price of zero error
[Figure: a network N with sources S1,…,S4 and terminals T1-T4.]
What’s known:
• Multicast: the statement is true [Li Yeung Cai] [Koetter Medard].
• Collocated sources: the statement is true [Chan Grant] [Langberg Effros].
• Linear codes: the statement is true [Wong Langberg Effros].
• Is the statement true in general?
• Is the loss in rate continuous as a function of ε?
Edge removal ⇔ zero error!
• Edge removal is true iff zero error ≈ ε error in NC.
• Edge removal ⇒ zero error [Chan Grant][Langberg Effros]:
  • Assume: network N is R=(R1,…,Rk)-feasible with ε error.
  • Assume: asymptotic edge removal holds.
  • Prove: network N is R-feasible with zero error.
2. Topology of networks
• Network communication is challenging: it combines topology with information.
• Reduction separates information from topology.
• Index Coding has only one network node that performs encoding.
• Recent studies have shown that any network coding instance (NC) can be reduced to a simple instance referred to as index coding (IC) [ElRouayheb Sprintson Georghiades], [Effros ElRouayheb Langberg].
• An efficient reduction allows solving NC using any scheme that solves IC.
[Figure: an NC instance (sources s1, s2; terminals t1, t2) beside its IC instance (sources s1-s6; terminals t1-t6): solve IC, then obtain a solution to NC.]
• Reduction in code design: a code for IC corresponds to a code for NC.
Connecting NC to IC
[Figure: an NC instance (sources s1, s2; terminals t1, t2) beside its IC instance (sources s1-s6; terminals t1-t6): solve IC, then obtain a solution to NC.]
• Theorem: NC is R-feasible iff IC is R’=f(R)-feasible.
• Related question: can one determine the capacity region of NC from that of IC?
• Surprisingly: currently no!
• The reduction breaks down with the closure operation.
Edge removal resolves the Q
[Figure: an NC instance (sources s1, s2; terminals t1, t2) beside its IC instance (sources s1-s6; terminals t1-t6).]
• Assuming the edge removal statement, one can determine the capacity region of NC from that of IC [Wong Langberg Effros].
“Edge removal” implies:
• Zero error ≈ ε error in Network Coding.
• Reduction in capacity vs. reduction in code design.
• Advantages in cooperation in network
communication.
• Assumed demand structure in communication.
3. Source dependence
• Let N be a directed acyclic multiple unicast network.
[Figure: a network N with source-terminal pairs (S1,T1),…,(S4,T4).]
• Up to now we considered independent sources.
• In general, if source information is dependent, it is “easier” to communicate (i.e., cooperation).
• Assume rate (R1,…,Rk) is achievable when the source information S1,…,Sk is slightly dependent: Σi H(Si) − H(S1,…,Sk) ≤ δ (a small sketch of this measure follows below).
• What can be said regarding the achievable rate when the source information is independent?
• What are the rate benefits of shared information/cooperation?
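The dependence measure above is easy to evaluate for a toy joint distribution. A sketch (an assumed two-source example, not from the talk): compute Σi H(Si) − H(S1,…,Sk) and check how small it is, i.e., whether the sources are δ-dependent.

```python
# Sketch (assumed two-source distribution): evaluate the dependence measure
# sum_i H(S_i) - H(S_1,...,S_k) used above to quantify "slightly dependent".
from math import log2

def entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# joint distribution of (S1, S2): almost independent uniform bits
eps = 0.01
joint = {(0, 0): 0.25 + eps, (0, 1): 0.25 - eps,
         (1, 0): 0.25 - eps, (1, 1): 0.25 + eps}

def marginal(joint, index):
    out = {}
    for outcome, p in joint.items():
        out[outcome[index]] = out.get(outcome[index], 0.0) + p
    return out

dependence = sum(entropy(marginal(joint, i)) for i in range(2)) - entropy(joint)
print(f"sum_i H(S_i) - H(S1,S2) = {dependence:.5f} bits")  # small => delta-dependent
```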
Price of “independence”
In several cases, there is a limited loss in rate when comparing δ-dependent and independent source information [Langberg Effros]:
• Multicast: ≤ δ decrease in rate.
• Collocated sources: ≤ δ decrease in rate.
• Is this true for all NC instances?
• Is the decrease in rate continuous as a function of δ?
[Figure: a network N with sources S1,…,S4 satisfying Σi H(Si) − H(S1,…,Sk) ≤ δ, and terminals T1-T4.]
Edge removal ⇒ Source ind. [Langberg Effros]
“Edge removal” implies:
• Zero error = ε error in Network Coding.
• Reduction in capacity vs. reduction in code design.
• Advantages in cooperation in network
communication.
• Multiple Unicast NC can be reduced to 2 unicast.
4. Network demands
• Recent studies have reduced any network communication instance with multiple multicast demands to a multiple unicast instance:
  • Network Coding [Dougherty Zeger] (zero-error setting).
  • Linear Index Coding [Maleki Cadambe Jafar].
  • General (noisy) networks [Wong Langberg Effros].
Network demands
• For the case of Network Coding one can further reduce to 2-unicast! [Kamath Tse Wang]
• Holds only in the limited setting of code design (not capacity) and only for zero error.
• Can one determine the capacity of multiple multicast networks using 2-unicast networks?
• Again, the reduction breaks down in the general setting.
• Let’s connect to edge removal …
Network demands
The asymptotic edge removal statement is true iff the reduction of [Kamath Tse Wang] holds in capacity [Wong Effros Langberg].
NC: multiple multicast capacity can be determined from 2-unicast capacity.
“Edge removal” equivalent:
• Zero error = ε error in Network Coding.
• Reduction in capacity vs. reduction in code design.
• Limited dependence in network coding implies limited capacity advantage.
• Multiple Unicast NC can be reduced to 2-unicast.
• All forms of slackness are equivalent:
  • Reliability, closure, dependence, edge capacity.