Cooperative Communication
Wireless Communication
Elec 534
Set IV
October 23, 2007
Behnaam Aazhang
Reading for Set 4
• Tse and Viswanath
– Chapters 7, 8
– Appendices B.6, B.7
• Goldsmith
– Chapter 10
Outline
• Channel model
• Basics of multiuser systems
• Basics of information theory
• Information capacity of single antenna single user channels
– AWGN channels
– Ergodic fast fading channels
– Slow fading channels
• Outage probability
• Outage capacity
Outline
• Communication with additional dimensions
– Multiple input multiple output (MIMO)
• Achievable rates
• Diversity multiplexing tradeoff
• Transmission techniques
– User cooperation
• Achievable rates
• Transmission techniques
Dimension
• Signals for communication
– Time period T
– Bandwidth W
– 2WT natural real dimensions
• Achievable rate per real dimension
$$\frac{1}{2}\log\!\left(1 + \frac{P_{av}}{\sigma_n^2}\right)$$
Communication with Additional
Dimensions: An Example
• Adding the Q channel
– BPSK to QPSK
• Modulates both real and imaginary signal dimensions
• Double the data rate
• Same bit error probability
$$P_e = Q\!\left(\sqrt{\frac{2E_b}{N_0}}\right)$$
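The BPSK/QPSK comparison above can be checked by simulation. A small Monte Carlo sketch, assuming Gray-mapped QPSK with independent bit streams on the I and Q dimensions; the Eb/N0 value, seed, and variable names are illustrative:

```python
import math

import numpy as np

rng = np.random.default_rng(0)
eb_n0 = 10 ** (6 / 10)          # Eb/N0 = 6 dB, linear scale
n_bits = 200_000

# BPSK: one bit per symbol on the real dimension, Eb = 1, noise variance N0/2
bits_i = rng.integers(0, 2, n_bits)
n0 = 1 / eb_n0
r_bpsk = (2 * bits_i - 1) + rng.normal(0, math.sqrt(n0 / 2), n_bits)
ber_bpsk = np.mean((r_bpsk > 0) != (bits_i == 1))

# QPSK: a second, independent bit stream rides on the imaginary dimension;
# symbol energy Es = 1 now carries two bits, so Eb = 1/2 and N0 = 0.5/(Eb/N0)
bits_q = rng.integers(0, 2, n_bits)
n0_q = 0.5 / eb_n0
sym = ((2 * bits_i - 1) + 1j * (2 * bits_q - 1)) / math.sqrt(2)
noise = (rng.normal(0, math.sqrt(n0_q / 2), n_bits)
         + 1j * rng.normal(0, math.sqrt(n0_q / 2), n_bits))
r = sym + noise
ber_qpsk = 0.5 * (np.mean((r.real > 0) != (bits_i == 1))
                  + np.mean((r.imag > 0) != (bits_q == 1)))

# both should match Pe = Q(sqrt(2 Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0))
pe_theory = 0.5 * math.erfc(math.sqrt(eb_n0))
```

The estimated bit error rates of the two schemes agree with each other and with the closed form, while QPSK carries twice the bits per symbol.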
Communication with Additional
Dimensions
• Larger signal dimension--larger capacity
– Linear relation
• Other degrees of freedom (beyond signaling)
– Spatial
– Cooperation
• Metric to measure impact on
– Rate (multiplexing)
– Reliability (diversity)
• Same metric for
– Feedback
– Opportunistic access
Multiplexing Gain
• Additional dimension used to gain in rate
• Unit benchmark: capacity of single link AWGN
$$C(SNR) = \log(1 + SNR) \ \text{bits per second per Hertz}$$
• Definition of multiplexing gain
$$r = \lim_{SNR \to \infty} \frac{C(SNR)}{\log(SNR)}$$
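The limit in the definition can be seen numerically for the single-link AWGN benchmark; a sketch, assuming base-2 logarithms (the choice of base cancels in the ratio):

```python
import numpy as np

snr_db = np.array([20.0, 40.0, 60.0, 80.0])
snr = 10 ** (snr_db / 10)

# C(SNR)/log(SNR) tends to the multiplexing gain r = 1 for one AWGN link
ratio = np.log2(1 + snr) / np.log2(snr)
```

The ratio decreases monotonically toward 1 as SNR grows, confirming that a single AWGN link has multiplexing gain 1.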
Diversity Gain
• Dimension used to improve reliability
• Unit benchmark: single link Rayleigh fading
channel
$$\epsilon_{out} \propto \frac{1}{SNR}$$
• Definition of diversity gain
$$d = -\lim_{SNR \to \infty} \frac{\log(\epsilon_{out}(SNR))}{\log(SNR)}$$
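The Rayleigh benchmark above (outage decaying like 1/SNR, i.e. diversity gain 1) can be reproduced by Monte Carlo; a sketch, assuming $|h|^2$ exponentially distributed and a target rate of 1 bit/s/Hz, with the seed and sample sizes illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
R = 1.0                                   # target rate, bits/s/Hz
h2 = rng.exponential(1.0, 2_000_000)      # |h|^2 for a Rayleigh link

snr = 10 ** (np.array([10.0, 20.0, 30.0]) / 10)
p_out = np.array([np.mean(np.log2(1 + s * h2) < R) for s in snr])

# log-log slope of outage versus SNR estimates -d; here d is close to 1
slope = np.polyfit(np.log10(snr), np.log10(p_out), 1)[0]
```

Each 10 dB of SNR buys roughly one decade of outage probability, so the fitted slope is close to -1.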
Multiple Antennas
• Mitigate fading and increase data rate
• Additional degrees of freedom
– virtual/physical channels
– tradeoff between diversity and multiplexing
[Figure: multiple-antenna transmitter and receiver arrays]
Multiple Antennas
• The model
rM R Tc  H M R M T bM T Tc  nM R Tc
where Tc is the coherence time
Basic Assumption
• The additive noise is Gaussian
$$n_{M_R} \sim \text{Gaussian}\!\left(0,\; \frac{N_0}{2}\, I_{M_R \times M_R}\right)$$
• The average power constraint
$$\mathrm{Trace}\{E[\,b_{M_T} b_{M_T}^H\,]\} \le P_{av}$$
Matrices
• A channel matrix
H M R M T
 h11*
 h11  h1M R 


 H
 

 , H M T M R   
 h*
hM 1  hM M 
T
R 
 T
 1M R
• Trace of a square matrix
M
Trace[ H M M ]   hii
i 1
 hM* T 1 


 
*
 hM M 
T R 
Matrices
• The Frobenius norm
$$\|H\|_F = \sqrt{\mathrm{Trace}[HH^H]} = \sqrt{\mathrm{Trace}[H^H H]}$$
• Rank of a matrix = number of linearly independent rows or columns
$$\mathrm{Rank}[H] \le \min\{M_R, M_T\}$$
• Full rank if
$$\mathrm{Rank}[H] = \min\{M_R, M_T\}$$
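The norm and rank statements above can be verified with numpy; a sketch on a random complex matrix (dimensions and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
M_R, M_T = 4, 3
H = rng.normal(size=(M_R, M_T)) + 1j * rng.normal(size=(M_R, M_T))

# Frobenius norm two ways: directly, and via Trace[H H^H] = Trace[H^H H]
fro = np.linalg.norm(H, 'fro')
via_hh1 = np.sqrt(np.trace(H @ H.conj().T).real)
via_hh2 = np.sqrt(np.trace(H.conj().T @ H).real)

# a generic (random) matrix is full rank: Rank[H] = min{M_R, M_T}
rank = np.linalg.matrix_rank(H)
```

All three norm computations agree, and the random matrix attains the full rank bound.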
Matrices
• A square matrix $A$ is invertible if there is a matrix $A^{-1}$ with
$$A A^{-1} = I$$
• The determinant—a measure of how
noninvertible a matrix is!
• A square invertible matrix $U$ is unitary if
$$U U^H = I$$
Matrices
• A vector $x$ is rotated and scaled by a matrix $A$:
$$y = Ax$$
• A vector $x$ is called an eigenvector of the matrix $A$, with eigenvalue $\lambda$, if
$$Ax = \lambda x$$
• Then
$$A = U \Lambda U^H$$
with unitary $U$ and diagonal $\Lambda$
Matrices
• The columns of unitary matrix U are
eigenvectors of A
• Determinant is the product of all eigenvalues
• The diagonal matrix
$$\Lambda = \begin{bmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_N \end{bmatrix}$$
Matrices
• If $H$ is a non-square matrix then
$$H_{M_R \times M_T} = U_{M_R \times M_R}\, \Sigma_{M_R \times M_T}\, V^H_{M_T \times M_T}$$
• Unitary $U$ with columns as the left singular vectors and unitary $V$ with columns as the right singular vectors
• The diagonal matrix
$$\Sigma_{M_R \times M_T} = \begin{bmatrix} \sigma_1 & & 0 \\ & \ddots & \\ 0 & & \sigma_{M_T} \\ & 0 & \end{bmatrix} \quad\text{or}\quad \Sigma = \begin{bmatrix} \sigma_1 & & 0 & \\ & \ddots & & 0 \\ 0 & & \sigma_{M_R} & \end{bmatrix}$$
Matrices
• The singular values of $H$ are the square roots of the eigenvalues of the square matrix $HH^H$:
$$\sigma_i = \mathrm{singular}(H_{M_R \times M_T}) \quad\Longleftrightarrow\quad \sigma_i^2 = \mathrm{eigenvalue}(H_{M_R \times M_T} H^H_{M_T \times M_R})$$
MIMO Channels
• There are $M_T \times M_R$ channels
– Independent if
• Sufficient separation compared to carrier wavelength
• Rich scattering
– At transmitter
– At receiver
• The number of singular vectors of the channel is $\le \min\{M_T, M_R\}$
• The singular vectors are the additional (spatial)
degrees of freedom
Channel State Information
• More critical than in SISO
– CSI at transmitter and receiver
– CSI at receiver
– No CSI
• Forward training
• Feedback or reverse training
Fixed MIMO Channel
• A vector/matrix extension of SISO results
• Very large coherence time
$$\begin{aligned}
I(r_{M_R};\, b_{M_T} \mid H_{M_R \times M_T})
&= h(r_{M_R} \mid H) - h(r_{M_R} \mid b_{M_T}, H) \\
&= h(r_{M_R} \mid H) - h(n_{M_R}) \\
&= h(r_{M_R} \mid H) - M_R \log(\pi e N_0) \\
&\le \log\!\big[(\pi e)^{M_R} \det(N_0 I_{M_R \times M_R} + HQH^H)\big] - M_R \log(\pi e N_0)
\end{aligned}$$
Exercise
• Show that if $X$ is a complex random vector with covariance matrix $Q$, its differential entropy is largest when $X$ is Gaussian
Solution
• Consider a vector $Y$ with the same covariance as the Gaussian vector $X$:
$$\begin{aligned}
h(Y) - h(X) &= -\int f_Y \log f_Y \, dY + \int f_{Gaussian} \log f_{Gaussian} \, dX \\
&= -\int f_Y \log f_Y \, dY + \int f_Y \log f_{Gaussian} \, dY \\
&= \int f_Y \log \frac{f_{Gaussian}}{f_Y} \, dY \;\le\; 0
\end{aligned}$$
Solution
• Since $X$ and $Y$ have the same covariance $Q$, and $\log f_{Gaussian}$ is (up to an additive constant) the quadratic form $-X^H Q^{-1} X$,
$$\int f_{Gaussian} \log f_{Gaussian}\, dX = -\int f_{Gaussian}\,[X^H Q^{-1} X]\, dX + c = -\int f_Y\,[Y^H Q^{-1} Y]\, dY + c = \int f_Y \log f_{Gaussian}\, dY$$
Fixed Channel
• The achievable rate
$$\max_{p_b} I(b; r) = \log\!\big[(\pi e)^{M_R} \det(N_0 I_{M_R \times M_R} + HQH^H)\big] - M_R \log(\pi e N_0) = \log \det\!\left(I_{M_R \times M_R} + \frac{HQH^H}{N_0}\right)$$
with $Q_{M_T \times M_T} = E[bb^H]$ and $E[b^H b] \le P_{av}$
• Differential entropy maximizer is a complex
Gaussian random vector with some covariance
matrix Q
Fixed Channel
• Finding optimum input covariance
• Singular value decomposition of $H$
$$H = U \Sigma V^H = \sum_{m=1}^{\min\{M_R, M_T\}} \sigma_m u_m v_m^H$$
• The equivalent channel
$$\tilde r_{M_R} = \Sigma_{M_R \times M_T}\, \tilde b_{M_T} + \tilde n_{M_R} \quad \text{with } \tilde r = U^H r \text{ and } \tilde b = V^H b$$
Parallel Channels
• At most $\min\{M_T, M_R\}$ parallel channels
$$\tilde r_m = \sigma_m \tilde b_m + \tilde n_m; \quad m = 1, 2, \ldots, \min\{M_R, M_T\}$$
• Power distribution across parallel channels
$$Q_{M_T \times M_T} = E[bb^H] \quad\text{and}\quad E[b^H b] = \mathrm{tr}(Q) = \mathrm{tr}(V \tilde Q V^H) = \mathrm{tr}(\tilde Q) \le P_{av}$$
Parallel Channels
• A few useful notes
$$\begin{aligned}
\log \det\!\left(I_{M_R \times M_R} + \frac{HQH^H}{N_0}\right)
&= \log \det\!\left(I_{M_T \times M_T} + \frac{Q_{M_T \times M_T}\, H^H_{M_T \times M_R} H_{M_R \times M_T}}{N_0}\right) \\
&= \log \det\!\left(I_{M_R \times M_R} + \frac{\Sigma_{M_R \times M_T}\, V^H_{M_T \times M_T} Q V\, \Sigma^H}{N_0}\right) \\
&= \log \det\!\left(I_{M_R \times M_R} + \frac{\Sigma \tilde Q \Sigma^H}{N_0}\right) \quad \text{with } \tilde Q = V^H Q V \\
&\le \log \prod_m \left(1 + \frac{\tilde Q_{mm}\, \sigma_m^2}{N_0}\right)
\end{aligned}$$
Parallel Channels
• A note
$$\det\!\left(I_{M_R \times M_R} + \frac{\Sigma \tilde Q \Sigma^H}{N_0}\right) \le \prod_m \left(1 + \frac{\tilde Q_{mm}\, \sigma_m^2}{N_0}\right)$$
with equality when $\tilde Q$ is diagonal
Fixed Channel
• Diagonal entries found via water filling
• Achievable rate
$$I(r; b) = \sum_{m=1}^{\min\{M_R, M_T\}} \log\!\left(1 + \frac{P_m^*\, \sigma_m^2}{N_0}\right)$$
with power
$$P_m^* = \left(\mu - \frac{N_0}{\sigma_m^2}\right)^{+} \quad\text{with}\quad \sum_m P_m^* = P_{av}$$
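A small sketch of the water-filling allocation over the parallel channels; the function name and example gains are illustrative, and the water level is found by the standard trick of activating channels from strongest to weakest:

```python
import numpy as np

def waterfill(gains, P_av, N0=1.0):
    """Water-filling power allocation over parallel channels.

    gains: channel power gains sigma_m^2; returns P_m with sum(P_m) = P_av."""
    g = np.asarray(gains, dtype=float)
    order = np.argsort(g)[::-1]                 # strongest channel first
    g_sorted = g[order]
    P = np.zeros_like(g)
    for k in range(len(g), 0, -1):
        # candidate water level mu using the k strongest channels
        mu = (P_av + np.sum(N0 / g_sorted[:k])) / k
        p = mu - N0 / g_sorted[:k]
        if p[-1] >= 0:                          # all k allocations non-negative
            P[order[:k]] = p
            break
    return P

gains = np.array([4.0, 1.0, 0.25])
P = waterfill(gains, P_av=1.0)
rate = np.sum(np.log2(1 + P * gains))           # achievable rate, bits
```

With these gains the weakest channel falls below the water level and receives zero power, while the stronger channels share the budget.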
Example
• Consider a 2×3 channel
$$H_{3 \times 2} = \begin{bmatrix} 1 & 1 \\ 1 & 1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 1/\sqrt{3} \\ 1/\sqrt{3} \\ 1/\sqrt{3} \end{bmatrix} \sqrt{6}\, \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix}$$
• The mutual information is maximized at
$$I(r; b) = \log\!\left(1 + \frac{6P}{N_0}\right) \quad\text{with}\quad E[b_i b_j^*] = P/2$$
Example
• Consider a 3×3 channel
$$H = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
• Mutual information is maximized by
$$I(r; b) = 3 \log\!\left(1 + \frac{P}{3N_0}\right) \quad\text{with}\quad Q = \frac{P}{3} I_{3 \times 3}$$
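Both examples can be sanity-checked by evaluating $\log\det(I + HQH^H/N_0)$ directly; a sketch assuming $P = N_0 = 1$ (values chosen only for the check):

```python
import numpy as np

N0 = P = 1.0

# all-ones 3x2 channel: single singular value sqrt(6); putting all power on
# that one spatial mode gives a covariance Q with every entry equal to P/2
H = np.ones((3, 2))
Q = np.full((2, 2), P / 2)
rate1 = np.log2(np.linalg.det(np.eye(3) + H @ Q @ H.T / N0))
# should equal log2(1 + 6 P / N0)

# 3x3 identity channel: equal power Q = (P/3) I maximizes the rate
rate2 = np.log2(np.linalg.det(np.eye(3) + (P / 3) * np.eye(3) / N0))
# should equal 3 log2(1 + P / (3 N0))
```

The determinant evaluations reproduce the closed forms on the slides.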
Ergodic MIMO Channels
• A new realization on each channel use
• No CSI
• CSIR
• CSITR?
Fast Fading MIMO with CSIR
• Entries of H are independent and each complex
Gaussian with zero mean
• If $V$ and $U$ are unitary then the distribution of $H$ is the same as that of $UHV^H$
• The rate
$$I(H_{M_R \times M_T}, r_{M_R};\, b_{M_T}) = I(H; b) + I(r; b \mid H) = I(r; b \mid H) = E\big[I(r; b \mid H = h)\big]$$
MIMO with CSIR
• The achievable rate
$$\max_{p_b} I(r; b) = \log\!\big[(\pi e)^{M_R} \det(N_0 I_{M_R \times M_R} + HQH^H)\big] - M_R \log(\pi e N_0)$$
since the differential entropy maximizer is a
complex Gaussian random vector with some
covariance matrix Q
Fast Fading and CSIR
• Finally,
$$I(b; r) = E\left[\log \det\!\left(I_{M_R \times M_R} + \frac{HQH^H}{N_0}\right)\right]$$
with $Q_{M_T \times M_T} = E[bb^H]$ and $E[b^H b] \le P_{av}$
• The scalar power constraint
• The capacity achieving signal is circularly
symmetric complex Gaussian (0,Q)
MIMO CSIR
• Since $Q$ is non-negative definite, $Q = UDU^H$ and
$$I(b; r) = E\left[\log\det\!\left(I + \frac{HQH^H}{N_0}\right)\right] = E\left[\log\det\!\left(I + \frac{(HU)D(HU)^H}{N_0}\right)\right]$$
• Focus on non-negative definite diagonal $Q$
• Further, the optimum is $Q = \frac{P_{av}}{M_T} I_{M_T \times M_T}$:
$$I(b; r) = E\left[\log\det\!\left(I + \frac{P_{av}\, HH^H}{M_T N_0}\right)\right]$$
Rayleigh Fading MIMO
• CSIR achievable rate
$$I(b; r) = E\left[\log\det\!\left(I + \frac{P_{av}\, HH^H}{M_T N_0}\right)\right]$$
• Complex Gaussian distribution on H
• The square matrix W=HH*
– Wishart distribution
– Non-negative definite
– Distribution of eigenvalues
Ergodic / Fast Fading
• The channel coherence time is $T_c = 1$
• The channel known at the receiver
C  E{log det( I M R M R
Pav

HH  )}
M T N0
• The capacity achieving signal b must be
circularly symmetric complex Gaussian
(0, ( Pav / M T ) I M T M T )
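The ergodic capacity expression has no simple closed form in general, but it is straightforward to estimate by Monte Carlo; a sketch assuming i.i.d. CN(0,1) Rayleigh entries and base-2 logs, with the SNR, seed, and trial count illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

def ergodic_capacity(M_T, M_R, snr, n_trials=20_000):
    """Monte Carlo estimate of E[log2 det(I + (SNR/M_T) H H^H)] for
    i.i.d. CN(0,1) (Rayleigh) entries, CSIR, equal-power input covariance."""
    total = 0.0
    for _ in range(n_trials):
        H = (rng.normal(size=(M_R, M_T))
             + 1j * rng.normal(size=(M_R, M_T))) / np.sqrt(2)
        A = np.eye(M_R) + (snr / M_T) * (H @ H.conj().T)
        total += np.log2(np.linalg.det(A).real)
    return total / n_trials

snr = 10.0                      # 10 dB in linear units
c1 = ergodic_capacity(1, 1, snr)
c2 = ergodic_capacity(2, 2, snr)
# ergodic capacity grows roughly linearly with min(M_T, M_R)
```

Doubling both antenna counts nearly doubles the estimated capacity, illustrating the linear growth in $\min(M_T, M_R)$.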
Slow Fading MIMO
• A channel realization is valid for the duration of
the code (or transmission)
• There is a non-zero probability that the channel cannot sustain any rate
• Shannon capacity is zero
Slow Fading Channel
• If the coherence time Tc is the block length
$$I(r; b) = \log\det\!\left(I_{M_R \times M_R} + \frac{H_{M_R \times M_T}\, Q\, H^H_{M_T \times M_R}}{N_0}\right)$$
• The outage probability with CSIR only
$$\epsilon_{out}(R, P_{av}) = \inf_{Q} \Pr\!\left[\log\det\!\left(I_{M_R \times M_R} + \frac{HQH^H}{N_0}\right) < R\right]$$
with $Q = E[bb^H]$ and $E[b^H b] \le P_{av}$
Slow Fading
• Since
$$\Pr\!\left[\log\det\!\left(I_{M_R \times M_R} + \frac{HQH^H}{N_0}\right) < R\right] = \Pr\!\left[\log\det\!\left(I_{M_R \times M_R} + \frac{HUQU^H H^H}{N_0}\right) < R\right]$$
• Diagonal Q is optimum
• Conjecture: the optimum $Q$ is
$$Q_{opt} = \frac{P_{av}}{m}\, \mathrm{diag}(\underbrace{1, \ldots, 1}_{m}, 0, \ldots, 0)$$
for some number $m$ of active antennas
Example
• Slow fading SIMO, $M_T = 1$
• Then $Q_{opt} = P_{av}$ and
$$\Pr\!\left[\log\det\!\left(I_{M_R \times M_R} + \frac{HQH^H}{N_0}\right) < R\right] = \Pr\!\left[\log\!\left(1 + \frac{P_{av}\, H^H H}{N_0}\right) < R\right]$$
• The scalar $H^H H$ is $\chi^2$ distributed
$$\epsilon_{out}(R, P_{av}) = \int_0^{\frac{N_0 (e^R - 1)}{P_{av}}} \frac{u^{M_R - 1}}{\Gamma(M_R)}\, e^{-u}\, du$$
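The SIMO outage integral is a regularized incomplete gamma function, so it can be evaluated with `scipy.special.gammainc` and cross-checked by simulation; a sketch, assuming the rate is measured in bits (so $e^R$ becomes $2^R$) and illustrative parameter values:

```python
import numpy as np
from scipy.special import gammainc     # regularized lower incomplete gamma

rng = np.random.default_rng(4)

M_R, R = 2, 1.0                        # receive antennas; rate in bits/s/Hz
snr = 10.0                             # P_av / N_0 at 10 dB

# closed form: the outage integral equals gammainc(M_R, N0 (2^R - 1) / P_av)
eps_closed = gammainc(M_R, (2 ** R - 1) / snr)

# Monte Carlo over Rayleigh fading: H^H H is a sum of M_R unit exponentials
h2 = rng.exponential(1.0, (500_000, M_R)).sum(axis=1)
eps_mc = np.mean(np.log2(1 + snr * h2) < R)
```

The simulated outage probability agrees with the incomplete-gamma closed form, and adding receive antennas (larger $M_R$) drives it down sharply, reflecting the diversity gain.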
Example
• Slow fading MISO, $M_R = 1$
• The optimum
$$Q_{opt} = \frac{P_{av}}{m}\, I_{m \times m} \quad \text{for some } m \le M_T$$
• The outage
$$\Pr\!\left[\log\!\left(1 + \frac{P_{av}\, HH^H}{m N_0}\right) < R\right] = \int_0^{\frac{N_0 m (e^R - 1)}{P_{av}}} \frac{u^{m - 1}}{\Gamma(m)}\, e^{-u}\, du$$
Diversity and Multiplexing for MIMO
• The capacity increases with SNR as
$$C \approx k \log\!\left(1 + \frac{SNR}{k}\right)$$
• The multiplexing gain
$$r = \lim_{SNR \to \infty} \frac{C(SNR)}{\log(SNR)}$$
Diversity versus Multiplexing
• The error measure decreases with increasing SNR: $\epsilon \propto SNR^{-d}$
• The diversity gain
$$d = -\lim_{SNR \to \infty} \frac{\log(\epsilon_{out}(SNR))}{\log(SNR)}$$
• Tradeoff between diversity and multiplexing
– Simple in single link/antenna fading channels
Coding for Fading Channels
• Coding provides temporal diversity
$$FER \le g_c\, SNR^{-d} \quad\text{or}\quad P(C \to E) \le g_c\, SNR^{-d}$$
• Degrees of freedom
– Redundancy
– No increase in data rate
M versus D
[Figure: optimal diversity gain $d$ versus multiplexing gain $r$, a tradeoff curve running from $(0, M_R M_T)$ on the diversity axis to $(\min(M_R, M_T), 0)$ on the multiplexing axis]
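The two labelled points are the endpoints of the optimal tradeoff. For i.i.d. Rayleigh MIMO with sufficiently long block length, Zheng and Tse showed the full curve is piecewise linear through the points $(k, (M_T - k)(M_R - k))$; a sketch evaluating it by linear interpolation:

```python
import numpy as np

def dmt(M_T, M_R, r):
    """Optimal diversity-multiplexing tradeoff d*(r) for i.i.d. Rayleigh
    MIMO (Zheng-Tse): piecewise linear through (k, (M_T-k)(M_R-k))."""
    k = np.arange(min(M_T, M_R) + 1)
    return np.interp(r, k, (M_T - k) * (M_R - k))

# endpoints match the slide: d*(0) = M_R M_T and d*(min(M_R, M_T)) = 0
d_max = dmt(2, 2, 0.0)       # 4.0
r_max = dmt(2, 2, 2.0)       # 0.0
```

For a 2x2 system this gives full diversity 4 at zero multiplexing, diversity 1 at one degree of multiplexing, and zero diversity at the maximum multiplexing gain of 2.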