Closed-Form MSE Performance of the Distributed LMS Algorithm
Gonzalo Mateos, Ioannis Schizas and Georgios B. Giannakis
ECE Department, University of Minnesota
Acknowledgment: ARL/CTA grant no. DAAD19-01-2-0011; USDoD ARO grant no. W911NF-05-1-0283
Motivation
- Estimation using ad hoc WSNs raises exciting challenges
  - Communication constraints -> single-hop communications
  - Limited power budget
  - Lack of hierarchy / decentralized processing -> consensus
- Unique features
  - Environment is constantly changing (e.g., WSN topology)
  - Lack of statistical information at the sensor level
- Bottom line: algorithms are required to be
  - Resource efficient
  - Simple and flexible
  - Adaptive and robust to changes
Prior Works
- Single-shot distributed estimation algorithms
  - Consensus averaging [Xiao-Boyd '05, Tsitsiklis-Bertsekas '86, '97]
  - Incremental strategies [Rabbat-Nowak et al. '05]
  - Deterministic and random parameter estimation [Schizas et al. '06]
- Consensus-based Kalman tracking using ad hoc WSNs
  - MSE-optimal filtering and smoothing [Schizas et al. '07]
  - Suboptimal approaches [Olfati-Saber '05], [Spanos et al. '05]
- Distributed adaptive estimation and filtering
  - LMS and RLS learning rules [Lopes-Sayed '06, '07]
Problem Statement
- Ad hoc WSN with multiple sensors
  - Single-hop communications only; each sensor exchanges data with its neighborhood
  - Connectivity information captured in the network topology
  - Zero-mean additive (e.g., Rx) communication noise
- Goal: estimate a common signal vector
- Each sensor, at every time instant
  - Acquires a regressor and a scalar observation
  - Both zero-mean and spatially uncorrelated
- Least-mean squares (LMS) estimation problem of interest (sketched below)
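For concreteness, here is a hedged sketch of the standard LMS data model and cost this setup leads to; the symbols ($\mathbf{s}_0$ for the unknown vector, $\mathbf{h}_j(t)$ and $x_j(t)$ for the regressor and observation of sensor $j$ at time $t$, $\epsilon_j(t)$ for the noise) are assumed notation, not copied from the slides.

```latex
% Assumed notation: J sensors, unknown p x 1 vector s_0.
x_j(t) = \mathbf{h}_j^{\top}(t)\,\mathbf{s}_0 + \epsilon_j(t), \qquad
\hat{\mathbf{s}} = \arg\min_{\mathbf{s}} \sum_{j=1}^{J}
\mathbb{E}\!\left[ \bigl( x_j(t) - \mathbf{h}_j^{\top}(t)\,\mathbf{s} \bigr)^{2} \right].
```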
Power Spectrum Estimation
- Find spectral peaks of a narrowband (e.g., seismic) source
  - Source modeled as an AR process
- Source-sensor multipath channels modeled as FIR filters
  - Unknown orders and tap coefficients
- Each sensor observes the AR source filtered by its local channel, plus noise (a simulation sketch of this setup follows)
- Challenges
  - Data model not completely known
  - Channel fades at the frequencies occupied by the source
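A minimal Python sketch of this scenario, assuming an illustrative AR(2) source and randomly drawn FIR channels (the orders, coefficients, and noise levels are made up for illustration, not the values used in the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

# Narrowband AR(2) source with a spectral peak near 0.3*pi (illustrative).
a = np.array([1.0, -1.8 * np.cos(0.3 * np.pi), 0.81])  # AR polynomial A(z)
T, J = 5000, 6                                          # samples, sensors

# Generate the AR source: A(z) s(t) = w(t).
w = rng.standard_normal(T)
s = np.zeros(T)
for t in range(T):
    s[t] = w[t] - a[1] * (s[t - 1] if t >= 1 else 0.0) - a[2] * (s[t - 2] if t >= 2 else 0.0)

# Each sensor sees the source through an unknown FIR multipath channel + noise.
x = np.zeros((J, T))
for j in range(J):
    L = rng.integers(2, 5)                  # unknown channel order
    c = rng.standard_normal(L)              # unknown tap coefficients
    x[j] = np.convolve(s, c)[:T] + 0.1 * rng.standard_normal(T)

# The distributed task: estimate the AR spectrum of s from {x[j]} without
# collecting all data at a fusion center.
print(x.shape)  # (6, 5000)
```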
A Useful Reformulation
- Introduce a bridge sensor subset such that:
  1) Every sensor has at least one bridge sensor in its neighborhood
  2) Any two bridge sensors are connected by a path devoid of edges linking two non-bridge sensors
- Consider the convex, constrained optimization in which each sensor keeps a local estimate tied to the bridge variables of its neighboring bridge sensors (sketched below)
- Proposition [Schizas et al. '06]: For a bridge subset satisfying 1)-2), if the WSN is connected, the constrained problem is equivalent to the original LMS problem
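A hedged sketch of the resulting constrained problem, written with assumed notation (local estimates $\mathbf{s}_j$, bridge variables $\bar{\mathbf{s}}_b$, bridge set $\mathcal{B}$, neighborhoods $\mathcal{N}_j$); it follows the bridge-sensor construction of [Schizas et al. '06] rather than reproducing the slide's equation.

```latex
\min_{\{\mathbf{s}_j\},\,\{\bar{\mathbf{s}}_b\}} \;
\sum_{j=1}^{J} \mathbb{E}\!\left[ \bigl( x_j(t) - \mathbf{h}_j^{\top}(t)\,\mathbf{s}_j \bigr)^{2} \right]
\quad \text{s.t.} \quad
\mathbf{s}_j = \bar{\mathbf{s}}_b \;\; \forall\, j, \;\; \forall\, b \in \mathcal{B} \cap \mathcal{N}_j .
```

Under conditions 1)-2) and network connectivity, the equality constraints force all local estimates to coincide with the centralized LMS solution.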
Algorithm Construction
- Form the associated augmented Lagrangian of the constrained problem (sketched below)
- Two key steps in deriving D-LMS:
  1) Resort to the alternating-direction method of multipliers -> gain the desired degree of parallelization
  2) Apply stochastic approximation ideas -> cope with the unavailability of statistical information
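A hedged sketch of the associated augmented Lagrangian, with assumed multipliers $\mathbf{v}_j^b$ and penalty coefficient $c > 0$ (again illustrative notation, not the slide's expression):

```latex
\mathcal{L}_a\!\left( \{\mathbf{s}_j\}, \{\bar{\mathbf{s}}_b\}, \{\mathbf{v}_j^b\} \right)
= \sum_{j=1}^{J} \mathbb{E}\!\left[ \bigl( x_j(t) - \mathbf{h}_j^{\top}(t)\,\mathbf{s}_j \bigr)^{2} \right]
+ \sum_{j=1}^{J} \sum_{b \in \mathcal{B} \cap \mathcal{N}_j}
\left[ (\mathbf{v}_j^b)^{\top} (\mathbf{s}_j - \bar{\mathbf{s}}_b)
+ \frac{c}{2} \left\lVert \mathbf{s}_j - \bar{\mathbf{s}}_b \right\rVert^{2} \right].
```

Cycling through minimizations over $\{\mathbf{s}_j\}$ and $\{\bar{\mathbf{s}}_b\}$ plus multiplier ascent (the alternating-direction pattern), and replacing expectations by instantaneous approximations, yields LMS-type per-sensor recursions.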
D-LMS Recursions and Operation
- In the presence of communication noise, each D-LMS iteration comprises three steps
  - Steps 1-2 run at every sensor: receive (Rx) the bridge variables from neighboring bridge sensors, update the local multipliers and estimate, and transmit (Tx) the updated estimate back
  - Step 3 runs at every bridge sensor: receive (Rx) the local estimates from neighboring sensors, update the bridge variable, and transmit (Tx) it back
- Simple, distributed, only single-hop exchanges needed (a generic sketch of this pattern follows)
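The exact D-LMS recursions appear as equations on the original slide and are not reproduced here; the Python sketch below only illustrates the generic three-step, single-hop pattern described above, with made-up update rules, step-size mu, and penalty c.

```python
import numpy as np

def dlms_style_iteration(s, v, s_bar, h, x, neighbors_bridge, bridge_neighbors,
                         mu=0.01, c=1.0):
    """One illustrative consensus-LMS iteration (not the authors' exact recursions).

    s:       dict sensor -> local estimate (p,)
    v:       dict sensor -> local sum of multipliers (p,)
    s_bar:   dict bridge sensor -> bridge variable (p,)
    h, x:    dict sensor -> current regressor (p,) and scalar observation
    neighbors_bridge: dict sensor -> list of neighboring bridge sensors
    bridge_neighbors: dict bridge sensor -> list of neighboring sensors
    """
    # Steps 1-2 (every sensor): multiplier and local-estimate updates using the
    # bridge variables received (Rx) from neighboring bridge sensors.
    for j in s:
        v[j] = v[j] + c * sum(s[j] - s_bar[b] for b in neighbors_bridge[j])
        err = x[j] - h[j] @ s[j]                       # instantaneous LMS error
        grad = (-2.0 * err * h[j] + v[j]
                + c * sum(s[j] - s_bar[b] for b in neighbors_bridge[j]))
        s[j] = s[j] - mu * grad                        # stochastic-gradient step
    # Step 3 (every bridge sensor): update the bridge variable from the local
    # estimates received from its neighbors, then broadcast (Tx) it back.
    for b in s_bar:
        nb = bridge_neighbors[b]
        s_bar[b] = sum(s[j] + v[j] / c for j in nb) / len(nb)
    return s, v, s_bar
```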
Error-form D-LMS
- Study the dynamics of two per-sensor quantities:
  - Local estimation errors
  - Local sums of multipliers
- (a1) Sensor observations obey a linear data model, where the observation noise is zero-mean and white with known variance
- Lemma: Under (a1), the stacked local errors obey a recursion whose driving matrix consists of per-sensor blocks
Performance Metrics
- Local (per-sensor) and global (network-wide) metrics of interest
- (a2) Regressors are white Gaussian with known covariance matrices
- (a3) Regressors and noise processes are independent
- Customary figures of merit: mean-square deviation (MSD) and excess mean-square error (EMSE), each defined both locally (per sensor) and globally (network-wide); see the sketch below
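For reference, a hedged sketch of the customary definitions, with the local a priori error $\tilde{\mathbf{s}}_j(t) := \mathbf{s}_0(t) - \mathbf{s}_j(t)$ as assumed notation and the global metrics taken here as network averages:

```latex
\text{MSD}_j(t) = \mathbb{E}\!\left[ \lVert \tilde{\mathbf{s}}_j(t) \rVert^{2} \right], \qquad
\text{EMSE}_j(t) = \mathbb{E}\!\left[ \bigl( \mathbf{h}_j^{\top}(t)\,\tilde{\mathbf{s}}_j(t-1) \bigr)^{2} \right],
```
```latex
\text{MSD}(t) = \frac{1}{J} \sum_{j=1}^{J} \text{MSD}_j(t), \qquad
\text{EMSE}(t) = \frac{1}{J} \sum_{j=1}^{J} \text{EMSE}_j(t).
```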
Tracking Performance
- (a4) Random-walk model: the true parameter vector is driven by zero-mean white noise with known covariance, independent of the regressors and observation noise (sketched below)
- Introduce a convenient change of variables for the error dynamics
- Proposition: Under (a2)-(a4), the global error vector has zero mean, and its covariance matrix obeys an exact recursion; equivalently, after vectorization, it satisfies a linear recursion
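A hedged sketch of the random-walk model in (a4), with $\boldsymbol{\zeta}(t)$ and $\mathbf{R}_\zeta$ as assumed notation for the driving noise and its covariance:

```latex
\mathbf{s}_0(t+1) = \mathbf{s}_0(t) + \boldsymbol{\zeta}(t), \qquad
\mathbb{E}\!\left[ \boldsymbol{\zeta}(t) \right] = \mathbf{0}, \qquad
\mathbb{E}\!\left[ \boldsymbol{\zeta}(t)\, \boldsymbol{\zeta}^{\top}(\tau) \right] = \mathbf{R}_{\zeta}\, \delta_{t\tau},
```

with $\boldsymbol{\zeta}(t)$ independent of the regressors and the observation noise.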
Stability and S.S. Performance
- Proposition: Under (a1)-(a4), the D-LMS algorithm achieves consensus in the mean, provided the step-size is chosen below an explicit bound
- MSE stability
  - Intractable to obtain explicit step-size bounds
  - Proposition: Under (a1)-(a4), the D-LMS algorithm is MSE stable for sufficiently small step-sizes
- From stability, the error covariance recursion has bounded entries
- The fixed point of the recursion gives the steady-state (s.s.) error covariance
  - Enables evaluation of all figures of merit in s.s. (see the numerical sketch below)
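To make the fixed-point statement concrete: suppose the vectorized error covariance obeys a linear recursion sigma(t+1) = Phi sigma(t) + b (Phi and b are placeholders here, not the paper's exact matrices). Then mean-square stability amounts to the spectral radius of Phi being strictly below one, and the steady-state covariance is the recursion's fixed point, as in this sketch:

```python
import numpy as np

def steady_state_covariance(Phi, b):
    """Fixed point of sigma(t+1) = Phi @ sigma(t) + b (illustrative placeholders).

    Returns the vectorized steady-state covariance if the recursion is stable,
    i.e., if the spectral radius of Phi is strictly less than one.
    """
    rho = np.max(np.abs(np.linalg.eigvals(Phi)))      # spectral radius of Phi
    if rho >= 1.0:
        raise ValueError(f"Unstable recursion: spectral radius {rho:.3f} >= 1")
    # Fixed point: sigma_ss = Phi @ sigma_ss + b  =>  (I - Phi) sigma_ss = b.
    return np.linalg.solve(np.eye(Phi.shape[0]) - Phi, b)

# Toy usage with a randomly generated stable Phi (for illustration only).
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
Phi = 0.5 * A / np.max(np.abs(np.linalg.eigvals(A)))  # rescale so rho(Phi) = 0.5
sigma_ss = steady_state_covariance(Phi, rng.standard_normal(4))
print(sigma_ss)
```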
Step-size Optimization
- When tracking a time-varying parameter, there exists an optimum step-size minimizing the s.s. EMSE
- Not surprising:
  - Excessive adaptation -> MSE inflation
  - Vanishing step-size -> tracking ability lost
- Starting from the closed-form s.s. EMSE, the optimum step-size is hard to obtain in closed form, but easy to find numerically (a 1-D search; sketched below)
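A minimal sketch of such a 1-D search, using an empirical s.s. EMSE from a scalar, single-sensor LMS tracking a random walk as a stand-in for the paper's closed-form expression (all constants are illustrative):

```python
import numpy as np

def ss_emse(mu, T=20000, burn_in=5000, sigma_eps=0.1, sigma_zeta=0.01, seed=0):
    """Empirical steady-state EMSE of a scalar LMS tracking a random walk."""
    rng = np.random.default_rng(seed)
    s0, s_hat, acc, n = 0.0, 0.0, 0.0, 0
    for t in range(T):
        h = rng.standard_normal()                     # regressor
        x = h * s0 + sigma_eps * rng.standard_normal()
        a_priori_err = h * (s0 - s_hat)               # a priori estimation error
        s_hat += mu * h * (x - h * s_hat)             # LMS update
        s0 += sigma_zeta * rng.standard_normal()      # random-walk drift
        if t >= burn_in:
            acc += a_priori_err ** 2
            n += 1
    return acc / n

# 1-D grid search for the EMSE-minimizing step-size.
grid = np.logspace(-3, -0.5, 20)
emse = [ss_emse(mu) for mu in grid]
mu_star = grid[int(np.argmin(emse))]
print(f"approximate optimum step-size: {mu_star:.4f}")
```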
Simulated Tests
- Setup: multi-node WSN with Rx AWGN, i.i.d. regressors, and observations from a linear data model with WGN; D-LMS run with a fixed step-size
- [Figures: MSE performance curves for a time-invariant parameter and for the random-walk model]
Concluding Summary
- Developed a distributed LMS algorithm for general ad hoc WSNs
- Detailed MSE performance analysis for D-LMS
  - Stationary setup, time-invariant parameter
  - Tracking a random walk
- Analysis under the simplifying white Gaussian setting
  - Closed-form, exact recursion for the global error covariance matrix
  - Local and network-wide figures of merit, in the transient and in s.s.
  - Tracking analysis revealed an optimum step-size minimizing the s.s. EMSE
- Simulations validate the theoretical findings
- Results extend to temporally correlated (non-)Gaussian sensor data