Compressed Sensing MRI
2016.12.15
Fully sampled
6X undersampled
6X undersampled with CS reconstruction
Lossy compression
Reducing data size at the cost of fidelity
Widely applied to music, images, and movies: MP3, JPEG, H.264 (MPEG)
Most useful data are highly compressible.
Current model of data flow
Acquisition → Compression → Application
(The "first lady of the Internet" loses 96% of her weight, i.e. file size, during compression.)
All bits of data are equal, but some bits are more equal than others.
Lossy compression
At the cost of fidelity??
Low resolution, no compression: 14.8 kB
Lossy compression, "high" fidelity: 2.2 kB
High resolution with "less" fidelity: 2.2 kB
Compressibility and Sparsity
Sparse: most numbers are zero or close to zero
[Figure: an image expanded as a weighted sum of 2D DCT basis images; the coefficients decay rapidly, e.g. 0.81, -0.56, -0.45, -0.38, 0.31, 0.12, ..., falling to essentially 0; grayscale from -128 to +127]
For example, under the 2D discrete cosine transform (DCT, used in JPEG since 1992)
Compressibility and Sparsity
Sparse: most numbers are zero or close to zero
[Figure: the same DCT expansion with the small coefficients set to zero]
Compression is achieved by discarding the small components of the discrete cosine transform.
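As a rough illustration of this idea (not from the slides), here is a minimal Python sketch that keeps only the largest 2D DCT coefficients of a smooth test image and reconstructs from them; the synthetic image and the 96% discard ratio are made-up values for illustration.

```python
# Minimal sketch of DCT-based lossy compression (illustrative only; real JPEG
# quantizes 8x8 blocks). The test image and the 96% discard ratio are made up.
import numpy as np
from scipy.fft import dctn, idctn

# Smooth synthetic "image": a Gaussian blob, which is highly compressible.
y, x = np.mgrid[0:64, 0:64]
image = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)

coeffs = dctn(image, norm="ortho")             # 2D discrete cosine transform
threshold = np.quantile(np.abs(coeffs), 0.96)  # keep only the largest 4%
compressed = np.where(np.abs(coeffs) >= threshold, coeffs, 0.0)
reconstruction = idctn(compressed, norm="ortho")

print("nonzero coefficients kept:", np.count_nonzero(compressed), "/", coeffs.size)
print("relative error:", np.linalg.norm(reconstruction - image) / np.linalg.norm(image))
```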
MR medical images are sparse
Sparse after wavelet transform
Sparse after finite difference transform
Sparse after Fourier transform in time
Noncompressible image
[Figure: white noise expanded in the same DCT basis; the coefficients do not decay, e.g. 0.62, -0.60, -0.59, -0.56, 0.54, 0.52, ...; grayscale from -128 to +127]
White noise is not sparse under any transform, including the DCT.
Compressed sensing (CS)

Current model of data flow
Acquisition → Compression → Application
(The "first lady of the Internet" loses 96% of her weight, i.e. file size, during compression.)
Compressed sensing data flow
Compressed sensing → Application
(The acquired data are already compressed.)
Compressed sensing (CS)
The current model gathers much more data than needed; most of it could be discarded safely.
The current model requires the acquisition device to be fast, cheap, and plentiful.
The MR machine is slow, costly, and scarce.
By exploiting image sparsity, CS MRI becomes possible.
Compressed sensing → Application (the acquired data are already compressed)
A crash course in MRI principles
Data acquisition progressively fills k-space; once k-space is fully acquired, the image is obtained by a DFT.
Full acquisition
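To make the k-space picture concrete, here is a small NumPy sketch (my own addition, not from the slides): a simple rectangular phantom stands in for the object, its fully sampled k-space is simulated with a 2D FFT, and the image is recovered with the inverse transform.

```python
# Sketch: fully sampled MRI as a discrete Fourier pair between image and k-space.
# The rectangular "phantom" is an arbitrary stand-in for a real object.
import numpy as np

phantom = np.zeros((128, 128))
phantom[40:90, 50:80] = 1.0                     # a simple rectangular object

kspace = np.fft.fftshift(np.fft.fft2(phantom))  # data acquisition fills k-space
image = np.fft.ifft2(np.fft.ifftshift(kspace))  # full acquisition: DFT recovers the image

print(np.allclose(image.real, phantom))         # True: lossless reconstruction
```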
Reconstruction with partial information
Recovery? Part of the k-space data is missing; the DFT alone is not enough.
Reconstruction with partial information
Recovery? DFT plus a priori knowledge.
Reconstruction with partial information
Recovery? DFT, then DWT.
The a priori knowledge of compressed sensing is the assumption that the data are sparse in some basis, such as a wavelet basis.
It seems very difficult….



Alice
Bob
Eve
The a priori knowledge of compressed sensing is the assumption that the data are sparse in some basis, such as a wavelet basis.
Localization makes things easier…



小王 (played by 大喬)
小柯 (played by 小喬)
小黃 (played by you, the audience)
The a priori knowledge of compressed sensing is the assumption that the data are sparse in some basis, such as a wavelet basis.
Compressed sensing: minimal example
「These days it is not easy to buy property overseas. 小柯, if you gave me half of your overseas mansions, I would have eleven.」
『小王, don't be so greedy. Between the two of us we own twelve more overseas mansions than my assistant.』
「Shh, 小黃 is eavesdropping. Let's stop talking. Goodbye.」
Compressed sensing: minimal example



「These days it is not easy to buy property overseas. 小柯, if you gave me half of your overseas mansions, I would have eleven.」
小王 + ½小柯 = 11
小王 + 小柯 − 助理 = 12
How many mansions does each of them have?
『小王, don't be so greedy. Between the two of us we own twelve more overseas mansions than my assistant.』
2 equations with 3 unknowns: many solutions exist.
「Shh, 小黃 is eavesdropping. Let's stop talking. Goodbye.」
Solution table (助理 = assistant):
小王:  10   9   8   7   6   5   4   3   2   1   0
小柯:   2   4   6   8  10  12  14  16  18  20  22
助理:   0   1   2   3   4   5   6   7   8   9  10
Compressed sensing: minimal example



「These days it is not easy to buy property overseas. 小柯, if you gave me half of your overseas mansions, I would have eleven.」
小王 + ½小柯 = 11
小王 + 小柯 − 助理 = 12
Overseas mansions are sparse
Sparse: most numbers are zero or close to zero
The sparsest solution: 小王 = 10, 小柯 = 2, 助理 = 0
『小王, don't be so greedy. Between the two of us we own twelve more overseas mansions than my assistant.』
「Shh, 小黃 is eavesdropping. Let's stop talking. Goodbye.」
[Same solution table as above, with the sparsest solution 小王 = 10, 小柯 = 2, 助理 = 0 highlighted]
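To check that minimizing the total number of mansions indeed picks out this sparsest solution, here is a small Python sketch (my own addition): it solves the two equations as a linear program, assuming all counts are nonnegative; scipy's linprog is my choice of solver, not something from the slides.

```python
# Sketch: find the minimum-sum solution of the two mansion equations with a
# linear program. Since all counts are nonnegative, the sum equals the l1 norm.
import numpy as np
from scipy.optimize import linprog

# Unknowns: [xiaowang, xiaoke, assistant], all >= 0.
A_eq = [[1.0, 0.5, 0.0],    # xiaowang + 0.5 * xiaoke             = 11
        [1.0, 1.0, -1.0]]   # xiaowang +       xiaoke - assistant = 12
b_eq = [11.0, 12.0]
c = [1.0, 1.0, 1.0]         # minimize the sum of the unknowns

result = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(result.x)             # approximately [10, 2, 0]: the sparsest solution
```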
ℓ0, ℓ1 and ℓ2 (pseudo)norms
ℓ0 pseudonorm: the number of nonzero components. This is the definition of sparsity.
ℓ1 norm: the sum of the absolute values of all components. The sparsest solution is, "incidentally", also minimal in ℓ1.
ℓ2 norm: the square root of the sum of squares.
The solution 小王 = 10, 小柯 = 2, 助理 = 0 is minimal in the ℓ0 and ℓ1 (pseudo)norms, but not in the ℓ2 norm.
[Same solution table as above]
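A quick numeric check of this claim (my own addition, assuming that the ℓ2-minimal feasible solution is the table entry 小王 = 9, 小柯 = 4, 助理 = 1):

```python
# Sketch: compare l0, l1, l2 (pseudo)norms of two feasible solutions.
# (10, 2, 0) is the sparsest solution; (9, 4, 1) has the smaller l2 norm.
import numpy as np

sparse_sol = np.array([10.0, 2.0, 0.0])
l2_sol = np.array([9.0, 4.0, 1.0])

for name, v in [("sparse", sparse_sol), ("l2-min", l2_sol)]:
    l0 = np.count_nonzero(v)            # number of nonzero components
    l1 = np.abs(v).sum()                # sum of absolute values
    l2 = np.sqrt((v ** 2).sum())        # root of sum of squares
    print(f"{name}: l0={l0}, l1={l1:.1f}, l2={l2:.2f}")
# sparse: l0=2, l1=12.0, l2=10.20
# l2-min: l0=3, l1=14.0, l2=9.90
```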
Incoherence
What if the scenario were:
「小柯, I know you have two overseas mansions.」
『But my assistant doesn't have a single one.』
We would never know how many 小王 has.
Incoherence: each sampled datum should involve the basis of the transform domain as evenly as possible.
Random sampling is incoherent relative to any basis, but is not always applicable.
In case you want to do this as homework…

Sparsity: few nonzero components in the transform domain
Incoherence: each sampled datum should involve the basis of the transform domain as evenly as possible
Incoherence and sparsity are the keys to successful compressed sensing
How much sampling is enough?
Signal size: n
Sampling number: m
Nyquist sampling: m = n
For example, n = 512 × 512 = 262,144, log n ≈ 5.4
How much sampling is enough?
Signal size: n
Sampling number: m
Sparsity: S nonzero components
Nyquist sampling: m = n
"Just enough" sampling: m = S (if we already knew which components are nonzero)
For example, n = 512 × 512 = 262,144, log n ≈ 5.4
How much sampling is enough?
Signal size: n
Sampling number: m
Sparsity: S nonzero components
Incoherence: u (u = 1 is maximally incoherent; usually u ≈ 2)
Nyquist sampling: m = n
Compressed sensing: m ~ u·S·log n (up to a constant)
"Just enough" sampling: m = S
For example, n = 512 × 512 = 262,144, log n ≈ 5.4
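As a rough, purely illustrative calculation (assuming, hypothetically, a sparsity of about 1%, S ≈ 2,600 out of n = 262,144 coefficients, incoherence u ≈ 2, and a constant near 1): m ~ u·S·log n ≈ 2 × 2,600 × 5.4 ≈ 28,000 samples, about 11% of n, i.e. roughly a 9× acceleration; the assumed sparsity level and constant are for illustration only.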
Compressed sensing, theorem 1
Randomly acquiring m samples, with m > C·u·S·log n, a (strictly) S-sparse signal is recovered with probability > 1 − δ.
Find k in Rn
Minimize || DWT(DFT(k)) ||1
Subject to ki = Ki, i = 1…m
This is a convex optimization problem; efficient algorithms exist.
Compressed sensing, theorem 1
[Diagram: k-space → DFT → image → DWT → sparse]
Find k in Rn
Minimize || DWT(DFT(k)) ||1
Subject to ki = Ki, i = 1…m
This is a convex optimization problem; efficient algorithms exist.
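As a concrete (though simplified) sketch of how such a program can be attacked in practice, here is an iterative soft-thresholding loop in Python; it is not the slides' exact algorithm, and for simplicity it uses the 2D DCT as the sparsifying transform in place of the DWT. The mask, regularization weight, and iteration count are arbitrary choices.

```python
# Sketch of CS reconstruction: alternate soft-thresholding in a sparsifying
# transform (here the 2D DCT stands in for the DWT) with enforcing consistency
# of the acquired k-space samples. A simplified stand-in for the convex program
# above, not the slides' exact algorithm.
import numpy as np
from scipy.fft import fft2, ifft2, dctn, idctn

def cs_reconstruct(acquired_kspace, mask, n_iter=200, lam=0.02):
    """acquired_kspace: k-space with zeros at unsampled positions;
       mask: boolean array, True where a sample was actually acquired."""
    image = ifft2(acquired_kspace).real            # zero-filled initial guess
    for _ in range(n_iter):
        coeffs = dctn(image, norm="ortho")         # sparsifying transform
        coeffs = np.sign(coeffs) * np.maximum(np.abs(coeffs) - lam, 0.0)  # soft threshold
        image = idctn(coeffs, norm="ortho")
        k = fft2(image)
        k[mask] = acquired_kspace[mask]            # enforce ki = Ki on acquired samples
        image = ifft2(k).real
    return image
```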
Compressed sensing, theorem 1
Randomly acquiring m samples, with m > C·u·S·log n, a (strictly) S-sparse signal is recovered with probability > 1 − δ.
What if the signal is only approximately S-sparse, and noisy?
Compressed sensing, theorem 2
For a signal that is only approximately S-sparse and measured with noise, the recovery error is bounded by the error of the best S-sparse approximation plus a term proportional to the noise level.
Find k in Rn
Minimize || DWT(DFT(k)) ||1
Subject to ki ≈ Ki (within the noise level), i = 1…m
This is a convex optimization problem; efficient algorithms exist.
Point spread function (PSF) in one dimension
[Figure: regular vs. random sampling in k-space, with thresholding in imaging space; regular sampling leads to ambiguity]
"The noise produced by simulating the subsampling" is the point spread function.
The more evenly the noise is spread out, the better.
Point spread function in two dimensions
"The noise produced by simulating the subsampling" is the point spread function.
The more evenly the noise is spread out, the better: incoherence.
Incoherent sampling: PSF in two dimensions
"The noise produced by simulating the subsampling" is the point spread function.
The more evenly the noise is spread out, the better: incoherence.
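A small NumPy sketch (my own addition) of the one-dimensional point spread functions of a regular and a random undersampling mask; the signal size and number of samples are arbitrary.

```python
# Sketch: point spread function (PSF) of an undersampling pattern, computed as
# the inverse DFT of the k-space sampling mask. Regular undersampling gives a
# few coherent replica peaks (ambiguity); random undersampling spreads the
# leftover energy into low, noise-like sidelobes (incoherence).
import numpy as np

n, m = 256, 64                                    # signal size, samples acquired
regular = np.zeros(n)
regular[:: n // m] = 1.0                          # every 4th k-space sample

rng = np.random.default_rng(0)
random_mask = np.zeros(n)
random_mask[rng.choice(n, size=m, replace=False)] = 1.0

psf_regular = np.abs(np.fft.ifft(regular))
psf_random = np.abs(np.fft.ifft(random_mask))
print("regular PSF: strong peaks at", np.flatnonzero(psf_regular > 0.5 * psf_regular.max()).size, "locations")
print("random  PSF: strong peaks at", np.flatnonzero(psf_random > 0.5 * psf_random.max()).size, "locations")
```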
Summary of compressed sensing MRI




MRI images are sparse
Nonrandom incoherent k-space trajectories
Compressed sensing can achieve similar image quality using sub-Nyquist sampling
5X to 10X speed up

Advantages…
Summary of compressed sensing MRI
MRI images are sparse
Nonrandom incoherent k-space trajectories
Compressed sensing can achieve similar image quality using sub-Nyquist sampling
The 5X to 10X speed up enables:
Ultrafast screening for stroke
More NEX
Larger FOV
Makes time-consuming scans feasible
One breath-hold body images
No motion artifact
No anesthesia for babies
Higher resolution
No more overtime work
What’s next? Beyond sparsity



Sparsity is a rudimentary form of a priori knowledge
Can we expand the a priori knowledge with machine learning?
Beyond sparsity
Thanks for your attention
The following are backup slides
It’s Showtime!
Original image
6X subsampling with CS
Original | 6X subsampling
It’s Showtime!
Original image
6X subsampling with CS
Original | 6X subsampling
It’s Showtime!
Free breathing
whole liver perfusion
One breath-hold
whole heart perfusion
It’s Showtime!
Submillimeter
resolution T1WI +C
in an 8-year-old boy
Radiology. 2010 August; 256(2): 607–616
It’s Showtime!
Breath-hold CE MRA
Radiology. 2010 August; 256(2): 607–616
It’s Showtime!
3-year-old boy with left hepatic lobe transplant, SSFP MRCP
Radiology. 2010 August; 256(2): 607–616
Thanks for your attention
Nyquist rate: comprehensive sampling
Sampling frequency > 2 × the maximal frequency in the signal
Guarantees perfect signal reconstruction
Sub-Nyquist sampling
Aliasing: high-frequency signals are mistaken for low-frequency signals
Moiré pattern; wagon-wheel effect; phase wrapping.
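A tiny NumPy check of the aliasing statement (my own addition): a 9 Hz sine sampled at only 10 Hz produces exactly the same samples as a 1 Hz sine with inverted sign, so the high frequency is mistaken for a low one.

```python
# Sketch: sub-Nyquist sampling causes aliasing. Sampling a 9 Hz sine at 10 Hz
# (below the Nyquist rate of 18 Hz) yields samples identical to a 1 Hz sine
# with inverted sign: the high frequency masquerades as a low one.
import numpy as np

fs = 10.0                                # sampling frequency in Hz
t = np.arange(0, 2, 1 / fs)              # 2 seconds of samples
high_freq = np.sin(2 * np.pi * 9 * t)    # 9 Hz signal
aliased = -np.sin(2 * np.pi * 1 * t)     # 1 Hz signal with inverted sign

print(np.allclose(high_freq, aliased))   # True: the samples are indistinguishable
```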
Compressed sensing MRI (finally!)
Limitations in the maximum slew rate and gradient field strength
Compressed sensing MRI
The k-space is traversed by various trajectories and sampled at the Nyquist rate.
The coverage determines the resolution; the density determines the FOV.
Violating the sampling requirement results in imaging artifacts.
Compressed sensing MRI
[Figure: high-quality image; loss of resolution; small-FOV wrapping; random noise]
To really achieve incoherence in MRI
Random sampling is incoherent to any transform.
In MRI, truly random sampling is not practical:
maximum slew rate (Slew_max) and maximum gradient (G_max) limits;
the k-space trajectory must follow smooth lines.
Devise nonrandom, incoherent sampling schemes with smooth k-space trajectories.
Incoherent sampling in time
Exploit sparsity after Fourier transform in time
An easier(?) diagram
[Phase diagram: axes are (non-zero components / sampling number) and (sampling number / signal size); the shading is the probability of accurate reconstruction; the "just enough" sampling curve and the fully-sampled-at-Nyquist-rate point are marked]
The "just enough" sampling curve is a hyperbola; here the sparsity ratio is S/n = 0.2.
A more elaborate example: sub-Nyquist sampling
Random sampling is maximally incoherent: after the Fourier transform, the missing samples show up as "random" noise (imaging space vs. k-space).
Sampling at a regular interval is not incoherent to the Fourier basis: ambiguity! (aliasing)
A more elaborate example: sub-Nyquist sampling
With random sampling, thresholding separates the true components from "the noise produced by simulating the subsampling"; with regular-interval sampling, the ambiguity remains and there is no way out.