2. Outline
• Motivation
• Preliminaries on Compressive Sensing
• Sequential Probability Ratio Test: Recap
• Work Discussion
3. Compressive Sensing
[Figure: the compressive measurement model]
$$y = \Phi s$$
• $s$: $N \times 1$ sparse signal with $K$ ($= 3$ in the figure) non-zero elements; sparsity of order $K$
• $\Phi$: $M \times N$ random matrix with i.i.d. Gaussian entries
• $y$: $M \times 1$ vector of measurements (dimensionality reduction), with $K < M \ll N$
Scanning the support of the signal has exponential complexity!
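A minimal Python sketch of this measurement model; the dimensions, seed, and the $1/\sqrt{M}$ scaling are illustrative assumptions, not values from the slides:

```python
import numpy as np

# Compressive measurement model y = Phi @ s (illustrative dimensions).
rng = np.random.default_rng(0)
N, M, K = 100, 20, 3

# K-sparse signal: random support, Gaussian amplitudes.
s = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
s[support] = rng.standard_normal(K)

# Random Gaussian measurement matrix; the 1/sqrt(M) scaling is a common
# convention so that Phi approximately preserves norms.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# M compressed measurements instead of N samples.
y = Phi @ s
print(y.shape)  # (20,)
```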
4. How to recover the sparse signal?
The measurements define an affine subspace $\{s' : y = \Phi s'\}$ of the $N$-dimensional space $\mathbb{R}^N$ containing the true signal $s$; the measurement vector $y$ lives in $\mathbb{R}^M$.
Restricted Isometry Property:
$$(1 - \delta)\|s\|_2^2 \;\le\; \|\Phi s\|_2^2 \;\le\; (1 + \delta)\|s\|_2^2$$
A Gaussian-distributed $\Phi$ works! Required number of measurements:
$$M \;\ge\; 2K \log\left(\frac{N}{M}\right)$$
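A rough numeric check of this rule; since $M$ appears on both sides, the sketch below simply grows $M$ until the inequality holds (an illustration of the scaling, not a procedure from the slides):

```python
import numpy as np

# Smallest M satisfying M >= 2*K*log(N/M), found by direct search.
def required_measurements(N, K):
    M = K + 1
    while M < 2 * K * np.log(N / M):
        M += 1
    return M

print(required_measurements(N=100, K=3))  # -> 13 measurements suffice here
```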
5. How to recover the sparse signal? (Cont’d)
Question: how to estimate the signal?
$\ell_2$-norm recovery returns the point of the affine subspace $\{s' : y = \Phi s'\} \subset \mathbb{R}^N$ closest to the origin:
$$\hat{s} = \arg\min_{s' :\, y = \Phi s'} \|s'\|_2, \qquad \text{where } \|s\|_2^2 = |s_1|^2 + |s_2|^2 + \cdots + |s_N|^2$$
Not accurate! The minimum-$\ell_2$-norm solution spreads energy across all coordinates, so it is almost never sparse.
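A one-line check of why this fails, reusing `Phi` and `y` from the first sketch; `np.linalg.pinv` returns the minimum-$\ell_2$-norm solution of the underdetermined system:

```python
import numpy as np

# Minimum-l2-norm solution: the pseudoinverse spreads energy over all
# N coordinates, so the estimate is dense rather than K-sparse.
s_hat_l2 = np.linalg.pinv(Phi) @ y
print(np.count_nonzero(np.abs(s_hat_l2) > 1e-6))  # close to N, not K
```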
6. How to recover the sparse signal? (Cont’d)
Question: how to estimate the signal?
$\ell_1$-norm recovery:
$$\hat{s} = \arg\min_{s' :\, y = \Phi s'} \|s'\|_1, \qquad \text{where } \|s\|_1 = |s_1| + |s_2| + \cdots + |s_N|$$
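A sketch of $\ell_1$ recovery (basis pursuit) with cvxpy, reusing `N`, `Phi`, and `y` from the first sketch; the solver choice and the exact-equality constraint are assumptions matching the noiseless model above:

```python
import numpy as np
import cvxpy as cp

# Basis pursuit: minimize ||s'||_1 subject to y = Phi s'.
s_var = cp.Variable(N)
problem = cp.Problem(cp.Minimize(cp.norm1(s_var)), [Phi @ s_var == y])
problem.solve()
print(np.count_nonzero(np.abs(s_var.value) > 1e-6))  # ~K: sparse estimate
```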
7. Stopping Problem: Recap
• Stopping problems are a simple but important class of learning problems.
• In this problem class, information arrives over time, and at each step we must choose whether to keep observing or to stop and make a decision.
[Figure: an event detector monitors an input signal at $t = 0, 1, 2, 3, \ldots, 10$; a sudden change appears in the observations, and the output signal marks it. Ideal case: stop observing and detect the change accurately!]
8. Solution of the problem
[Figure: the Bayes risk as a function of the prior $\rho_0 \in [0, 1]$ is a concave function, peaking near $\rho_0 = 0.5$. Two thresholds partition the interval: for $\rho_0 \le \rho_L$, stop and decide $H_1$; for $\rho_0 \ge \rho_U$, stop and decide $H_0$; for $\rho_L < \rho_0 < \rho_U$, continue updating the prior.]
9. Solution of the problem (Cont’d)
• Now we are interested in obtaining $\rho_L$ and $\rho_U$.
• The updated prior is
$$\rho_0^{n+1} = \frac{L_n(S_n)}{L_n(S_n) + \rho_0 / (1 - \rho_0)}$$
• The likelihood ratio is defined as
$$L_n(S_n) = \prod_{k=1}^{n} \frac{P_1(W_k)}{P_0(W_k)}$$
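A short sketch of this recursion; the Gaussian likelihoods, their means, and the sample values are illustrative assumptions:

```python
from scipy.stats import norm

# Recursive prior update: L_n is a running product of per-sample
# likelihood ratios P1(w)/P0(w); the prior update follows the slide.
rho0 = 0.5                       # initial prior
L_n = 1.0                        # running likelihood ratio
for w in [0.3, 1.1, 0.9]:        # hypothetical observations W_k
    L_n *= norm.pdf(w, loc=1.0) / norm.pdf(w, loc=0.0)
    rho = L_n / (L_n + rho0 / (1.0 - rho0))
    print(f"L_n = {L_n:.3f}, updated prior = {rho:.3f}")
```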
10. Solution of the problem (Cont’d)
• Determining whether
$$\rho_0^{n+1}(S_n) \le \rho_L \quad \text{or} \quad \rho_0^{n+1}(S_n) \ge \rho_U$$
is the same as testing
$$L_n(S_n) \le A \quad \text{or} \quad L_n(S_n) \ge B$$
• Why? Because $\rho_0^{n+1} = \frac{L_n(S_n)}{L_n(S_n) + \rho_0/(1-\rho_0)}$ is a quasi-linear, monotonically increasing function of $L_n(S_n)$ for $L_n(S_n) \ge 0$, so each threshold on the prior maps to a threshold on the likelihood ratio.
11. Solution of the problem (Cont’d)
• The new bounds $A$ and $B$ are obtained as
$$A = \frac{\rho_0^n\, \rho_L}{(1 - \rho_0^n)(1 - \rho_L)}, \qquad B = \frac{\rho_0^n\, \rho_U}{(1 - \rho_0^n)(1 - \rho_U)}$$
• Now the decision rule is
$$L_n(S_n) \;\begin{cases} \ge B & \text{stop and choose } Y_n = 1 \\ \le A & \text{stop and choose } Y_n = 0 \\ \text{otherwise} & \text{continue observing} \end{cases}$$
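The rule maps directly to a three-way branch; a minimal sketch (the threshold values are hypothetical):

```python
# Three-way SPRT decision rule from the slide.
def sprt_decision(L_n, A, B):
    if L_n >= B:
        return "stop and choose Y = 1"   # decide H1
    if L_n <= A:
        return "stop and choose Y = 0"   # decide H0
    return "continue observing"

print(sprt_decision(2.5, A=0.11, B=9.0))  # -> continue observing
```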
12. Solution of the problem (Cont’d)
• Since obtaining $A$ and $B$ exactly is difficult, Wald's approximation gives
$$P_F^{\pi} \approx \frac{1 - A}{B - A}, \qquad P_M^{\pi} \approx \frac{A(B - 1)}{B - A}$$
• Then, for acceptable $P_F^{\pi}$ and $P_M^{\pi}$,
$$A = \frac{P_M^{\pi}}{1 - P_F^{\pi}}, \qquad B = \frac{1 - P_M^{\pi}}{P_F^{\pi}}$$
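A minimal sketch of picking the thresholds from target error probabilities (the 0.1 values are hypothetical design choices):

```python
# Wald's approximate thresholds from target false-alarm and miss rates.
def wald_thresholds(P_F, P_M):
    A = P_M / (1.0 - P_F)
    B = (1.0 - P_M) / P_F
    return A, B

A, B = wald_thresholds(P_F=0.1, P_M=0.1)
print(A, B)  # 0.111..., 9.0
```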
14. Hypothesis Testing
Signal model: random vs. deterministic (today's talk: deterministic).
$$H_1:\; y_i = \Phi(s + n_i) \quad \text{(signal + noise)}$$
$$H_0:\; y_i = \Phi n_i \quad \text{(noise only)}$$
15. Hypothesis Testing
Likelihood functions:
$$f(y_i \mid H_0) = \frac{\exp\!\left(-\tfrac{1}{2}\, y_i^T (\sigma^2 \Phi \Phi^T)^{-1} y_i\right)}{|\sigma^2 \Phi \Phi^T|^{1/2} (2\pi)^{M/2}}$$
$$f(y_i \mid H_1) = \frac{\exp\!\left(-\tfrac{1}{2}\, (y_i - \Phi s)^T (\sigma^2 \Phi \Phi^T)^{-1} (y_i - \Phi s)\right)}{|\sigma^2 \Phi \Phi^T|^{1/2} (2\pi)^{M/2}}$$
where $\sigma^2$ is the noise variance.
16. Sequential Probability Ratio Test
• We need to compute the likelihood ratio
$$L_n(S_n) = \prod_{i=1}^{n} \frac{f_1(y_i \mid H_1)}{f_0(y_i \mid H_0)}$$
• Remember the decision rule:
$$L_n(S_n) \;\begin{cases} \ge B & \text{stop and choose } Y_n = 1 \\ \le A & \text{stop and choose } Y_n = 0 \\ \text{otherwise} & \text{continue observing} \end{cases}$$
17. Sequential Probability Ratio Test
• After algebraic simplifications, the test reduces to thresholding a decision statistic:
$$\Lambda(y) \ge \gamma_1 \;\Rightarrow\; \text{conclude } H_1$$
$$\Lambda(y) \le \gamma_2 \;\Rightarrow\; \text{conclude } H_0$$
$$\gamma_2 < \Lambda(y) < \gamma_1 \;\Rightarrow\; \text{continue observing}$$
• where
$$\Lambda(y) = \sum_{i=1}^{n} y_i^T (\Phi \Phi^T)^{-1} \Phi s$$
$$\gamma_1 = \sigma^2 \log(B) + \frac{n}{2} (\Phi s)^T (\Phi \Phi^T)^{-1} \Phi s, \qquad \gamma_2 = \sigma^2 \log(A) + \frac{n}{2} (\Phi s)^T (\Phi \Phi^T)^{-1} \Phi s$$
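A sketch of computing the statistic and thresholds, reusing `rng`, `Phi`, `s`, and `N` from the first sketch and `A`, `B` from the Wald sketch; `sigma`, `n`, and the simulated $H_1$ samples are illustrative assumptions:

```python
import numpy as np

sigma, n = 1.0, 5
w = np.linalg.solve(Phi @ Phi.T, Phi @ s)   # (Phi Phi^T)^{-1} Phi s
# n observations under H1: y_i = Phi(s + n_i), noise in the signal domain.
Y = (Phi @ s)[:, None] + sigma * Phi @ rng.standard_normal((N, n))
Lambda = float(np.sum(Y.T @ w))             # sum_i y_i^T (Phi Phi^T)^{-1} Phi s

energy = (Phi @ s) @ w                      # (Phi s)^T (Phi Phi^T)^{-1} Phi s
gamma1 = sigma**2 * np.log(B) + 0.5 * n * energy  # H1 if Lambda >= gamma1
gamma2 = sigma**2 * np.log(A) + 0.5 * n * energy  # H0 if Lambda <= gamma2
print(gamma2, Lambda, gamma1)
```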
18. Performance Characterization
• Average probability of error:
$$P_e = P(H_0) P_{fa} + P(H_1)(1 - P_d)$$
with false-alarm probability $P_{fa} = P(\Lambda(y) \ge \gamma_1 \mid H_0)$ and detection probability $P_d = P(\Lambda(y) \ge \gamma_1 \mid H_1)$.
• We assume equal priors $P(H_0) = P(H_1) = 1/2$ and set $A = B = 1$ (recall that in SPRT, $P_{fa}$ and $P_d$ are predefined).
• With $\hat{P} = \Phi^T (\Phi \Phi^T)^{-1} \Phi$, we get
$$P_e = Q\!\left(\frac{1}{2} \sqrt{\frac{n}{\sigma^2}}\, \|\hat{P} s\|\right)$$
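A one-line evaluation of this expression, reusing `Phi`, `s`, `sigma`, and `n` from the sketches above; $Q(\cdot)$ is the standard Gaussian tail, available as `norm.sf` in scipy:

```python
import numpy as np
from scipy.stats import norm

# Pe = Q( (1/2) sqrt(n/sigma^2) ||P_hat s|| ) with the row-space projection.
P_hat = Phi.T @ np.linalg.inv(Phi @ Phi.T) @ Phi
Pe = norm.sf(0.5 * np.sqrt(n) / sigma * np.linalg.norm(P_hat @ s))
print(Pe)
```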
19. Performance Characterization (Cont’d)
$$P_e = Q\!\left(\frac{1}{2} \sqrt{\frac{n}{\sigma^2}}\, \|\hat{P} s\|\right)$$
This expression does not show the impact of the compressed measurements! The stable-embedding theorem does:

Theorem: Suppose that $\sqrt{N/M}\, \Phi$ provides a $\delta$-stable embedding (a Gaussian $\Phi$ satisfies this!). Then, for any deterministic signal $s$, the probability of error has the following bounds:
$$Q\!\left(\sqrt{1+\delta}\; \frac{\sqrt{n}}{2\sigma} \sqrt{\frac{M}{N}}\, \|s\|_2\right) \;\le\; P_e \;\le\; Q\!\left(\sqrt{1-\delta}\; \frac{\sqrt{n}}{2\sigma} \sqrt{\frac{M}{N}}\, \|s\|_2\right)$$

Davenport, Mark A., et al. "Signal processing with compressive measurements." IEEE Journal of Selected Topics in Signal Processing 4.2 (2010): 445-460.
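A sketch of evaluating the two-sided bound; the embedding constant `delta` is a hypothetical value, and `M`, `N`, `n`, `sigma`, and `s` are reused from the sketches above:

```python
import numpy as np
from scipy.stats import norm

delta = 0.1  # hypothetical stable-embedding constant
arg = 0.5 * np.sqrt(n) / sigma * np.sqrt(M / N) * np.linalg.norm(s)
print(norm.sf(np.sqrt(1 + delta) * arg), "<= Pe <=",
      norm.sf(np.sqrt(1 - delta) * arg))
```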
20. Performance Characterization (Cont’d)
• Then, approximately,
$$P_e \approx Q\!\left(\frac{\sqrt{n}}{2\sigma} \sqrt{\frac{M}{N}}\, \|s\|_2\right)$$
where $M/N < 1$ is the compression ratio, $n$ is the observation time, and $\|s\|_2 / \sigma$ scales as $\sqrt{\text{SNR}}$.
• Comment: $P_e$ is a monotonically decreasing function of both the compression ratio and the observation time.
[Figure: $P_e$ versus compression ratio and time at SNR = 3 dB.]
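A small sweep illustrating the monotonicity comment; it assumes SNR $= \|s\|_2^2 / \sigma^2$, matching the $\sqrt{\text{SNR}}$ annotation, with the grid of values chosen for illustration:

```python
import numpy as np
from scipy.stats import norm

# Pe ~ Q( (sqrt(n)/2) sqrt(M/N) ||s||_2 / sigma ) at SNR = 3 dB:
# the error falls as either the compression ratio or the time n grows.
snr = 10 ** (3 / 10)
for cr in (0.1, 0.3, 0.5):
    for n_obs in (1, 5, 10):
        pe = norm.sf(0.5 * np.sqrt(n_obs * cr * snr))
        print(f"CR={cr:.1f}, n={n_obs:2d}: Pe={pe:.4f}")
```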
21. Simulation Results
Fewer measurements, more delay!
[Figure: average number of measurements versus compression ratio CR $= M/N$.]
• Sparsity order $K = 5$
• Signal dimension $N = 100$
• $P_d = 0.1$, $P_{fa} = 0.1$
• Monte Carlo simulation (number of iterations = 10000)
22. Simulation Results
[Figure: probability of reconstruction error $P(\hat{s} \ne s)$ versus number of measurements, with $M = 10$.]
• Compression ratio CR $= M/N$
• Sparsity order $K = 5$
• Signal dimension $N = 50$
• $P_d = 0.1$, $P_{fa} = 0.1$
• Monte Carlo simulation (number of iterations = 100, limited by available time)
• Reconstruction uses the $\ell_1$-norm algorithm
$$\min_s \|s\|_1 \quad \text{s.t.} \quad \|y - \Phi s\|_2 \le \epsilon$$
23. Future Work
• Refining the results
• Studying the SPRT mechanism for random signals
• Expecting to publish a research paper on these results