Compressive sensing: how to sample data from what you know!
Full text
(2) First Part:
‣ Sparsity, low-rankness and relatives: "From information to structures"
‣ Compress while you sample: "From structure to scrambled sensing"
‣ and Reconstruct! "From scrambled sensing to information"
ELEN. Laurent Jacques - "Compressive sensing: how to sample data from what you know!"
(4) Informative signals are composed of structures ... e.g., 3-D data, speech signals, biology, video, data on graphs, astronomy.
(5-7) 2-D Example: Using Wavelets!
‣ Representing this image x ... with "wavelets" of different sizes/scales and different orientations.
(8-10) 2-D Example: Using Wavelets!
‣ Set of coefficients α in Ψ (gray = 0): the wavelet basis Ψ gives α such that x = Σ_i α_i ψ_i.
‣ Thresholding = keep the K largest values, α_K. Even with K = N/10, x_K = Ψ α_K: the "Information" is still there!
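The thresholding step above can be sketched numerically. As an assumption, a random orthonormal matrix stands in for the wavelet basis Ψ, and the test signal is built to be exactly K-sparse; this is a minimal sketch of the K-term approximation, not the slide's actual image experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: random orthonormal "sparsity basis" Psi (stand-in for
# wavelets) and a signal x that is exactly K-sparse in it.
N, K = 256, 16
Psi, _ = np.linalg.qr(rng.standard_normal((N, N)))   # orthonormal columns
alpha = np.zeros(N)
alpha[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
x = Psi @ alpha

# Analysis: recover the coefficients, then "thresholding = keep K largest".
coeffs = Psi.T @ x
idx = np.argsort(np.abs(coeffs))[::-1][:K]           # K largest magnitudes
alpha_K = np.zeros(N)
alpha_K[idx] = coeffs[idx]
x_K = Psi @ alpha_K                                   # K-term approximation

err = np.linalg.norm(x - x_K) / np.linalg.norm(x)
```

Because the toy signal is exactly K-sparse, the K-term approximation here is essentially exact; for a natural image the error would be small but nonzero.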
(11-17) Generalization: Sparsity principle
‣ Hypothesis: an image (or any signal) can be decomposed in a "sparsity basis" Ψ with few non-zero elements α:
  x = Σ_{j=1}^D α_j ψ_j = Ψα,  Ψ = (ψ_1, · · · , ψ_D) ∈ R^{N×D}
‣ Ψ can be an ONB (e.g., Fourier, wavelets) or a dictionary (of atoms).
‣ Non-linear approximation: increasing the number of "atoms" improves quality.
(18-19) In summary: if "information" ...
... in a signal x ∈ R^N (e.g., N = number of pixels, voxels, graph nodes, ...), there exists a "sparsity" basis Ψ = (ψ_1, · · · , ψ_D) ∈ R^{N×D} (e.g., wavelets, Fourier, ...) where x has a linear representation
  x = Σ_{j=1}^D α_j ψ_j
and
  ‖α‖_0 := #{i : α_i ≠ 0} ≪ N,
or  ‖α − α_K‖ ≪ ‖α‖,
or a small ℓ1-norm ‖α‖_1 = Σ_i |α_i| (convex! see after).
Counterexample: noise!
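The three criteria above, and the noise counterexample, are easy to check numerically; the sketch below (synthetic K-sparse vector vs. Gaussian noise, sizes chosen arbitrarily) measures how much energy the best K-term approximation leaves behind in each case:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 1000, 50

# A K-sparse vector vs. pure Gaussian noise (the slide's counterexample).
alpha = np.zeros(N)
alpha[:K] = rng.standard_normal(K)
noise = rng.standard_normal(N)

l0 = lambda v: np.count_nonzero(v)     # ||v||_0 = #{i : v_i != 0}
l1 = lambda v: np.abs(v).sum()         # ||v||_1 = sum_i |v_i|

def tail(v, K):
    # Relative energy left over by the best K-term approximation, ||v - v_K|| / ||v||.
    s = np.sort(np.abs(v))[::-1]
    return np.linalg.norm(s[K:]) / np.linalg.norm(s)

sparse_tail = tail(alpha, K)   # zero: alpha is exactly K-sparse
noise_tail = tail(noise, K)    # large: noise is not compressible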
(20) Other "informative" models
(21-22) Other "informative" models
‣ Structured sparsity for high-dimensional data (e_3 = λ, t, or z): consider a data volume (3-D or more), e.g., video, hyperspectral data, medical data.
‣ Possible models: a 3-D sparsity basis (see before; sometimes costly, sometimes nonexistent), or structured sparsity. Idea: consecutive "slices" vary "slowly"; for a hyperspectral data cube, the 2-D slices have close coefficient supports!
(23-24) Other "informative" models
‣ Structured sparsity: the hyperspectral sparsity coefficients α are spatially sparse (ℓ1-norm over the spatial indices i, j), but correlated and not sparse along the spectral index k (ℓ2-norm)
  ⇒ small ℓ2,1-norm:  ‖α‖_{2,1} = Σ_{ij} ( Σ_k |α_{i,j,k}|² )^{1/2}
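The mixed ℓ2,1-norm above takes an ℓ2-norm along the spectral axis and an ℓ1-norm over space; a minimal sketch on a synthetic coefficient cube (dimensions and sparsity level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical hyperspectral coefficient cube alpha[i, j, k]:
# (i, j) spatial position, k spectral index. Spatially sparse, with the
# same support shared across all spectral bands, as on the slide.
Ni, Nj, Nk = 32, 32, 16
alpha = np.zeros((Ni, Nj, Nk))
support = rng.random((Ni, Nj)) < 0.05        # few active spatial positions
support[0, 0] = True                         # guarantee a non-empty support
alpha[support] = rng.standard_normal((support.sum(), Nk))

def l21(a):
    # ||a||_{2,1} = sum_{ij} ( sum_k |a_{ijk}|^2 )^{1/2}
    return np.sqrt((a ** 2).sum(axis=2)).sum()

def l1(a):
    return np.abs(a).sum()
```

Note that ‖α‖_{2,1} ≤ ‖α‖_1 always holds (the inner ℓ2-norm never exceeds the inner ℓ1-norm), while penalizing ‖·‖_{2,1} promotes sparsity over groups rather than over individual entries.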
(25-29) Other "informative" models
‣ Low-rank models in high dimensions. Example: video (PETS DB), with e_3 = t: a moving foreground over a static background.
‣ Stack the frames (1 frame = 1 vector) into a space × time matrix X = background + foreground.
‣ Low-rank matrix: many columns are (almost) linear combinations of the others.
‣ Given the SVD X = UΣV*, with Σ = diag(σ_1, · · · , σ_N): small ‖σ‖_0, or small ‖σ‖_1 = ‖X‖_* (the nuclear norm).
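The video-as-matrix idea can be sketched with a synthetic example: a rank-1 static background plus a sparse foreground that appears in one frame (the sizes and the single-frame foreground are simplifying assumptions, not the PETS data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical "video" matrix X: one column per frame. A static background
# (rank 1) plus one frame with sparse foreground activity.
pixels, frames = 500, 60
background = np.outer(rng.random(pixels), np.ones(frames))   # rank-1 matrix
foreground = np.zeros((pixels, frames))
foreground[rng.choice(pixels, 20), 10] = 5.0                 # object in frame 10
X = background + foreground

# SVD: X = U Sigma V*, singular values sigma_1 >= sigma_2 >= ...
sigma = np.linalg.svd(X, compute_uv=False)
nuclear = sigma.sum()                                  # ||X||_* = ||sigma||_1
numerical_rank = np.count_nonzero(sigma > 1e-10 * sigma[0])
```

Since the background is rank 1 and the foreground perturbs a single column, the singular value sequence collapses after two terms: the "small ‖σ‖_0" on the slide.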
(30-31) General Sparsity Applications
1. Data compression/transmission (by definition);
2. Data restoration: denoising, deblurring, inpainting, ...
3. Simplified models and interpretation (e.g., in ML).
More generally, for regularizing (stabilizing) inverse problems (e.g., in Ivo's talk) + an impact on the data sampling philosophy! (see after)
(32) First Part:
‣ Sparsity, low-rankness and relatives: "From information to structures"
‣ Compress while you sample: "From structure to scrambled sensing"
‣ and Reconstruct! "From scrambled sensing to information"
(33-34) Sampling with Sparsity
‣ Paradigm shift: "computer readable" sensing + prior information (structures).
  World → Signal → Sensing Device → Signal → Human
  Optimized setup: sampling rate ∝ information.
‣ Examples: radio-interferometry, compressed sensing, MRI, deflectometry, seismology, ...
(35) Sampling with Sparsity ... but non-linear reconstruction schemes!
Regularized inverse problems: reconstruct x ∈ R^N from y = Sensing(x) ∈ R^M, given a sparse model on x:
  x* = argmin_{u ∈ R^N} S(u)  s.t.  Sensing(u) ≈ Sensing(x)
Sparsity metric: e.g., small S(α) = ‖α‖_1 if u = Ψα, or small Total Variation S(u) = ‖∇u‖. Noise: Gaussian, Poisson, ...
Examples: tomography, frequency/partial observations, ...
(36-37) Compressed Sensing
CS in a nutshell: "Forget Dirac, forget Nyquist: ask few (linear) questions about your informative (sparse) signal, and recover it differently (non-linearly)."
M questions: y = Φx, with Φ ∈ R^{M×N}, M ≪ N, and a sparsity prior on x (here Ψ = Id).
(38) Compressed Sensing: low-complexity signals
y = Φx, Φ ∈ R^{M×N}, M ≪ N. If x is K-sparse and Φ is well "conditioned", then (non-linear reconstruction):
  x* = argmin_{u ∈ R^N} ‖u‖_0  s.t.  y = Φu,  with ‖u‖_0 = #{j : u_j ≠ 0}
(39) Compressed Sensing
If x is K-sparse and Φ is well "conditioned", then (ℓ1 relaxation):
  x* = argmin_{u ∈ R^N} ‖u‖_1  s.t.  y = Φu,  with ‖u‖_1 = Σ_j |u_j|
(Basis Pursuit) [Chen, Donoho, Saunders, 1998]
(40) Compressed Sensing: Restricted Isometry Property
∃ δ ∈ (0, 1) such that
  √(1−δ) ‖v‖_2 ≤ ‖Φv‖_2 ≤ √(1+δ) ‖v‖_2  for all 2K-sparse signals v,
i.e., any subset of 2K columns of Φ is a near-isometry. Basis Pursuit recovers x if δ < √2 − 1 [Candès 08].
(41) Compressed Sensing: examples of good Φ
Gaussian, Bernoulli, random Fourier, ... E.g., Φ ∈ R^{M×N} with Φ_ij ~iid N(0, 1) satisfies the RIP with high probability as soon as M = O(K ln(N/K)) ≪ N.
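The near-isometry of a Gaussian Φ on sparse vectors can be observed empirically; in the sketch below the number of measurements follows M = O(K ln(N/K)) with an illustrative constant of 4 (an assumption, not a sharp bound), and the entries are scaled by 1/√M so that E‖Φv‖² = ‖v‖²:

```python
import numpy as np

rng = np.random.default_rng(4)

# Gaussian sensing matrix Phi in R^{M x N}, entries N(0, 1/M).
N, K = 512, 8
M = int(np.ceil(4 * K * np.log(N / K)))      # constant 4 is illustrative
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# Empirically check the near-isometry on random 2K-sparse vectors.
ratios = []
for _ in range(200):
    v = np.zeros(N)
    v[rng.choice(N, 2 * K, replace=False)] = rng.standard_normal(2 * K)
    ratios.append(np.linalg.norm(Phi @ v) / np.linalg.norm(v))
ratios = np.array(ratios)
```

All 200 ratios concentrate around 1, which is the RIP statement √(1−δ) ≤ ‖Φv‖/‖v‖ ≤ √(1+δ) observed on a random sample of sparse directions (a necessary check, not a proof: the RIP is a uniform statement over all 2K-sparse v).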
(42-43) Compressed Sensing: geometry and robustness
In R²: the ℓ1-ball touches the constraint set {u : Φu = y} at the sparse solution x = x_BP, unlike the ℓ2-ball, whose contact point is the (non-sparse) least-squares solution x_LS.
Solvers: linear programming, interior point methods, proximal methods, ... tons of toolboxes ...
Robustness vs. sparse deviation + noise: if ‖y − Φu‖ ≤ ε, then
  ‖x − x*‖ ≤ C (1/√K) ‖x − x_K‖_1 + Dε
(44) ... in summary, CS is
M = O(K log(N/K)) questions y = Φx about the signal x; random Φ is ok!
LIGHT SENSING (sensor) + HEAVY RECONSTRUCTION: x* = x, or x* ≈ x under noise, quantization, non-linearities.
Ask few (linear) questions about your informative (sparse) signal, and recover it differently (non-linearly).
Candès, Romberg, Tao, 2006; Donoho, 2006; 1-Pixel Camera, Wakin et al., 2006.
(45) First Part:
‣ Sparsity, low-rankness and relatives: "From information to structures"
‣ Compress while you sample: "From structure to scrambled sensing"
‣ and Reconstruct! "From scrambled sensing to information" (very broad and active field ... just one slide)
(46-47) Reconstruct? (just one slide)
‣ For solving x* = argmin_{u ∈ R^N} S(u) s.t. Sensing(u) ≈ Sensing(x), with S e.g. sparsity, low-rank, TV, ... and a fidelity term e.g. an L2/L1 distance, robust to Gaussian/Poisson noise, ... Many possibilities/solvers:
‣ Convex optimization: tons of toolboxes (SPGL1, L1Magic, (F)ISTA, ADMM, ...); proximal algorithms (see also B. Goldluecke's part).
‣ Iterative (greedy) methods: matching pursuit and relatives (OMP); iterative hard thresholding, CoSaMP, SP, smoothed L0, ...
‣ Approximate Message Passing algorithms, Bayesian methods, ...
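Of the solvers listed, iterative hard thresholding is compact enough to sketch in full. The toy problem below (noiseless Gaussian measurements, a conservative step size of 1/‖Φ‖², and a known sparsity level K) is an illustrative assumption, not the talk's setup:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy CS problem: y = Phi x with x K-sparse, no noise.
N, M, K = 256, 100, 5
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.choice([-1.0, 1.0], K) * (1 + rng.random(K))
y = Phi @ x

def iht(Phi, y, K, iters=500):
    """Iterative hard thresholding: gradient step, then keep the K largest."""
    mu = 1.0 / np.linalg.norm(Phi, 2) ** 2       # conservative step size
    u = np.zeros(Phi.shape[1])
    for _ in range(iters):
        g = u + mu * (Phi.T @ (y - Phi @ u))     # gradient step on 0.5||y - Phi u||^2
        idx = np.argsort(np.abs(g))[::-1][:K]
        u = np.zeros_like(u)
        u[idx] = g[idx]                          # hard threshold: K-sparse iterate
    return u

x_hat = iht(Phi, y, K)
rel_err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```

The iterate stays K-sparse by construction; once the correct support is locked in, the remaining iterations are plain least squares on that support, so the error decays geometrically.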
(48) Second Part:
‣ Compressive imaging appetizer: the Rice single-pixel camera
‣ Other case studies:
  ‣ Radio-interferometry and aperture synthesis
  ‣ Hyperspectral CASSI imaging
  ‣ Highspeed Coded Strobing Imaging
(50-54) Rice Single-pixel Camera [Duarte, Davenport, Takbar, Laska, Sun, Kelly, Baraniuk, 2008]
‣ The j-th random binary pattern φ_j ∈ R^N, with φ_ji ∈ {0, 1}, modulates the scene x; the optical sum on the single photodiode gives y_j = Σ_i φ_ji x_i, stacked into y = (y_1, y_2, · · · , y_M)^T.
‣ With x sparse in Ψ (e.g., wavelets): x* = Ψ argmin_α ‖α‖_1 s.t. ΦΨα = y. Reconstructions shown for M = 2% and M = 5% of N, next to the original.
‣ A proof of concept, but the photodetector is unique! Specific imaging application: high-energy photons with a single costly photomultiplier.
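A toy simulation of the acquisition above: each 0/1 pattern produces one optical sum, and measuring the complementary pattern too and subtracting yields an effective ±1 sensing matrix (a standard trick to decorrelate 0/1 columns). For compactness the scene is assumed sparse in the canonical basis and recovered with a small OMP, one of the greedy solvers mentioned earlier; both choices are simplifications of the actual camera pipeline:

```python
import numpy as np

rng = np.random.default_rng(6)

# Scene x, K-sparse; M pattern shots plus M complementary shots.
N, M, K = 256, 80, 4
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = 1 + rng.random(K)
P = (rng.random((M, N)) < 0.5).astype(float)   # random 0/1 mirror patterns
y_pos = P @ x                                   # optical sums, pattern shots
y_neg = (1 - P) @ x                             # complementary shots
B = 2 * P - 1                                   # effective +/-1 sensing matrix
y = y_pos - y_neg                               # equals B @ x

def omp(B, y, K):
    """Orthogonal matching pursuit: greedily grow the support."""
    r, support = y.copy(), []
    for _ in range(K):
        support.append(int(np.argmax(np.abs(B.T @ r))))   # most correlated atom
        coef, *_ = np.linalg.lstsq(B[:, support], y, rcond=None)
        r = y - B[:, support] @ coef                      # update residual
    u = np.zeros(B.shape[1])
    u[support] = coef
    return u

x_hat = omp(B, y, K)
```

With M = 80 differenced measurements and K = 4, the greedy recovery is essentially exact here; the camera's 2%-5% measurement rates rely on a wavelet basis and an ℓ1 solver instead.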
(55) Second Part:
‣ Compressive imaging appetizer: the Rice single-pixel camera
‣ Other case studies:
  ‣ Radio-interferometry and aperture synthesis
  ‣ Hyperspectral CASSI imaging
  ‣ Highspeed Coded Strobing Imaging
(56-60) Aperture Synthesis in Radio-Astronomy
‣ Observation of the sky in radio frequency: astronomical radio sources with intensity I_λ(x, y) (planar approximation); two antennas record the fields E_1(r, t) and E_2(r, t), separated by the baseline B.
‣ "Imagine a swimming pool with two swimmers, and you want to detect their positions from the waves they produce ..."
‣ Time correlation + Van Cittert-Zernike theorem: Î_λ(B_⊥) = ⟨E_1 E_2*⟩_t, i.e., each baseline yields one sample of the Fourier transform of the sky image; the antenna array thus acts as a larger telescope.
(61-64) E pur si muove! "And yet it moves!"
‣ Using N telescopes gives (N choose 2) possible Fourier observations,
‣ and the baselines undergo Earth rotation! In the Fourier (u, v) domain, each telescope pair traces one elliptic path; 1 observation = 1 telescope pair at 1 instant.
‣ Example: the Arcminute Microkelvin Imager (AMI), with baselines of 18-120 m (and 4-20 m). Other configurations: the Very Large Array (VLA), the Square Kilometer Array (SKA).
‣ This is a partial Fourier sampling problem! y = SFx + n, with S a selection of a few frequencies and F the 2-D Fourier transform.
‣ Image prior: SPARSITY! + additional improvements: multiple sparsity bases, reweighted L1, ...
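The forward model y = SFx + n can be sketched with an FFT and a boolean mask; the random mask below stands in for the elliptic (u, v) tracks (an assumption, since real coverage is highly structured), and the "dirty image" is the plain Fourier-inversion baseline that the sparse reconstructions on the next slides improve upon:

```python
import numpy as np

rng = np.random.default_rng(7)

# Sky x: a few point sources on a 64x64 grid.
n = 64
x = np.zeros((n, n))
x[rng.integers(0, n, 10), rng.integers(0, n, 10)] = 1.0

mask = rng.random((n, n)) < 0.3          # stand-in for the (u, v) coverage S
F_x = np.fft.fft2(x) / n                 # F, unitary normalization
y = F_x[mask]                            # S: keep only observed frequencies
y = y + 0.01 * (rng.standard_normal(y.shape)
                + 1j * rng.standard_normal(y.shape))   # noise n

# "Fourier inversion" baseline (dirty image): zero-fill unseen frequencies.
Fz = np.zeros((n, n), complex)
Fz[mask] = y
dirty = np.real(np.fft.ifft2(Fz) * n)
```

The dirty image still correlates with the true sky (the sources sit on top of sidelobes from the missing frequencies), which is exactly what a sparsity prior is then used to clean up.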
(65-66) Reconstruction results
Arcminute Microkelvin Imager (AMI), M31: Fourier inversion vs. Sparsity Averaging Reweighted Analysis (SARA), shown as reconstruction (log scale) and residual.
R. Carrillo, J. McEwen, Y. Wiaux, "PURIFY: a new approach to radio-interferometric imaging", accepted in MNRAS, 2014.
(67) Second Part:
‣ Compressive imaging appetizer: the Rice single-pixel camera
‣ Other case studies:
  ‣ Radio-interferometry and aperture synthesis
  ‣ Hyperspectral CASSI imaging
  ‣ Highspeed Coded Strobing Imaging
(68-70) Hyperspectral imaging
‣ Fusion of spectrometry and imaging.
‣ Applications: material classification/segmentation, microscopy/spectroscopy, counterfeit detection, environmental monitoring, skin disease detection, ...
(71-72) How is it usually done? Single filtering, multiplexed filtering, or line scanning.
Issues:
‣ acquisition is slow;
‣ low spatial/spectral/temporal resolution (depending on the selected sensing);
‣ a huge amount of data at sensing;
‣ but the signals have "low complexity" (sparse/low-rank)!
(73-76) Compressive HS imaging
‣ High-dimensional data = a natural field for CS!
‣ Coded Aperture Snapshot Spectral Imaging (CASSI): mixing a dispersive element + a coded aperture. The HS cube x passes through a random coded aperture C = diag(c), the dispersive element shears the coded cube (S), and the detector grid integrates spectrally (D).
‣ Sensing model: y = DSCx, with multiple shots taken for multiple codes c.
Gonzalo R. Arce, David J. Brady, Lawrence Carin, Henry Arguello, and David S. Kittle, "Compressive Coded Aperture Spectral Imaging", IEEE Sig. Proc. Mag., 2014.
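The three operators in y = DSCx compose naturally on a small synthetic cube; in the sketch below (illustrative dimensions, a one-pixel-per-band shear as a simplifying assumption) C is a per-pixel code, S shifts band k by k pixels, and D sums over the spectral axis:

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical HS cube x[i, j, k]: spatial (i, j), spectral band k.
ny, nx, nl = 16, 16, 8
x = rng.random((ny, nx, nl))
c = (rng.random((ny, nx)) < 0.5).astype(float)   # random 0/1 coded aperture

coded = x * c[:, :, None]                         # C x: code every band alike
sheared = np.zeros((ny, nx + nl - 1, nl))         # widen to make room for shear
for k in range(nl):
    sheared[:, k:k + nx, k] = coded[:, :, k]      # S: shift band k by k pixels
y = sheared.sum(axis=2)                           # D: spectral integration
```

The shear is what keeps the bands distinguishable after the spectral sum: without it, D·C would collapse all bands onto identical pixel positions and the spectral information would be lost.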
(77) Compressive HS imaging
‣ Reconstruction: solve
  α* = argmin_α τ‖α‖_1 + ½ ‖y − DSCΨα‖²_2,  x* = Ψα*,
with Ψ = Ψ_1 ⊗ Ψ_2 the Kronecker product of a 2-D wavelet basis (Symmlet-8) and a DCT (along the spectral axis).
Gonzalo R. Arce, David J. Brady, Lawrence Carin, Henry Arguello, and David S. Kittle, "Compressive Coded Aperture Spectral Imaging", IEEE Sig. Proc. Mag., 2014.
(78) Compressive HS imaging
‣ Reconstruction results: original (several wavelengths), with 12 shots and random codes c, and with 12 shots and optimized codes c (an example of an optimized c is shown).
(79) Second Part:
‣ Compressive imaging appetizer: the Rice single-pixel camera
‣ Other case studies:
  ‣ Radio-interferometry and aperture synthesis
  ‣ Hyperspectral CASSI imaging
  ‣ Highspeed Coded Strobing Imaging
(80) High Speed Imaging
‣ Imaging a high-speed object leads to blurry images if the shutter frequency is low (source: Wikipedia).
‣ But there is a hardware limitation on the number of fps (e.g., O(20 fps)).
‣ Solution: "Highspeed Coded Strobing Imaging": keep the detector fps rate unchanged, and add high-rate coding of the shutter! (Reddy, Veeraraghavan, Chellappa, ...)
(81) Highspeed Coded Strobing Imaging: 8× the fps! (+ superresolution)
D. Reddy, A. Veeraraghavan, and R. Chellappa, "P2C2: Programmable pixel compressive camera for high speed imaging", Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on. IEEE, 2011.
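The coded-shutter measurement can be sketched as follows: the detector still integrates T fast subframes into each slow frame, but a pseudo-random per-pixel code modulates the subframes first, so temporal structure survives in the measurements. The dimensions, the 0/1 code, and the per-pixel modulation are illustrative assumptions in the spirit of P2C2, not the paper's exact hardware model:

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical fast video: `pixels` pixels over T * n_frames fast subframes,
# to be compressed into n_frames slow detector frames (8x speed-up target).
T, n_frames, pixels = 8, 4, 100
video = rng.random((pixels, T * n_frames))

# Per-pixel, per-subframe 0/1 shutter code.
codes = (rng.random((pixels, T * n_frames)) < 0.5).astype(float)

# Each slow frame f integrates the coded subframes f*T .. (f+1)*T - 1.
y = np.stack([(video * codes)[:, f * T:(f + 1) * T].sum(axis=1)
              for f in range(n_frames)], axis=1)
```

A plain (uncoded) camera would record only the full subframe sums, i.e., a motion-blurred frame; here each pixel sees a different random subset of subframes, which is what the sparse/optical-flow-regularized reconstruction exploits to recover the fast video.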
(82) Highspeed Coded Strobing Imaging
Optically: [system diagram]
(83) Highspeed Coded Strobing Imaging
‣ Reconstruction: regularized with optical flow; comparison of a wavelet prior alone vs. the wavelet prior + optical flow regularization.
(84) Conclusions
(85) Conclusion
Sparsity priors involve new sensing methods: e.g., Compressed Sensing, Compressive Imaging.
Future:
‣ More sensing examples: hyperspectral, network, GPR, Lidar, ... (an explosion; see http://nuit-blanche.blogspot.com).
‣ Better sparsity priors: structured, model-based, mixed-norm (Cevher, Bach, ...); co-sparsity/analysis models (Gribonval, Nam, Davies, Elad, Candès).
‣ Non-linear sensing models? 1-bit CS is one instance; phase recovery (Candès); polychromatic CT, ...
(86) Links (Science 2.0)
‣ Rice CS Resources page: http://www-dsp.rice.edu/cs
‣ Igor Carron's "Nuit Blanche" blog: http://nuit-blanche.blogspot.com (1 CS post/day!)
(87) Thank you!