EPFL

Algo+LMA

en:group:seminars:tmp:20081001 [2016/06/23 11:26] (current) |


---- dataentry seminar ----
date_dt : 2008.10.01
title : Expander codes, Explicit Euclidean sections, and compressed sensing
speaker : Prof. Venkatesan Guruswami
affiliation : University of Washington & Carnegie Mellon University
time : 16h15-17h15
room : BC129
table : seminars
----

==== abstract ====

Classical results from the 1970s state that a random subspace of R^N of dimension N/2 has "distortion" O(1) with high probability, where the distortion is measured by sqrt(N) times the largest ratio of the L_2 to L_1 norms of a vector in the subspace. (So O(1) distortion means each vector in the subspace has its mass spread nearly evenly over a constant fraction of the coordinates.) This result is a particular example of the probabilistic method, a technique that is ubiquitous in asymptotic convex geometry. Similar randomized geometric constructions arise in areas like dimensionality reduction and compressed sensing. Explicit constructions of such objects, however, seem very hard to come by.

We will describe constructions inspired by expander codes that produce subspaces with distortion nearly as good as that of a random subspace, either explicitly or with sublinear randomness. Specifically, we construct an explicit subspace of R^N with dimension N(1-o(1)) and distortion (slightly worse than) poly(log N). The construction combines various ingredients such as Ramanujan graphs, sum-product theorems in finite fields, and Kerdock codes. We are also able to construct subspaces of R^N of dimension N/2 with O(1) distortion using N^d random bits for any constant d > 0 (and hence deterministically in sub-exponential time). The talk will also briefly discuss connections to sparse signal recovery (a.k.a. compressed sensing).

[Based on joint works with James Lee, Alexander Razborov, and Avi Wigderson]
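As a minimal numerical sketch (not part of the talk) of the distortion quantity in the abstract: for a subspace V of R^N, distortion(V) = sqrt(N) times the supremum of ||x||_2 / ||x||_1 over nonzero x in V. The snippet below estimates this from below by sampling vectors in a random N/2-dimensional subspace; the dimension, sample count, and random seed are illustrative choices, and sampling only gives a lower bound on the true supremum.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
d = N // 2  # a random subspace of half the ambient dimension, as in the abstract

# Orthonormal basis for a random d-dimensional subspace of R^N
# (QR factorization of a Gaussian matrix; columns of Q span the subspace).
Q, _ = np.linalg.qr(rng.standard_normal((N, d)))

# Sample vectors in the subspace and record the largest L2/L1 ratio seen.
# This only lower-bounds the supremum over the whole subspace.
best_ratio = 0.0
for _ in range(2000):
    x = Q @ rng.standard_normal(d)
    best_ratio = max(best_ratio, np.linalg.norm(x, 2) / np.linalg.norm(x, 1))

est = np.sqrt(N) * best_ratio
print(f"sampled distortion lower bound: {est:.2f}")
```

Since ||x||_1 <= sqrt(N) ||x||_2 always, the estimate is at least 1; for a random subspace the abstract's classical result says the true distortion is O(1), i.e. it stays bounded as N grows, which is what this sampling experiment reflects.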