Talk page

Title:
Optimality of the Johnson-Lindenstrauss lemma

Speaker:
Jelani Nelson

Abstract:
Dimensionality reduction in Euclidean space, as attainable by the Johnson-Lindenstrauss lemma, has been a fundamental tool in algorithm design and machine learning. The JL lemma states that any n-point subset of l_2 can be mapped to l_2^m with distortion 1+epsilon, where m = O(epsilon^{-2} log n). In this talk, I discuss our recent proof that the JL lemma is optimal, in the sense that for any n, d, and epsilon (with epsilon not too small), there is an n-point set X in l_2^d such that any map f : X -> l_2^m with distortion 1+epsilon must have m = Omega(epsilon^{-2} log n). I will also discuss some subsequent work and future directions. Joint work with Kasper Green Larsen (Aarhus University).
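
Illustration (not part of the original abstract): a minimal sketch of the upper-bound direction of the lemma, in which a scaled random Gaussian matrix plays the role of the map f and the script empirically checks that all pairwise distances of a random point set are preserved to within a 1+epsilon factor. The values of n, d, epsilon and the constant 4 in the target dimension are illustrative choices, not optimal constants.

import numpy as np

def pairwise_dists(A):
    # All pairwise Euclidean distances via the Gram-matrix trick.
    sq = np.sum(A * A, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * (A @ A.T)
    iu = np.triu_indices(A.shape[0], k=1)
    return np.sqrt(np.maximum(D2[iu], 0.0))

rng = np.random.default_rng(0)
n, d, eps = 500, 1000, 0.25
m = int(np.ceil(4 * np.log(n) / eps**2))      # m = O(eps^{-2} log n); constant 4 is illustrative

X = rng.standard_normal((n, d))               # n arbitrary points in l_2^d
G = rng.standard_normal((m, d)) / np.sqrt(m)  # random Gaussian JL map f(x) = Gx
Y = X @ G.T                                   # images f(X) in l_2^m

ratios = pairwise_dists(Y) / pairwise_dists(X)
print(f"m = {m}; distance ratios lie in [{ratios.min():.3f}, {ratios.max():.3f}]")

With these parameters the observed ratios should fall inside [1-epsilon, 1+epsilon] with high probability; the talk's result shows that no map into substantially fewer than epsilon^{-2} log n dimensions can achieve this for every point set.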

Link:
https://www.msri.org/workshops/811/schedules/23258

Workshop:
MSRI - Geometric Functional Analysis and Applications