In that respect, Alexander Barvinok's lecture notes on measure concentration prove to be quite useful. He starts from simple measure concentration on the sphere, builds up to the JL Lemma, and then moves on to Lévy's theorem, Dvoretzky's theorem, and the Brunn-Minkowski inequality. He opens with a good intuitive definition of measure concentration:
Any "reasonable" function defined on a "large" probability space takes values close to its average with high probabilitywhich nicely ties it together (in my mind) with other large deviation results in information theory as well (he also has a section on rate distortion and Cramer's work).
Another useful nugget from Barvinok's notes is a thorough drilling on the use of the "Laplace trick": the technique of taking the exponential of a random variable and applying Markov's inequality in order to prove tail bounds (the standard proof of the Chernoff bound is one example of this).
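To spell out the trick (my one-line summary, not Barvinok's exact wording): for any random variable \(X\), threshold \(a\), and parameter \(t > 0\), monotonicity of \(x \mapsto e^{tx}\) plus Markov's inequality give

\[
\Pr[X \ge a] \;=\; \Pr\bigl[e^{tX} \ge e^{ta}\bigr] \;\le\; e^{-ta}\,\mathbb{E}\bigl[e^{tX}\bigr],
\]

and optimizing over \(t > 0\) yields \(\Pr[X \ge a] \le \inf_{t > 0} e^{-ta}\,\mathbb{E}\bigl[e^{tX}\bigr]\), which is the generic Chernoff bound.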
"I've been preparing a lecture on the Johnson-Lindenstrauss Lemma"
For what class?
For my computational geometry class, actually.
There is also my "own" version of the class notes, which follows Matousek's presentation. See here. This also contains the alternative "shorter" proof.