Here’s a two-part talk by Avi Wigderson explaining much more about the applications of expanders, and also the zigzag construction, if you are interested:

http://video.ias.edu/pseudo2010/wigderson1

http://video.ias.edu/pseudo2010/wigderson2

http://video.ias.edu/csdm/trevisan

The talk is very accessible and does not assume any background in spectral graph theory.

However, this mapping, which multiplies a point u by a matrix G, is expensive to compute, since G is a matrix in which almost all entries are non-zero. There has been a lot of research on coming up with mappings which can be computed much faster. In particular, one way is to come up with a *sparse* matrix G’, so that the mapping still preserves distances approximately, but the matrix-vector product G’u can be computed much faster.
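To see the idea concretely, here is a small sketch (all names, dimensions, and the data are my own choices; the sparse matrix follows the well-known ±1 construction of Achlioptas, which is one of several sparse alternatives) comparing a dense Gaussian projection with a sparse one on random points:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 2000, 400, 20          # ambient dim, target dim, number of points
X = rng.standard_normal((n, d))  # hypothetical random data points

# Dense projection: a Gaussian matrix G with essentially all entries non-zero.
G = rng.standard_normal((k, d)) / np.sqrt(k)

# Sparse alternative (Achlioptas-style): entries are 0 with probability 2/3
# and +-sqrt(3/k) with probability 1/6 each, so about two thirds of the
# multiplications in the product G'u can be skipped.
S = rng.choice([0.0, 1.0, -1.0], size=(k, d), p=[2/3, 1/6, 1/6]) * np.sqrt(3.0 / k)

def max_min_distortion(P):
    """Largest and smallest ratio of projected to original pairwise distance."""
    Y = X @ P.T                  # project all points at once
    ratios = [np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
              for i in range(n) for j in range(i + 1, n)]
    return max(ratios), min(ratios)
```

On this example both projections keep every pairwise distance within a small multiplicative error, while S has roughly a third as many non-zero entries as G.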

Here is an excellent talk summarizing many of the recent developments in this area:

Consider an infinite sequence of independent coin tosses of a fair coin and define the following two random variables:

Z1 = Number of coin tosses till the pattern HTT first appears.

Z2 = Number of coin tosses till the pattern HTH first appears.

Then, somewhat counterintuitively, E[Z1] and E[Z2] are not the same! It’s a common error to assume that since the probability that three given tosses form either pattern is 1/8, the expected time of first occurrence must also be the same. But the time of first occurrence also depends on whether different occurrences of the pattern can overlap. Here’s the talk by Peter Donnelly that I mentioned – it discusses this problem and a bunch of other interesting ones.

As an exercise, you can try using the idea of conditional expectations to compute E[Z1] and E[Z2].
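If you want to check your answer numerically, here is a sketch (the function name and setup are my own) that computes the expected waiting time exactly by solving the linear system arising from conditional expectations, where the state is the length of the current partial match of the pattern:

```python
import numpy as np

def expected_first_occurrence(pattern):
    """Expected number of fair-coin tosses until `pattern` (e.g. "HTH")
    first appears.  State i of the Markov chain = length of the longest
    suffix of the tosses so far that is a prefix of the pattern."""
    m = len(pattern)

    def step(i, c):
        # State reached from state i after tossing c: the longest prefix
        # of the pattern that is a suffix of what we have matched so far.
        s = pattern[:i] + c
        for k in range(min(len(s), m), -1, -1):
            if s.endswith(pattern[:k]):
                return k

    # Conditioning on the next toss: E_i = 1 + (1/2)E_step(i,'H') + (1/2)E_step(i,'T'),
    # with E_m = 0.  Solve the resulting linear system for E_0.
    A = np.eye(m)
    b = np.ones(m)
    for i in range(m):
        for c in "HT":
            j = step(i, c)
            if j < m:
                A[i, j] -= 0.5
    return float(np.linalg.solve(A, b)[0])
```

This gives E[Z1] = expected_first_occurrence("HTT") = 8 but E[Z2] = expected_first_occurrence("HTH") = 10: after a failed attempt at HTH, the trailing H can start a new occurrence, which (perhaps surprisingly) makes the expected wait longer, whereas a failed HTT forces a fresh start.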
