A Probabilistic Lemma due to Uri Feige
One of the cute results in STOC is the following probabilistic bound due to Feige.
Let X1, X2, ..., Xn be independent non-negative random variables, each with mean 1. Note that we make no further assumptions about their distributions: their variances can be arbitrary and, in particular, they need not be identically distributed. Let X = X1 + X2 + ... + Xn. By linearity of expectation, µ, the mean of X, is n.
A typical question in bounding the deviation of sums of random variables is the following: what is the probability that X deviates substantially from its mean µ?
We have a variety of bounds at our disposal, including the Markov inequality, the Chebyshev inequality and of course the Chernoff-Hoeffding bounds. Of these, the Markov inequality is the simplest and makes essentially no assumptions other than that the random variable is non-negative: Pr[X >= t µ] <= 1/t for any t > 0. Phrasing it in a different form, Pr[X <= t µ] >= 1 - 1/t.
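Spelling out the one-line derivation (a standard step, written here in LaTeX notation): the rephrasing is just the complement of Markov's inequality applied at the threshold t µ.

    \[
      \Pr[X \ge t\mu] \;\le\; \frac{\mathbb{E}[X]}{t\mu} \;=\; \frac{1}{t},
      \qquad\text{so}\qquad
      \Pr[X \le t\mu] \;\ge\; 1 - \Pr[X > t\mu] \;\ge\; 1 - \frac{1}{t}.
    \]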
The Chebyshev inequality is useful when we have a bound on the variance of X, but in our case we don't. Chernoff bounds are the most useful for sums of bounded independent random variables. Feige's bound is quite different. He showed that Pr[X < n+1] >= 1/13. In other words, with constant probability X exceeds its expectation by less than an additive 1. Note that the Markov inequality would only give a bound of 1/(n+1), which is very far from the truth in this case.
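To get a feel for the phenomenon, here is a small Monte Carlo sketch in Python (my own sanity check, not anything from Feige's paper). Each Xi is taken to be M with probability 1/M and 0 otherwise, so it has mean 1 but variance roughly M; the function name and the particular choices of n and M are purely illustrative.

    import random

    def empirical_lower_tail(n, M, trials=100_000):
        """Estimate Pr[X < n + 1] where X is a sum of n independent
        mean-1 variables, each equal to M with probability 1/M and 0 otherwise."""
        hits = 0
        for _ in range(trials):
            x = sum(M if random.random() < 1.0 / M else 0 for _ in range(n))
            if x < n + 1:
                hits += 1
        return hits / trials

    n = 100
    for M in (10, 100, 1000):
        p = empirical_lower_tail(n, M)
        print(f"n={n}, M={M}: Pr[X < n+1] ~ {p:.3f}  (Markov only guarantees {1.0/(n+1):.3f})")

Even as M grows (and the variance with it), the empirical probability stays a healthy constant, far above the 1/(n+1) that Markov alone guarantees.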
Feige uses the above bound in an application to estimating the average degree of a graph, but it is not the most compelling application (as he himself states). He anticipates more interesting applications in the future.
An Open Problem:
The bound Feige proves is 1/13, but the worst case example he knows of gives only 1/e, where e is the base of the natural logarithm. It is the obvious one: each Xi takes the value (n+1) with probability 1/(n+1) and 0 otherwise. Then X < n+1 exactly when all the Xi are 0, which happens with probability (1 - 1/(n+1))^n, and this tends to 1/e. Feige conjectures that 1/e is the right bound. The problem is all yours.
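The convergence to 1/e is easy to check; here is a tiny Python computation (mine, purely illustrative) of (1 - 1/(n+1))^n for growing n.

    import math

    # Pr[X < n+1] in the conjectured worst-case example:
    # each Xi is n+1 with probability 1/(n+1) and 0 otherwise,
    # so X < n+1 exactly when every Xi is 0.
    for n in (10, 100, 1000, 10_000):
        p = (1.0 - 1.0 / (n + 1)) ** n
        print(f"n={n}: Pr[X < n+1] = {p:.4f}")
    print(f"1/e = {1.0 / math.e:.4f}")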