Probabilistic Method

A powerful tool, especially for attacking existence problems

By Vamshi Jandhyala in mathematics

March 14, 2020

$$ \newcommand\EE{\mathbb E} \newcommand\PP{\mathbb P} $$

Existence Proofs Using the Probabilistic Method

The typical two-step strategy of the probabilistic method in the context of existence problems is as follows:

  1. Define a random object from an appropriate probability space.
  2. Show that the random object satisfies the desired properties with probability greater than $0$.

Together, these two steps prove existence.
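To make the two steps concrete, here is a minimal Python sketch (the choice of $K_5$ and the helper names are illustrative, not part of any canonical statement): color each edge of $K_5$ red or blue uniformly at random and check for monochromatic triangles. A positive empirical frequency mirrors step 2: the property holds with probability greater than $0$, so a triangle-free $2$-coloring of $K_5$ must exist.

```python
import itertools
import random

def random_coloring(n):
    """Color each edge of K_n red (0) or blue (1) independently at random."""
    return {e: random.randint(0, 1) for e in itertools.combinations(range(n), 2)}

def has_mono_triangle(col, n):
    """Return True if some triangle of K_n has all three edges the same color."""
    return any(col[(a, b)] == col[(a, c)] == col[(b, c)]
               for a, b, c in itertools.combinations(range(n), 3))

random.seed(0)
n, trials = 5, 10_000
good = sum(not has_mono_triangle(random_coloring(n), n) for _ in range(trials))
print(f"P(no monochromatic triangle in K_{n}) ~ {good / trials:.3f}")
# The estimate is visibly positive, so a 2-coloring of K_5 with no
# monochromatic triangle exists (e.g. the pentagon/pentagram coloring).
```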

Other ways of establishing existence through a nonzero probability are given below.

Union Bound

Consider several events $A_1, A_2, \dots, A_k$. If

$\PP(A_1) + \PP(A_2) + \dots + \PP(A_k) < 1$,

then there is a nonzero probability that none of the events occur.
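A classical application is the Ramsey number lower bound: in a uniformly random $2$-coloring of the edges of $K_n$, the event that a fixed $k$-subset of vertices spans a monochromatic $K_k$ has probability $2^{1-\binom{k}{2}}$, and there are $\binom{n}{k}$ such events. Whenever $\binom{n}{k}\,2^{1-\binom{k}{2}} < 1$, some coloring avoids them all, so $R(k,k) > n$. A short Python sketch of the computation (function name and parameters are illustrative):

```python
from math import comb

def ramsey_union_bound(n, k):
    """Union bound for a monochromatic K_k in a random 2-coloring of K_n:
    C(n, k) candidate vertex sets, each monochromatic with probability
    2^(1 - C(k, 2))."""
    return comb(n, k) * 2 ** (1 - comb(k, 2))

k = 5
n = k
while ramsey_union_bound(n + 1, k) < 1:
    n += 1
print(f"union bound < 1 for n = {n}, hence R({k},{k}) > {n}")
```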

Markov’s Inequality

Let $X$ be a random variable taking only nonnegative values.

Suppose $\EE[X] = c$. Then, for any $r > 0$, $\PP(X \ge rc) \le \frac{1}{r}$.

The inequality is also sometimes called Chebyshev’s inequality.
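A quick empirical sanity check in Python (the $\mathrm{Exponential}(1)$ distribution is an arbitrary illustrative choice):

```python
import random

random.seed(0)
# X ~ Exponential(1), so E[X] = c = 1.
samples = [random.expovariate(1.0) for _ in range(100_000)]
c = sum(samples) / len(samples)
for r in (2, 5, 10):
    tail = sum(x >= r * c for x in samples) / len(samples)
    print(f"P(X >= {r}c) = {tail:.4f}   Markov bound 1/r = {1 / r:.4f}")
```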

Expectation Lemma

Another very useful strategy is to use the following lemma known as the Expectation lemma.

Let $X$ be a random variable.

Then there is some point in the probability space where $X \geq \EE[X]$, and also some point in the probability space where $X \leq \EE[X]$.
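A classical application is max-cut: in a graph with $m$ edges, place each vertex on one of two sides independently at random; each edge crosses the partition with probability $\frac{1}{2}$, so the expected cut size is $\frac{m}{2}$, and the lemma yields a cut of size at least $\frac{m}{2}$. A Python sketch (the example graph is an arbitrary illustrative choice):

```python
import random

def cut_size(edges, side):
    """Number of edges with endpoints on different sides of the partition."""
    return sum(side[u] != side[v] for u, v in edges)

# Small example graph on 5 vertices, chosen only for illustration.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4), (0, 4)]
n, m = 5, len(edges)

random.seed(1)
while True:
    side = [random.randint(0, 1) for _ in range(n)]  # random bipartition
    size = cut_size(edges, side)
    if size >= m / 2:  # the expectation lemma guarantees this can occur
        print(f"found a cut of size {size} >= m/2 = {m / 2}: sides = {side}")
        break
```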

Lovász Local Lemma (LLL)

The Lovász Local Lemma (abbreviated LLL) is in some sense a refinement of the union bound idea: if the events in question are mostly independent of one another, then the probability that no event occurs is still nonzero.

We present below the symmetric version of the Local Lemma. An asymmetric version also exists (see Wikipedia).

Consider several events, each occurring with probability at most $p$, and such that each event is independent of all the others except at most $d$ of them.

Then, if $ep(d+1) \le 1$ (where $e$ is the base of the natural logarithm), the probability that no event occurs is positive.

Note that we don’t use the number of events, only the number of dependencies.

As the name implies, the Local Lemma is useful in situations where, in a random construction, each event depends on only a few others.
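A sketch of how the condition is checked in the classical hypergraph $2$-coloring application (helper names are illustrative): under a uniform random $2$-coloring, a $k$-uniform edge is monochromatic with probability $p = 2^{1-k}$, so a proper coloring exists whenever each edge meets at most $d$ others with $ep(d+1) \le 1$.

```python
from math import e

def lll_condition(p, d):
    """Symmetric LLL hypothesis: e * p * (d + 1) <= 1."""
    return e * p * (d + 1) <= 1

# k-uniform hypergraph 2-coloring: under a uniform random coloring, an
# edge is monochromatic with probability p = 2^(1 - k).  The LLL shows a
# proper 2-coloring exists whenever each edge meets at most d others,
# with d roughly up to 2^(k - 1) / e - 1.
for k in (3, 5, 10):
    p = 2 ** (1 - k)
    d_max = int(1 / (e * p)) - 1
    print(f"k = {k}: p = {p}, condition holds up to d = {d_max}:",
          lll_condition(p, d_max))
```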

Linearity of Expectation

The crux result of this section is the following theorem: given any random variables $X_1, X_2, \dots, X_n$, we always have

$\EE[X_1 + X_2 + \dots + X_n] = \EE[X_1] + \EE[X_2] + \dots + \EE[X_n]$.

This is obvious if $X_1, X_2, \dots, X_n$ are independent of each other; the wonderful thing is that it holds even if the variables are not independent. In other words, linearity of expectation lets us focus only on small, local components when computing an expected value, without having to worry about how the components depend on each other.
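A classic illustration: the number of fixed points of a uniformly random permutation of $n$ elements is a sum of $n$ indicator variables, each with expectation $\frac{1}{n}$; the indicators are dependent, yet linearity gives an expected value of exactly $1$. A quick Python simulation (parameters are illustrative):

```python
import random

random.seed(2)
n, trials = 10, 100_000
total = 0
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)                          # uniform random permutation
    total += sum(perm[i] == i for i in range(n))  # count fixed points
print(f"average number of fixed points ~ {total / trials:.3f} (exact: 1)")
```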
