# BasicProbability


#### Basic probability

Example of a random experiment:

Flip a fair coin three times.

Eight possible outcomes: HHH, HHT, HTH, HTT, THH, THT, TTH, TTT

Each outcome has equal probability (1/8).

Let X be the number of heads.

Then the expectation of X is (3+2+2+1+2+1+1+0)/8 = 3/2.
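The computation above can be checked by enumerating the outcomes directly; a minimal sketch in Python:

```python
from itertools import product

# Enumerate all 2^3 equally likely outcomes of three fair coin flips.
outcomes = ["".join(flips) for flips in product("HT", repeat=3)]
assert len(outcomes) == 8  # HHH, HHT, HTH, HTT, THH, THT, TTH, TTT

# X = number of heads; each outcome has probability 1/8, so E[X] is
# just the average head count over the eight outcomes.
expectation = sum(o.count("H") for o in outcomes) / len(outcomes)
print(expectation)  # 1.5
```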

Abstract definitions:

A random experiment is defined by a set of possible outcomes, each with a probability. The probabilities of the outcomes must sum to 1.

A random variable X takes on a value in each outcome of the experiment.

If the random variable is numeric, its expectation E[X] is what we expect the average value of the random variable to tend to if we repeat the experiment many times. Formally, the expectation is the sum, over all outcomes, of the probability of the outcome times the value that the variable takes for that outcome. (Note that the probability of an outcome is the frequency with which we expect the outcome to occur if we repeat the experiment over and over.)
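The formal definition translates directly into code. A sketch of a generic expectation function (the `(probability, value)` pair representation is just one convenient choice):

```python
def expectation(outcomes):
    """E[X] = sum over outcomes of P(outcome) * X(outcome).

    `outcomes` is a list of (probability, value) pairs whose
    probabilities sum to 1.
    """
    return sum(p * x for p, x in outcomes)

# The coin-flip example: each of the 8 outcomes has probability 1/8,
# and X takes the values 3, 2, 2, 1, 2, 1, 1, 0 (the head counts).
print(expectation([(1/8, x) for x in [3, 2, 2, 1, 2, 1, 1, 0]]))  # 1.5
```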

Linearity of expectation: The expectation of a sum of random variables is the sum of their individual expectations.

For example, in the coin-flipping experiment, let X and Y be the number of heads and tails, respectively. Since X+Y=3 in all outcomes, we know E[X+Y]=3. Linearity of expectation implies E[X]+E[Y]=3. And since E[X]=E[Y] (by symmetry), we can conclude E[X]=3/2.
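A quick numeric check of this argument, computing E[X], E[Y], and E[X+Y] from the outcome list:

```python
from itertools import product

outcomes = ["".join(f) for f in product("HT", repeat=3)]
p = 1 / len(outcomes)  # each of the 8 outcomes has probability 1/8

E_X = sum(p * o.count("H") for o in outcomes)  # expected heads
E_Y = sum(p * o.count("T") for o in outcomes)  # expected tails
E_sum = sum(p * (o.count("H") + o.count("T")) for o in outcomes)

# Linearity: E[X + Y] = E[X] + E[Y]; here both sides equal 3.
print(E_X, E_Y, E_sum)  # 1.5 1.5 3.0
```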

Formally, an event is a subset of the possible outcomes (those outcomes in which the event occurs). The probability of the event is the sum of the probabilities of the outcomes in which the event occurs. For example, the event "the first flip comes up heads" corresponds to the subset {HHH, HHT, HTH, HTT}, and has probability 4/8 = 1/2.
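Representing the event as a subset, as in the example above:

```python
from itertools import product

outcomes = ["".join(f) for f in product("HT", repeat=3)]

# The event "the first flip comes up heads" is the subset of outcomes
# in which it occurs; its probability is the sum of their probabilities.
event = {o for o in outcomes if o[0] == "H"}
print(sorted(event))               # ['HHH', 'HHT', 'HTH', 'HTT']
print(len(event) / len(outcomes))  # 0.5
```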

Two events are independent if the probability that both happen is the product of their individual probabilities. Intuitively, knowing that one event happens does not change your estimate of the probability that the other happens.
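For instance, in the coin-flip experiment, "the first flip is heads" and "the second flip is heads" are independent. A sketch of the check (events are represented here as predicates on outcomes):

```python
from itertools import product

outcomes = ["".join(f) for f in product("HT", repeat=3)]
p = 1 / len(outcomes)

def prob(event):
    """Probability of an event, given as a predicate on outcomes."""
    return sum(p for o in outcomes if event(o))

A = lambda o: o[0] == "H"  # first flip is heads
B = lambda o: o[1] == "H"  # second flip is heads

# Independence: P(A and B) == P(A) * P(B), i.e. 1/4 == 1/2 * 1/2.
print(prob(lambda o: A(o) and B(o)) == prob(A) * prob(B))  # True
```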

Exercise: In a random permutation of 1..N, what is the expected number of fixed points? (Here each permutation represents an outcome, each outcome has probability 1/N!, and a "fixed point" is an item i that occurs in position i in the permutation.)
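A quick simulation can be used to check an answer to the exercise; this sketch estimates the expected number of fixed points empirically (the trial count is an arbitrary choice):

```python
import random

def avg_fixed_points(n, trials=100_000):
    """Estimate the expected number of fixed points in a random
    permutation of 1..n by averaging over many random trials."""
    total = 0
    for _ in range(trials):
        perm = list(range(1, n + 1))
        random.shuffle(perm)  # uniform random permutation
        # Count positions i (1-indexed) where item i sits in position i.
        total += sum(1 for i, v in enumerate(perm, start=1) if v == i)
    return total / trials

print(avg_fixed_points(10))  # compare with your exact answer
```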

References:
• Lenstra ([pdf] or [postscript]) - basic background
• Appendix B of Approximation Algorithms by Vazirani
• ProbabilisticMethod?
