Random Variable

Given an experiment with sample space $S$, a random variable (r.v.) is a function from the sample space $S$ to the real numbers $\mathbb{R}$. It is common, but not required, to denote random variables by capital letters.

Thus, a random variable $X$ assigns a numerical value $X(s)$ to each possible outcome $s$ of the experiment. The mapping itself is deterministic; the randomness comes from the experiment that produces the outcome $s$.

For example, consider an experiment where we toss a fair coin twice. The sample space consists of four possible outcomes, $S = \{HH, HT, TH, TT\}$. Then we can define $X$ to be the number of Heads, so that $X(HH) = 2$, $X(HT) = X(TH) = 1$, and $X(TT) = 0$.
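
As a concrete illustration (a minimal Python sketch of my own, not from the source), we can enumerate the sample space and apply the mapping explicitly:

```python
# Enumerate the sample space of two fair coin tosses and apply the
# random variable X = number of Heads to each outcome.
sample_space = ["HH", "HT", "TH", "TT"]

def X(outcome):
    """Deterministic mapping from an outcome s to the number X(s)."""
    return outcome.count("H")

for s in sample_space:
    print(s, "->", X(s))   # HH -> 2, HT -> 1, TH -> 1, TT -> 0
```

Note that `X` itself involves no randomness; it is just a function on outcomes.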

Discrete Random Variable

A random variable $X$ is said to be discrete if there is a finite list of values $a_1, a_2, \dots, a_n$ or an infinite list of values $a_1, a_2, \dots$ such that $P(X = a_j \text{ for some } j) = 1$. If $X$ is a discrete r.v., then the finite or countably infinite set of values $x$ such that $P(X = x) > 0$ is called the support of $X$.

In contrast, a continuous r.v. can take on any real value in an interval (possibly even the entire real line).

Info

It is also possible to have an r.v. that is a hybrid of discrete and continuous, such as by flipping a coin and then generating a discrete r.v. if the coin lands Heads and generating a continuous r.v. if the coin lands Tails.

Indicator Random Variable

The indicator random variable of an event $A$ is the r.v. which equals $1$ if $A$ occurs and $0$ otherwise. We will denote the indicator r.v. of $A$ by $I_A$ or $I(A)$.
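
For instance, in the two-coin experiment above, the indicator of the event $A$ = "at least one Heads" can be written as a function on outcomes (a small sketch; the event is chosen just for illustration):

```python
# Indicator r.v. of A = "at least one Heads" in two fair coin tosses:
# I_A(s) = 1 if the outcome s belongs to A, and 0 otherwise.
def I_A(outcome):
    return 1 if "H" in outcome else 0

for s in ["HH", "HT", "TH", "TT"]:
    print(s, "->", I_A(s))   # HH -> 1, HT -> 1, TH -> 1, TT -> 0
```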

Probability Mass Function

The probability mass function (PMF) of a discrete r.v. $X$ is the function $p_X$ given by $p_X(x) = P(X = x)$. Note that this is positive if $x$ is in the support of $X$, and $0$ otherwise.

Tip

In writing $P(X = x)$, we are using $X = x$ to denote an event, consisting of all outcomes $s$ to which $X$ assigns the number $x$. That is, $\{X = x\}$ is shorthand for the event $\{s \in S : X(s) = x\}$.

Let $X$ be a discrete r.v. with support $x_1, x_2, \dots$. The PMF $p_X$ of $X$ must satisfy the following two criteria (verified numerically in the sketch after this list):

  • Nonnegative: $p_X(x) > 0$ if $x = x_j$ for some $j$, and $p_X(x) = 0$ otherwise;
  • Sums to $1$: $\sum_{j} p_X(x_j) = 1$.
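
As a quick numerical check (a minimal sketch, reusing the two-coin example from above), we can tabulate the PMF of $X$ = number of Heads and confirm both criteria:

```python
from collections import Counter
from fractions import Fraction

# Two fair coin tosses: each of the four outcomes has probability 1/4.
sample_space = ["HH", "HT", "TH", "TT"]
p_outcome = Fraction(1, 4)

# PMF of X = number of Heads: p_X(x) = P(X = x).
counts = Counter(s.count("H") for s in sample_space)
pmf = {x: c * p_outcome for x, c in counts.items()}

print(pmf)                                 # P(X=0) = 1/4, P(X=1) = 1/2, P(X=2) = 1/4
assert all(p > 0 for p in pmf.values())    # nonnegative (positive on the support)
assert sum(pmf.values()) == 1              # sums to 1
```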

Cumulative Distribution Function

The cumulative distribution function (CDF) of an r.v. $X$ is the function $F_X$ given by $F_X(x) = P(X \le x)$. When there is no risk of ambiguity, we sometimes drop the subscript and just write $F$ for a CDF.

Any CDF has the following properties:

  1. Increasing: if $x_1 \le x_2$, then $F(x_1) \le F(x_2)$.
  2. Right-continuous: wherever there is a jump, the CDF is continuous from the right. That is, for any $a$, we have $F(a) = \lim_{x \to a^+} F(x)$.
  3. Convergence to $0$ and $1$ in the limits: $\lim_{x \to -\infty} F(x) = 0$ and $\lim_{x \to +\infty} F(x) = 1$.
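
Continuing the two-coin example (a sketch of my own), the CDF of $X$ = number of Heads is a right-continuous step function that starts at $0$ and climbs to $1$:

```python
from fractions import Fraction

# PMF of X = number of Heads in two fair coin tosses.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def F(x):
    """CDF of a discrete r.v.: F(x) = P(X <= x), a right-continuous step function."""
    return sum(p for value, p in pmf.items() if value <= x)

for x in [-1, 0, 0.5, 1, 1.5, 2, 3]:
    print(x, F(x))   # 0 below the support, jumps to 1/4 at 0, 3/4 at 1, and 1 at 2
```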

Tip

We often describe the distribution of a discrete r.v. by its PMF; a continuous r.v. satisfies $P(X = x) = 0$ for every $x$, so its distribution is instead described by its CDF.

Functions of Random Variables

For an experiment with sample space $S$, an r.v. $X$, and a function $g: \mathbb{R} \to \mathbb{R}$, $g(X)$ is the r.v. that maps $s$ to $g(X(s))$ for all $s \in S$.

Let $X$ be a discrete r.v. and $g: \mathbb{R} \to \mathbb{R}$. Then the PMF of $g(X)$ is

$$P(g(X) = y) = \sum_{x:\, g(x) = y} P(X = x).$$

As for the continuous case: if $g$ is strictly increasing, the CDF of $g(X)$ can be obtained from the CDF of $X$,

$$F_{g(X)}(y) = P(g(X) \le y) = P(X \le g^{-1}(y)) = F_X(g^{-1}(y)).$$
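
A small sketch of the discrete formula (reusing the two-coin PMF from above; the choice $g(x) = (x - 1)^2$ is arbitrary), showing how the probabilities of all $x$-values that map to the same $y$ get summed:

```python
from collections import defaultdict
from fractions import Fraction

# PMF of X = number of Heads in two fair coin tosses.
pmf_X = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# g(x) = (x - 1)**2: the values x = 0 and x = 2 both map to y = 1.
g = lambda x: (x - 1) ** 2

pmf_gX = defaultdict(Fraction)
for x, p in pmf_X.items():
    pmf_gX[g(x)] += p   # P(g(X) = y) = sum of P(X = x) over {x : g(x) = y}

print(dict(pmf_gX))     # P(g(X)=1) = 1/4 + 1/4 = 1/2, P(g(X)=0) = 1/2
```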

Example

Let $X$ be a continuous r.v. with CDF $F$, where $F$ is continuous and strictly increasing. Then $U = F(X) \sim \mathrm{Unif}(0, 1)$, the uniform distribution on $(0, 1)$. We can prove this by converting the CDF of $X$ to the CDF of $F(X)$: for $u \in (0, 1)$,

$$P(F(X) \le u) = P(X \le F^{-1}(u)) = F(F^{-1}(u)) = u,$$

which is exactly the $\mathrm{Unif}(0, 1)$ CDF on $(0, 1)$. Here $F^{-1}$ exists since $F$ is assumed continuous and strictly increasing. See the universality of the uniform.
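
A quick simulation sketch of this fact (my own illustration; the choice of the Exponential(1) distribution, whose CDF $F(x) = 1 - e^{-x}$ is continuous and strictly increasing, is just an example): applying $F$ to draws of $X$ should produce values that look uniform on $(0, 1)$.

```python
import math
import random

random.seed(0)

# X ~ Exponential(1), with CDF F(x) = 1 - exp(-x) (continuous, strictly increasing).
F = lambda x: 1 - math.exp(-x)

# Draw X many times and apply its own CDF; U = F(X) should be Unif(0, 1).
us = [F(random.expovariate(1.0)) for _ in range(100_000)]

# Crude check: the empirical CDF of U at a few points should be close to u itself.
for u in [0.1, 0.25, 0.5, 0.75, 0.9]:
    print(u, round(sum(v <= u for v in us) / len(us), 3))
```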

Independence of Random Variables

Random variables $X$ and $Y$ are said to be independent if

$$P(X \le x, Y \le y) = P(X \le x)\,P(Y \le y)$$

for all $x, y \in \mathbb{R}$. In the discrete case, this is equivalent to the condition

$$P(X = x, Y = y) = P(X = x)\,P(Y = y)$$

for all $x, y$ with $x$ in the support of $X$ and $y$ in the support of $Y$.

Random variables $X_1, \dots, X_n$ are independent if

$$P(X_1 \le x_1, \dots, X_n \le x_n) = P(X_1 \le x_1) \cdots P(X_n \le x_n)$$

for all $x_1, \dots, x_n \in \mathbb{R}$. For infinitely many r.v.s, we say that they are independent if every finite subset of the r.v.s is independent.
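
As an illustration (a sketch of my own, not from the source), for two discrete r.v.s with a known joint PMF we can check the factorization condition directly; the joint PMF below, for two independent fair $\{0, 1\}$-valued flips, is just an example:

```python
from fractions import Fraction
from itertools import product

# Joint PMF of (X, Y): two independent fair {0, 1}-valued coin flips.
joint = {(x, y): Fraction(1, 4) for x, y in product([0, 1], repeat=2)}

# Marginal PMFs of X and Y.
p_X = {x: sum(p for (a, _), p in joint.items() if a == x) for x in [0, 1]}
p_Y = {y: sum(p for (_, b), p in joint.items() if b == y) for y in [0, 1]}

# Independence (discrete case): P(X = x, Y = y) = P(X = x) P(Y = y) for all x, y.
print(all(joint[(x, y)] == p_X[x] * p_Y[y] for x, y in joint))   # True
```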

Tip

Note that this criterion is different from the one for independence of events. But in fact, if $X_1, \dots, X_n$ are independent, then they are also pairwise independent, i.e., $X_i$ is independent of $X_j$ for $i \ne j$. The idea behind proving that $X_i$ and $X_j$ are independent is to let all the $x_k$ other than $x_i, x_j$ go to $\infty$ in the definition of independence, since we already know $X_k < \infty$ is true. But pairwise independence does not imply independence in general.
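
The standard counterexample for the last sentence (a sketch I am adding, not taken from the source): let $X_1, X_2$ be independent fair $\{0,1\}$ coin flips and $X_3 = X_1 \oplus X_2$ (XOR). Each pair is independent, but the three together are not.

```python
from fractions import Fraction
from itertools import product

# X1, X2: independent fair {0, 1} flips; X3 = X1 XOR X2.
# Joint PMF of (X1, X2, X3): only the four consistent triples have probability 1/4.
joint = {(x1, x2, x1 ^ x2): Fraction(1, 4) for x1, x2 in product([0, 1], repeat=2)}

def P(**conds):
    """P(Xi = value for each condition given), e.g. P(x1=0, x3=1)."""
    idx = {"x1": 0, "x2": 1, "x3": 2}
    return sum(p for triple, p in joint.items()
               if all(triple[idx[k]] == v for k, v in conds.items()))

# Every pair is independent: e.g. P(X1=0, X3=0) = 1/4 = P(X1=0) P(X3=0).
print(P(x1=0, x3=0) == P(x1=0) * P(x3=0))                 # True

# But the three are not independent: P(X1=0, X2=0, X3=1) = 0, not 1/8.
print(P(x1=0, x2=0, x3=1), P(x1=0) * P(x2=0) * P(x3=1))   # 0 vs 1/8
```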

If $X$ and $Y$ are independent r.v.s, then any function of $X$ is independent of any function of $Y$.

Independent and Identically Distributed (i.i.d.)

We often work with random variables that are independent and have the same distribution. We call such r.v.s independent and identically distributed, or i.i.d. for short.

If some r.v.s are i.i.d., then by independence they provide no information about each other, and since they are identically distributed they all have the same PMF and CDF.
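
For instance (a minimal sketch of my own), drawing i.i.d. samples just means sampling repeatedly from one fixed distribution, with each draw unaffected by the others:

```python
import random

random.seed(1)

# Five i.i.d. Bernoulli(1/2) draws: independent calls to the same distribution.
samples = [random.randint(0, 1) for _ in range(5)]
print(samples)   # each draw has the same PMF and carries no information about the others
```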