Definitions and Properties for Random Variables
Definitions
- A random variable is a process for choosing a random number.
- A discrete random variable is defined by its probability distribution function:

Outcome | Probability
--------|------------
x1      | p1
x2      | p2
⋮       | ⋮
xn      | pn
The probabilities of a discrete random variable must sum to 1: p1 + p2 + ... + pn = 1.
- A continuous random variable is defined by a probability density function p(x) with these properties: p(x) ≥ 0, and the total area between the x-axis and the curve is 1: ∫ p(x) dx = 1, the integral taken over all x.
- The expected value E(x) of a discrete random variable is defined as E(x) = x1p1 + x2p2 + ... + xnpn.
- The expected value E(x) of a continuous random variable is defined as E(x) = ∫ x p(x) dx.
- The variance Var(x) of a random variable is defined as Var(x) = E((x - E(x))²). (The first sketch following these definitions computes E(x) and Var(x) for a small example.)
- Two random variables x and y are independent if E(xy) = E(x)E(y). (Strictly, this condition defines uncorrelated variables; independence is stronger, but this product rule is all the variance property below requires.)
- The standard deviation of a random variable
is defined by σx = √Var(x).
- The term standard error is used instead of standard deviation when referring to
the sample mean.
- If x is a normal random variable with mean μ (center) and variance σ² (so the spread is σ), we write in symbols: x ~ N(μ, σ²).
- The sample variance of observations x1, x2, ..., xn is defined as
sx² = ((x1 - x̄)² + ... + (xn - x̄)²) / (n - 1),
where x̄ is the sample mean.
- If x1, x2, ..., xn are observations from a random sample, the sample standard deviation sx is defined as the square root of the sample variance:
sx = √( ((x1 - x̄)² + ... + (xn - x̄)²) / (n - 1) ).
- The sample covariance of paired observations (x1, y1), ..., (xn, yn) is defined as
sxy = ((x1 - x̄)(y1 - ȳ) + ... + (xn - x̄)(yn - ȳ)) / (n - 1).
(See the second sketch following these definitions.)
- A random vector is a column vector of random variables, for example v = (x1, x2, ..., xn)ᵀ.
- The expected value E(v) of a random vector is the vector of expected values of its components: if v = (x1, ..., xn)ᵀ, then E(v) = (E(x1), ..., E(xn))ᵀ.
- The covariance matrix Cov(v) of a random vector is the matrix of variances and covariances of its components: if v = (x1, ..., xn)ᵀ, the ijth entry of Cov(v) is Cov(xi, xj), so the diagonal holds the variances Var(xi). (See the third sketch below.)
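To make the expected-value and variance definitions concrete, here is a minimal Python sketch for a small discrete random variable; the outcomes and probabilities are invented for the example.

```python
# Sketch: E(x) and Var(x) for a discrete random variable.
# Outcomes and probabilities are invented for illustration.
outcomes = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]

assert abs(sum(probs) - 1.0) < 1e-12       # probabilities must sum to 1

# E(x) = x1 p1 + ... + xn pn
ex = sum(x * p for x, p in zip(outcomes, probs))

# Var(x) = E((x - E(x))²)
var = sum((x - ex) ** 2 * p for x, p in zip(outcomes, probs))

# Shortcut listed under Properties: Var(x) = E(x²) - E(x)²
ex2 = sum(x * x * p for x, p in zip(outcomes, probs))
assert abs(var - (ex2 - ex * ex)) < 1e-12

print(ex, var)                             # 3.0 1.0
```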
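Next, the three sample statistics (sample variance, sample standard deviation, sample covariance), all with the n - 1 denominator. The data are again invented, and statistics.covariance requires Python 3.10 or later.

```python
import math
import statistics

# Invented paired observations.
xs = [2.0, 4.0, 6.0, 8.0]
ys = [1.0, 3.0, 2.0, 6.0]
n = len(xs)

xbar = sum(xs) / n                                   # sample mean of x
ybar = sum(ys) / n                                   # sample mean of y

# sx² = ((x1 - x̄)² + ... + (xn - x̄)²) / (n - 1)
sx2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)
sx = math.sqrt(sx2)                                  # sample standard deviation

# sxy = ((x1 - x̄)(y1 - ȳ) + ... + (xn - x̄)(yn - ȳ)) / (n - 1)
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / (n - 1)

# The standard library computes the same quantities.
assert math.isclose(sx2, statistics.variance(xs))
assert math.isclose(sx, statistics.stdev(xs))
assert math.isclose(sxy, statistics.covariance(xs, ys))
```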
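Finally, a sketch of E(v) and Cov(v) for a random vector, estimated from simulated data with numpy; the distribution and sample size are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 3))      # each row is one observation of a 3-component v

mean_vec = data.mean(axis=0)           # estimates E(v) componentwise
cov_mat = np.cov(data, rowvar=False)   # sample covariance matrix, n - 1 denominator

# Diagonal entries are the sample variances of the components.
assert np.allclose(np.diag(cov_mat), data.var(axis=0, ddof=1))
print(mean_vec)
print(cov_mat)
```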
Properties
For the following properties, c is a constant; x and y are random variables.
- E(x + y) = E(x) + E(y).
- E(cx) = c E(x).
- Var(x) = E(x²) - E(x)².
- If x and y are independent, then Var(x + y) = Var(x) + Var(y).
- Var(x + c) = Var(x).
- Var(cx) = c² Var(x).
- Cov(x + c, y) = Cov(x, y).
- Cov(cx, y) = c Cov(x, y).
- Cov(x, y + c) = Cov(x, y).
- Cov(x, cy) = c Cov(x, y).
- If x1, x2, ..., xn are independent and N(μ, σ²), then E(x̄) = μ, where x̄ is the sample mean. We say that x̄ is unbiased for μ.
- If x1, x2, ..., xn are independent and N(μ, σ²), then E(s²) = σ². We say that s² is unbiased for σ². (Both unbiasedness facts are checked by simulation in the sketch after this list.)
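A small simulation check of the last two properties, with invented parameters μ = 5 and σ = 2 (so σ² = 4). Note that it is s², not s itself, that is unbiased for σ².

```python
import random
import statistics

# Average the sample mean and sample variance over many samples of size n;
# the averages should land near mu and sigma² respectively.
random.seed(0)
mu, sigma, n, trials = 5.0, 2.0, 10, 20000

mean_of_xbars = 0.0
mean_of_s2s = 0.0
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    mean_of_xbars += statistics.mean(sample) / trials
    mean_of_s2s += statistics.variance(sample) / trials  # n - 1 denominator

print(mean_of_xbars)  # close to 5: x̄ is unbiased for mu
print(mean_of_s2s)    # close to 4: s² is unbiased for sigma²
```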
For the remaining properties, w and v are random vectors; b is a constant vector; A is a constant matrix.
- E(v + w) = E(v) + E(w).
- E(b) = b.
- E(Av) = A E(v).
- Cov(v + b) = Cov(v).
- Cov(Av) = A Cov(v) Aᵀ. (This last identity is checked numerically below.)
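A numerical check of that last identity, with an invented 2×3 constant matrix A and simulated draws of a 3-component random vector v:

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])            # invented 2x3 constant matrix

v_draws = rng.normal(size=(200000, 3))     # rows are draws of v
Av_draws = v_draws @ A.T                   # corresponding draws of Av

cov_v = np.cov(v_draws, rowvar=False)
cov_Av = np.cov(Av_draws, rowvar=False)

# Cov(Av) and A Cov(v) Aᵀ agree up to simulation error.
print(np.max(np.abs(cov_Av - A @ cov_v @ A.T)))  # small, shrinks as draws grow
```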