
Definitions and Properties for Random Variables

Definitions

  1. A random variable is a process for choosing a random number: a rule that assigns a numerical value to each outcome of a random experiment.
     
  2. A discrete random variable is defined by its probability distribution function:

     Outcome      Probability
     x1           p1
     x2           p2
     ...          ...
     xn           pn

     The probabilities of a discrete random variable must sum to 1:

     p1 + p2 + ... + pn = 1

  3. A continuous random variable is defined by a probability density function p(x), with these properties: p(x) ≥ 0 and the area between the x-axis and the curve is 1:

     ∫ p(x) dx = 1, where the integral is taken over the whole x-axis (from -∞ to ∞).

     
  4. The expected value E(x) of a discrete variable is defined as:

     E(x) = x1 p1 + x2 p2 + ... + xn pn

     
  5. The expected value E(x) of a continuous variable is defined as:

     E(x) = ∫ x p(x) dx, with the integral again taken from -∞ to ∞.

     
  6. The variance Var(x) of a random variable is defined as Var(x) = E((x - E(x))2), the expected squared deviation of x from its mean.
     
  7. Two random variables x and y are independent if E(xy) = E(x)E(y). (Strictly speaking, this condition defines uncorrelated variables; independence implies it, and it is all that the properties below require.)
     
  8. The standard deviation of a random variable is defined by σx = √Var(x).
     
  9. The term standard error is used instead of standard deviation when referring to the sample mean.
     

  10. If x is a normal random variable with parameters μ (the mean, or center) and σ2 (the variance; σ measures the spread), we write in symbols: x ~ N(μ, σ2).
     
  11. The sample variance of x1, x2, ..., xn is defined as

     s2 = ((x1 - x̄)2 + (x2 - x̄)2 + ... + (xn - x̄)2) / (n - 1),

     where x̄ = (x1 + x2 + ... + xn) / n is the sample mean.

  12. If x1, x2, ..., xn are observations from a random sample, the sample standard deviation s is defined as the square root of the sample variance: s = √s2.

  13. The sample covariance of paired observations (x1, y1), (x2, y2), ..., (xn, yn) is defined as

     sxy = ((x1 - x̄)(y1 - ȳ) + ... + (xn - x̄)(yn - ȳ)) / (n - 1).

     (These sample statistics are computed in the sketch after this list of definitions.)
     
  14. A random vector is a column vector of random variables. For example:

     v = (x1 x2 ... xn)T, where each component xi is a random variable.

  15. The expected value of a random vector E(v) is defined as the vector of expected values of its components.
     If v = (x1 ... xn)T, then E(v) = (E(x1) ... E(xn))T.

  16. The covariance matrix Cov(v) of a random vector is the matrix of variances and covariances of its components.
     If v = (x1 ... xn)T, the ijth component of Cov(v) is sij = Cov(xi, xj); the diagonal entries sii are the variances Var(xi).
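
As a concrete illustration of definitions 2, 4, 6, and 11 to 13, here is a minimal Python sketch. The outcomes, probabilities, and sample values are made-up examples, not part of the definitions:

    import math

    # Discrete random variable: outcomes with their probabilities (definition 2).
    outcomes = [1.0, 2.0, 3.0]
    probs    = [0.2, 0.5, 0.3]
    assert abs(sum(probs) - 1.0) < 1e-12                 # probabilities sum to 1

    # Expected value E(x) = x1 p1 + ... + xn pn (definition 4).
    E_x = sum(x * p for x, p in zip(outcomes, probs))    # 2.1

    # Variance Var(x) = E((x - E(x))2) (definition 6).
    Var_x = sum((x - E_x) ** 2 * p for x, p in zip(outcomes, probs))  # 0.49

    # Sample variance, standard deviation, and covariance (definitions 11 to 13),
    # using the n - 1 divisor that makes s2 unbiased (see property 12 below).
    xs = [2.1, 1.9, 2.4, 2.0]
    ys = [1.0, 0.8, 1.3, 0.9]
    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    s2  = sum((x - x_bar) ** 2 for x in xs) / (n - 1)
    s   = math.sqrt(s2)
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / (n - 1)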

 

Properties

For properties 1 to 10, c is a constant; x and y are random variables.

  1. E(x + y) = E(x) + E(y).
     
  2. E(cx) = c E(x).
     
  3. Var(x) = E(x2) - E(x)2
     
  4. If x and y are independent, then Var(x + y) = Var(x) + Var(y).
     
  5. Var(x + c) = Var(x)
     
  6. Var(cx) = c2 Var(x)
     
  7. Cov(x + c, y) = Cov(x, y)
     
  8. Cov(cx, y) = c Cov(x, y)
     
  9. Cov(x, y + c) = Cov(x, y)
     
  10. Cov(x, cy) = c Cov(x, y)
     
  11. If x1, x2, ..., xn are independent and N(μ, σ2), then E(x̄) = μ, where x̄ is the sample mean. We say that x̄ is unbiased for μ.
     
  12. If x1, x2, ..., xn are independent and N(μ, σ2), then E(s2) = σ2.   We say that s2 is unbiased for σ2. (Both unbiasedness properties are checked by simulation in the sketch after this list.)
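
Properties 4, 6, 11, and 12 can be checked numerically. In the following Python sketch, the seed, sample sizes, and the values of mu and sigma are arbitrary illustrative choices:

    import random
    import statistics

    random.seed(0)
    mu, sigma, n, trials = 5.0, 2.0, 10, 20000

    # Properties 11 and 12: the sample mean is unbiased for mu and the
    # sample variance (n - 1 divisor) is unbiased for sigma2.
    means, variances = [], []
    for _ in range(trials):
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        means.append(statistics.mean(xs))
        variances.append(statistics.variance(xs))
    print(statistics.mean(means))      # close to mu = 5.0
    print(statistics.mean(variances))  # close to sigma2 = 4.0

    # Property 4: Var(x + y) = Var(x) + Var(y) for independent x and y.
    # Property 6: Var(cx) = c2 Var(x).
    c = 3.0
    xs = [random.gauss(0, 1) for _ in range(trials)]
    ys = [random.gauss(0, 2) for _ in range(trials)]
    print(statistics.variance([x + y for x, y in zip(xs, ys)]))  # close to 1 + 4 = 5
    print(statistics.variance([c * x for x in xs]))              # close to 9 * 1 = 9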

For the following properties, w and v are random vectors; b is a constant vector; A is a constant matrix.

  1. E(v + w) = E(v) + E(w)
     
  2. E(b) = b
     
  3. E(Av) = A E(v)
     
  4. Cov(v + b) = Cov(v)
     
  5. Cov(Av) = A Cov(v) AT
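
The last two rules can be verified with a short numpy sketch; the matrix A and the mean and covariance used to generate v are arbitrary examples (numpy is assumed to be available):

    import numpy as np

    rng = np.random.default_rng(0)
    A = np.array([[1.0, 2.0],
                  [0.0, 3.0]])

    # Draw many samples of a 2-component random vector v.
    v = rng.multivariate_normal(mean=[1.0, -2.0],
                                cov=[[2.0, 0.5], [0.5, 1.0]],
                                size=100000)  # shape (100000, 2)
    Av = v @ A.T                              # each row is A applied to one sample

    print(A @ v.mean(axis=0), Av.mean(axis=0))  # E(Av) = A E(v)
    print(A @ np.cov(v, rowvar=False) @ A.T)    # A Cov(v) AT ...
    print(np.cov(Av, rowvar=False))             # ... matches Cov(Av)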