Engineering Analysis/Expectation and Entropy

From Wikibooks, open books for an open world


The expectation operator, applied to a random variable X with density f_X, is defined as:

E[x] = \int_{-\infty}^\infty x f_X(x)dx

This operator is very useful, and we can use it to derive the moments of the random variable.
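As a sketch, the defining integral can be approximated numerically on a grid. The exponential density below is an illustrative assumption (any density would do); its exact mean is 1/λ, which gives us something to check against.

```python
import numpy as np

# Approximate E[x] = ∫ x f_X(x) dx with a Riemann sum on a grid.
# Illustrative density: exponential, f_X(x) = lam * exp(-lam * x) for x >= 0,
# whose exact mean is 1 / lam.
lam = 2.0
x = np.linspace(0.0, 50.0, 200_000)   # truncate the infinite upper limit
dx = x[1] - x[0]
f = lam * np.exp(-lam * x)

mean = np.sum(x * f) * dx
print(mean)  # ≈ 0.5
```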


A moment is a value that contains some information about the random variable. The n-th moment of a random variable is defined as:

E[x^n] = \int_{-\infty}^\infty x^n f_X(x)dx
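Higher moments come from the same numerical recipe. For the exponential density (again an illustrative assumption), the n-th moment has the closed form E[x^n] = n!/λ^n, so the approximation can be checked:

```python
import numpy as np
from math import factorial

# Approximate E[x^n] = ∫ x^n f_X(x) dx for an exponential density with
# rate lam; the exact value for this density is n! / lam**n.
lam = 1.0
x = np.linspace(0.0, 60.0, 400_000)
dx = x[1] - x[0]
f = lam * np.exp(-lam * x)

moments = {n: np.sum(x**n * f) * dx for n in (1, 2, 3)}
for n, approx in moments.items():
    print(n, approx, factorial(n) / lam**n)
```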


The mean value, or "average value," of a random variable is defined as the first moment of the random variable:

E[x] = \mu_X = \int_{-\infty}^\infty x f_X(x)dx

We will use the Greek letter μ to denote the mean of a random variable.

Central Moments

A central moment is similar to a moment, but it is also dependent on the mean of the random variable:

E[(x - \mu_X)^n] = \int_{-\infty}^\infty (x - \mu_X)^n f_X(x)dx

The first central moment is always zero, since E[x - μ_X] = E[x] - μ_X = 0.
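This can be confirmed numerically on a grid; the Gaussian density below is an illustrative assumption:

```python
import numpy as np

# Check E[(x - mu_X)] = 0 numerically for a Gaussian density on a grid
# (the particular density is an illustrative choice).
mu_true, sigma = 1.5, 0.7
x = np.linspace(mu_true - 10 * sigma, mu_true + 10 * sigma, 200_000)
dx = x[1] - x[0]
f = np.exp(-0.5 * ((x - mu_true) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

mu = np.sum(x * f) * dx                    # first moment: the mean
first_central = np.sum((x - mu) * f) * dx  # first central moment
print(first_central)  # ≈ 0
```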


The variance of a random variable is defined as the second central moment:

E[(x - \mu_X)^2] = \sigma^2

The square root of the variance, σ, is known as the standard deviation of the random variable.

Mean and Variance

The mean and variance of a random variable are related by:

\sigma^2 = E[x^2] - \mu_X^2

This is an important identity, and we will use it later.
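The identity σ² = E[x²] − μ² can be verified numerically by computing both sides on a grid. The Gaussian density here is an illustrative assumption with a known variance to check against:

```python
import numpy as np

# Verify sigma^2 = E[x^2] - mu_X^2 on a grid, using a Gaussian density
# with known variance (illustrative choice).
mu_true, sigma_true = -0.5, 1.2
x = np.linspace(mu_true - 12 * sigma_true, mu_true + 12 * sigma_true, 400_000)
dx = x[1] - x[0]
f = np.exp(-0.5 * ((x - mu_true) / sigma_true) ** 2) / (sigma_true * np.sqrt(2 * np.pi))

mu = np.sum(x * f) * dx
second_moment = np.sum(x**2 * f) * dx
variance = np.sum((x - mu) ** 2 * f) * dx  # second central moment

print(variance, second_moment - mu**2)     # both ≈ sigma_true**2 = 1.44
```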


Entropy

The entropy of a random variable X is defined as:

H[X] = E\left[ \log \frac{1}{p(X)} \right] = -\sum_x p(x) \log p(x)

where p is the probability mass function of X.
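For a discrete distribution the expectation is a finite sum, so entropy is easy to compute directly. Using log base 2 (a common convention, measuring entropy in bits), a fair coin has exactly 1 bit of entropy, and a biased coin has less:

```python
import numpy as np

# Entropy of a discrete random variable:
# H[X] = E[log2(1/p(X))] = -sum_i p_i * log2(p_i), measured in bits.
# Assumes every p_i > 0 (log2(0) is undefined).
def entropy(p):
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log2(p))

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))  # biased coin: ≈ 0.469 bits
```

Note that the more predictable (biased) coin carries less entropy: outcomes we can guess well convey less information.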