# Engineering Analysis/Expectation and Entropy

## Expectation

The expectation operator of a random variable is defined as:

$E[x]=\int _{-\infty }^{\infty }xf_{X}(x)dx$

This operator is very useful, and we can use it to derive the moments of the random variable.
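As a sketch, the integral can be approximated numerically. Here we assume a uniform density on [0, 1] (so $f_X(x)=1$ on the interval and 0 elsewhere) purely as an illustrative choice:

```python
# Approximate E[x] = integral of x * f_X(x) dx for an assumed uniform
# density on [0, 1], using a midpoint Riemann sum.
n = 100_000
dx = 1.0 / n
# Midpoint of each subinterval, times f_X(x) = 1, times the width dx.
expectation = sum((i + 0.5) * dx * 1.0 * dx for i in range(n))
print(round(expectation, 4))  # 0.5
```

As expected, the result matches the exact mean of a uniform [0, 1] variable, 1/2.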

## Moments

A moment is a value that contains some information about the random variable. The n-th moment of a random variable is defined as:

$E[x^{n}]=\int _{-\infty }^{\infty }x^{n}f_{X}(x)dx$

### Mean

The mean value, or "average value," of a random variable is defined as the first moment of the random variable:

$E[x]=\mu _{X}=\int _{-\infty }^{\infty }xf_{X}(x)dx$

We will use the Greek letter μ to denote the mean of a random variable.
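For a discrete random variable, the integral becomes a sum over the outcomes. As a small sketch, assuming a fair six-sided die as the example distribution, the mean is the first moment:

```python
from fractions import Fraction

# For a fair six-sided die (an assumed discrete example), the n-th moment
# is E[x^n] = sum of x^n * P(x) over outcomes; the mean is the first moment.
outcomes = range(1, 7)
p = Fraction(1, 6)          # each face is equally likely
mean = sum(x * p for x in outcomes)
print(mean)  # 7/2
```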

## Central Moments

A central moment is similar to a moment, but it is also dependent on the mean of the random variable:

$E[(x-\mu _{X})^{n}]=\int _{-\infty }^{\infty }(x-\mu _{X})^{n}f_{X}(x)dx$

The first central moment is always zero, since $E[x-\mu _{X}]=E[x]-\mu _{X}=0$.
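We can check this cancellation numerically. Again assuming a fair die as the example distribution (the same cancellation holds for any density):

```python
from fractions import Fraction

# Check that the first central moment E[(x - mu)] is exactly zero
# for a fair six-sided die (an assumed discrete example).
outcomes = range(1, 7)
p = Fraction(1, 6)
mu = sum(x * p for x in outcomes)                   # mean = 7/2
first_central = sum((x - mu) * p for x in outcomes)
print(first_central)  # 0
```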

### Variance

The variance of a random variable is defined as the second central moment:

$E[(x-\mu _{X})^{2}]=\sigma ^{2}$

The square root of the variance, σ, is known as the standard deviation of the random variable.

### Mean and Variance

The mean and variance of a random variable are related by:

$\sigma ^{2}=E[x^{2}]-\mu ^{2}$

This is an important relationship, and we will use it later.
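As a quick sanity check, we can verify this identity exactly on an assumed example distribution, a fair six-sided die, using exact rational arithmetic:

```python
from fractions import Fraction

# Verify sigma^2 = E[x^2] - mu^2 for a fair six-sided die
# (an assumed discrete example), computed two ways.
outcomes = range(1, 7)
p = Fraction(1, 6)
mu = sum(x * p for x in outcomes)                    # 7/2
second_moment = sum(x * x * p for x in outcomes)     # 91/6
variance = sum((x - mu) ** 2 * p for x in outcomes)  # second central moment
print(variance)                                      # 35/12
print(variance == second_moment - mu ** 2)           # True
```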

## Entropy

The entropy of a random variable $X$ is defined as:

$H[X]=E\left[\log {\frac {1}{p(X)}}\right]$
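For a discrete random variable this expectation becomes $H[X]=-\sum _{x}p(x)\log p(x)$. As a sketch, assuming a fair coin as the example distribution and a base-2 logarithm (so the answer is in bits):

```python
from math import log2

# Entropy of a discrete random variable: H[X] = -sum of p * log2(p).
# A fair coin (an assumed example) yields exactly 1 bit of entropy.
probs = [0.5, 0.5]
entropy = -sum(p * log2(p) for p in probs)
print(entropy)  # 1.0
```

Using the natural logarithm instead would give the entropy in nats rather than bits.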