Statistics/Distributions/Continuous


A continuous random variable is a random variable for which no individual value carries a positive probability; that is, P(X = x) = 0 for every point x. Probabilities are instead assigned to intervals of values.

General Properties

Cumulative Distribution Function

A continuous random variable, like a discrete random variable, has a cumulative distribution function (cdf), F(x) = P(X \le x). The cdf is non-decreasing and tends to 1 as x grows; it may reach 1 at a finite value (when the support is bounded above), or only in the limit. Unlike the step-function cdf of a discrete variable, the cdf of a continuous variable is itself continuous. The cdf is written with a capital F.
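
As a minimal sketch of this behaviour (using the standard exponential distribution, whose cdf is F(x) = 1 - \exp(-x), purely as an example; exponential_cdf is a helper written for this illustration), the following Python snippet evaluates the cdf at increasing points and shows it approaching 1 without reaching it at any finite value:

```python
import math

def exponential_cdf(x, rate=1.0):
    """Cdf of the exponential distribution: F(x) = 1 - exp(-rate*x) for x >= 0."""
    if x < 0:
        return 0.0
    return 1.0 - math.exp(-rate * x)

# The cdf is non-decreasing and approaches 1 only in the limit:
for x in [0, 1, 5, 10, 50]:
    print(f"F({x}) = {exponential_cdf(x):.12f}")
```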

Probability Density Function

Unlike a discrete random variable, a continuous random variable has a probability density function (pdf) rather than a probability mass function. The difference is that the density must integrate to 1 over the support, while the mass function must sum to 1; otherwise the two play very similar roles. Wherever the cdf is differentiable, the pdf is its derivative, f(x) = F'(x). The pdf is written with a lowercase f.
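
To see the "integrates to 1" condition concretely, here is a small sketch (assuming SciPy is available; normal_pdf is a helper written for this example) that integrates the standard normal density over the whole real line:

```python
import math
from scipy.integrate import quad  # assumes SciPy is installed

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# Numerically confirm that the density integrates to 1 over its support:
total, _err = quad(normal_pdf, -math.inf, math.inf)
print(total)  # ~1.0
```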

Special Values

Let R be the support of the distribution, that is, the set of values on which it places probability.

The expected value of a continuous random variable X with probability density function f is defined as E[X] = \int_{R} x f(x)\,dx.

More generally, for a function g, the expected value of the transformed variable g(X), where X has probability density function f, is defined as E[g(X)] = \int_{R} g(x) f(x)\,dx.
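
As an illustrative sketch of both definitions (using a hypothetical Exponential(2) distribution as the example and assuming SciPy is available), the integrals can be evaluated numerically:

```python
import math
from scipy.integrate import quad  # assumes SciPy is installed

RATE = 2.0  # hypothetical rate parameter for an Exponential(2) example

def pdf(x):
    """Density of the exponential distribution with the chosen rate."""
    return RATE * math.exp(-RATE * x)

# E[X] = integral of x * f(x) over the support [0, inf):
mean, _ = quad(lambda x: x * pdf(x), 0, math.inf)

# E[g(X)] for g(x) = x**2, via the same rule with g(x) in place of x:
second_moment, _ = quad(lambda x: x**2 * pdf(x), 0, math.inf)

print(mean)           # ~0.5, analytically 1/RATE
print(second_moment)  # ~0.5, analytically 2/RATE**2
```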

The mean of a continuous or discrete distribution is defined as E[X].

The variance of a continuous or discrete distribution is defined as \operatorname{Var}(X) = E[(X - E[X])^2], which expands to E[X^2] - (E[X])^2.
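
Continuing the hypothetical Exponential(2) sketch above, the variance can be computed either directly from this definition or from the two moments already obtained; both routes agree:

```python
# Continuing the Exponential(2) sketch above (mean and second_moment as computed there):
variance_direct, _ = quad(lambda x: (x - mean)**2 * pdf(x), 0, math.inf)
variance_from_moments = second_moment - mean**2

print(variance_direct)        # ~0.25, analytically 1/RATE**2
print(variance_from_moments)  # same value, via E[X^2] - (E[X])^2
```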

Expectations can also be derived from the moment generating function of the distribution, M_X(t) = E[\exp(tX)]. Once the moment generating function is known, its derivatives evaluated at t = 0 give the raw moments of the distribution:

\frac{d M_X(t)}{dt}\bigg|_{t=0} = E[X], the mean
\frac{d^2 M_X(t)}{dt^2}\bigg|_{t=0} = E[X^2], from which the variance is E[X^2] - (E[X])^2
\frac{d^3 M_X(t)}{dt^3}\bigg|_{t=0} = E[X^3], the third raw moment, used in computing skewness
\frac{d^4 M_X(t)}{dt^4}\bigg|_{t=0} = E[X^4], the fourth raw moment, used in computing kurtosis
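
As a sketch of this procedure (using the exponential distribution, whose moment generating function has the standard closed form \lambda/(\lambda - t) for t < \lambda, and assuming SymPy is available), differentiating at t = 0 recovers the moments:

```python
import sympy as sp  # assumes SymPy is installed

t, lam = sp.symbols('t lam', positive=True)

# Closed-form mgf of the exponential distribution (valid for t < lam):
mgf = lam / (lam - t)

# Derivatives evaluated at t = 0 recover the raw moments:
first = sp.diff(mgf, t).subs(t, 0)      # E[X]   = 1/lam
second = sp.diff(mgf, t, 2).subs(t, 0)  # E[X^2] = 2/lam**2

variance = sp.simplify(second - first**2)  # E[X^2] - (E[X])^2 = 1/lam**2
print(first, second, variance)
```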