Signals and Systems/Probability Basics

From Wikibooks, open books for an open world

Probability[edit]

This section of the Signals and Systems book will discuss probability, random signals, and noise. This book will not, however, attempt to teach the basics of probability, because there are dozens of resources (both on the internet at large, and on the Wikipedia mathematics bookshelf) for probability and statistics. This book will assume a basic knowledge of probability, and will work to explain random phenomena in the context of an Electrical Engineering book on signals.

Random Variable[edit]

A random variable is a quantity whose value is not fixed but instead depends on chance. Typically the value of a random variable consists of a fixed part and a random component due to uncertainty or disturbance. Other random variables take their values directly as the outcome of a random experiment.

Random variables are usually denoted with a capital letter. For instance, a generic random variable that we will use often is X. The capital letter represents the random variable itself and the corresponding lower-case letter (in this case "x") will be used to denote the observed value of X. x is one particular value of the process X.
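As a short illustrative sketch (the die example and the `roll` function name are our own, not from the text), the distinction between the random variable X and an observed value x can be seen in code: the process that generates values plays the role of X, and each call yields one particular x.

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# X: the random variable "roll of a fair six-sided die" (the process itself).
def roll():
    return random.randint(1, 6)

# x: one particular observed value of X.
x = roll()
print(x)  # some integer between 1 and 6
```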

Mean[edit]

The mean, or more precisely the expected value, of a random variable is its central value: the average of the observed values in the long run. We denote the mean of a signal x as μx. We will discuss the precise definition of the mean in the next chapter.
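The "average in the long run" idea can be demonstrated numerically. The sketch below (a fair-die example of our own choosing) averages many observed values and converges toward the true mean of 3.5:

```python
import random

random.seed(1)

# Estimate the mean of a fair die by long-run averaging.
# The true expected value is (1+2+3+4+5+6)/6 = 3.5.
n = 100_000
samples = [random.randint(1, 6) for _ in range(n)]
mu_x = sum(samples) / n
print(mu_x)  # close to 3.5
```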

Standard Deviation[edit]

The standard deviation of a signal x, denoted by the symbol σx serves as a measure of how much deviation from the mean the signal demonstrates. For instance, if the standard deviation is small, most values of x are close to the mean. If the standard deviation is large, the values are more spread out.

The standard deviation is an easy concept to understand, but in practice it's not a quantity that is easy to compute directly, nor is it useful in calculations. However, the standard deviation is related to a more useful quantity, the variance.

Variance[edit]

The variance is the square of the standard deviation and is primarily of theoretical importance. We denote the variance of a signal x as σx2. We will discuss the variance and how it is calculated in the next chapter.
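The relationship between the variance and the standard deviation can be checked empirically. The following sketch (using Gaussian samples with a standard deviation of 2, an illustrative choice of ours) estimates the variance as the mean squared deviation from the mean, then recovers the standard deviation as its square root:

```python
import random

random.seed(2)

n = 100_000
# Gaussian samples with mean 0 and standard deviation 2 (illustrative).
samples = [random.gauss(0.0, 2.0) for _ in range(n)]

mu = sum(samples) / n
# Variance sigma_x^2: mean squared deviation from the mean.
var = sum((s - mu) ** 2 for s in samples) / n
# Standard deviation sigma_x: the square root of the variance.
sigma = var ** 0.5
print(var, sigma)  # close to 4.0 and 2.0
```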

Probability Function[edit]

The probability function P gives the probability that a certain event will occur. It can be calculated from the probability density function and the cumulative distribution function, described below.

We can use the P operator in a variety of ways:

P[\mbox{A coin is heads}] = \frac{1}{2}
P[\mbox{A die shows a 3}] = \frac{1}{6}
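These two probabilities can also be estimated by simulation, as relative frequencies over many trials (a Monte Carlo sketch of our own, not part of the text):

```python
import random

random.seed(3)

n = 100_000

# P[coin is heads]: fraction of trials where a fair coin lands heads.
heads = sum(1 for _ in range(n) if random.random() < 0.5)
p_heads = heads / n  # close to 1/2

# P[die shows a 3]: fraction of fair-die rolls equal to 3.
threes = sum(1 for _ in range(n) if random.randint(1, 6) == 3)
p_three = threes / n  # close to 1/6
print(p_heads, p_three)
```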

Probability Density Function[edit]

The Probability Density Function (PDF) of a random variable is a description of the distribution of the values of the random variable. By integrating this function over a particular range, we can find the probability that the random variable takes on a value in that interval. The integral of this function over all possible values is 1.
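Both properties above can be verified numerically. The sketch below uses the standard normal PDF (our illustrative choice) and the trapezoid rule: integrating over [-1, 1] gives the probability that the variable falls in that interval, and integrating over a wide range approximates the total integral of 1.

```python
import math

# Standard normal PDF (an illustrative choice, not from the text).
def f(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

# Trapezoid-rule integral of f over [a, b].
def integrate(a, b, steps):
    h = (b - a) / steps
    return sum((f(a + i * h) + f(a + (i + 1) * h)) * h / 2
               for i in range(steps))

# P[-1 <= X <= 1]: integral of the PDF over that interval.
p = integrate(-1.0, 1.0, 10_000)   # close to 0.6827

# Integral over (effectively) all values is 1.
total = integrate(-10.0, 10.0, 20_000)
print(p, total)
```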

We denote the density function of a signal x as fx. For a discrete random variable, the probability that the event xi occurs is given by:

P[x_i] = f_x(x_i)
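For a discrete variable, this density (more precisely, a probability mass function) can be written as a simple table. A minimal sketch, again using the fair-die example:

```python
# f_x for a fair die: each outcome x_i = 1..6 has probability 1/6.
f_x = {x_i: 1 / 6 for x_i in range(1, 7)}

# P[x_i] = f_x(x_i)
p = f_x[3]  # probability the die shows a 3
print(p)

# The probabilities over all possible values sum to 1.
total = sum(f_x.values())
```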

Cumulative Distribution Function[edit]

The Cumulative Distribution Function (CDF) of a random variable describes the probability of observing a value at or below a certain threshold. A CDF is nondecreasing, with a value of zero at negative infinity and a value of 1 at positive infinity.

We denote a CDF with a capital F; the CDF of a signal x is written Fx.

We can say that the probability of an event occurring less than or equal to xi is defined in terms of the CDF as:

P[x \le x_i] = F_x(x_i)

Likewise, we can define the probability that an event occurs that is greater than xi as:

P[x > x_i] = 1 - F_x(x_i)

Or, for a discrete random variable, the probability that an event occurs that is greater than or equal to xi:

P[x \ge x_i] = 1 - F_x(x_i) + f_x(x_i)
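All three identities can be checked on a discrete example. The sketch below builds the CDF of a fair die from its mass function (the `F_x` helper is our own) and evaluates each probability at xi = 3:

```python
# Mass function of a fair die (illustrative example).
f_x = {x_i: 1 / 6 for x_i in range(1, 7)}

# CDF: F_x(xi) = P[x <= xi], summing the mass at or below xi.
def F_x(xi):
    return sum(p for x, p in f_x.items() if x <= xi)

p_le_3 = F_x(3)               # P[x <= 3] = 3/6 = 1/2
p_gt_3 = 1 - F_x(3)           # P[x >  3] = 1/2
p_ge_3 = 1 - F_x(3) + f_x[3]  # P[x >= 3] = 4/6 = 2/3
print(p_le_3, p_gt_3, p_ge_3)
```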

Relation with PDF[edit]

The CDF and PDF are related to one another by a simple integral relation:

F_x(x) = \int_{-\infty}^x f_x(\tau)d\tau
f_x(x) = \frac{d}{dx}F_x(x)
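This derivative relation can be verified numerically. The sketch below uses the standard normal distribution (our illustrative choice), whose CDF has a closed form via `math.erf`, and compares a central-difference derivative of Fx against fx at a few points:

```python
import math

# Standard normal PDF (illustrative choice).
def f(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

# Standard normal CDF, written in closed form using the error function.
def F(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Check f(x) ~ dF/dx using a central difference.
h = 1e-5
for x in (-1.0, 0.0, 0.7):
    deriv = (F(x + h) - F(x - h)) / (2 * h)
    assert abs(deriv - f(x)) < 1e-8
```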

Terminology[edit]

Several book sources refer to the CDF as the "Probability Distribution Function", with the acronym PDF. To avoid the ambiguity of having both the distribution function and the density function share the same acronym, some books refer to the density function as "pdf" (lower case) and the distribution function as "PDF" (upper case). To avoid this ambiguity, this book will refer to the distribution function as the CDF, and the density function as the PDF.