# Statistics/Probability

Probability deals with unpredictability: we know which outcomes may occur, but not exactly which one will. The set of possible outcomes plays a basic role. We call it the *sample space* and denote it by S; elements of S are called *outcomes*. In rolling a die, the sample space is S = {1, 2, 3, 4, 5, 6}.

We speak not only of outcomes but also of *events*: sets of outcomes (subsets of the sample space). For example, in rolling a die we can ask whether the outcome was an even number, which amounts to asking about the event "even" = E = {2, 4, 6}.

In simple situations with a finite number of outcomes, we assign to each outcome s ∈ S its *probability* (of occurrence) p(s) (written with a small p), a number between 0 and 1. This quite simple function, called the probability function, has only one further property: all the probabilities sum to 1. For an event A we also speak of its probability P(A) (written with a capital P), which is simply the sum of the probabilities of the outcomes in A. For a fair die, p(s) = 1/6 for each outcome s, and P("even") = P(E) = 1/6 + 1/6 + 1/6 = 1/2.
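These definitions translate directly into Python; here is a minimal sketch of the fair-die example above, using exact fractions so the probabilities sum to 1 without rounding error:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}              # sample space for a fair die
p = {s: Fraction(1, 6) for s in S}  # probability function p(s)

assert sum(p.values()) == 1         # the probabilities must sum to 1

def P(event):
    """P(A): the sum of p(s) over the outcomes s in the event A."""
    return sum(p[s] for s in event)

E = {2, 4, 6}    # the event "even"
print(P(E))      # 1/2
```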

The general concept of probability for non-finite sample spaces is a little more complex, although it rests on the same ideas.

## Introduction

### Why have probability in a statistics textbook?

Very little in mathematics is truly self-contained. Many branches of mathematics touch and interact with one another, and the fields of probability and statistics are no different. A basic understanding of probability is vital for grasping basic statistics, and probability is largely abstract without statistics to determine "real world" probabilities.

This section is not meant to give a comprehensive lecture on probability, but simply to touch on the basics needed here, including the basics of Bayesian analysis for those students looking for something a little more interesting. This knowledge will be invaluable for understanding the mathematics involved in the various distributions that come later.

### Set notation

A set is a collection of objects. We usually use capital letters to denote sets, e.g. A is the set of females in this room.

- The members of a set A are called the elements of A, e.g. Patricia is an element of A (Patricia ∈ A); Patrick is not an element of A (Patrick ∉ A).
- The universal set, U, is the set of all objects under consideration, e.g., U is the set of all people in this room.
- The null set or empty set, ∅, has no elements, e.g., the set of males above 2.8m tall in this room is an empty set.
- The complement A^{c} of a set A is the set of elements in U outside A, i.e. x ∈ A^{c} iff x ∉ A.
- Let A and B be two sets. A is a subset of B if each element of A is also an element of B. We write A ⊂ B, e.g. the set of females wearing metal-frame glasses in this room ⊂ the set of females wearing glasses in this room ⊂ the set of females in this room.

- The intersection A ∩ B of two sets A and B is the set of their common elements, i.e. x ∈ A ∩ B iff x ∈ A and x ∈ B.
- The union A ∪ B of two sets A and B is the set of all elements from A or B, i.e. x ∈ A ∪ B iff x ∈ A or x ∈ B.
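These operations map directly onto Python's built-in set type; a small sketch using the examples above (the universal set and the set B here are hypothetical):

```python
# Hypothetical universal set U: everyone in the room.
U = {"Patricia", "Paula", "Petra", "Patrick", "Peter"}
A = {"Patricia", "Paula", "Petra"}  # e.g. the females in the room
B = {"Paula", "Peter"}              # a second, hypothetical set

print("Patricia" in A)   # membership test: True
print(sorted(U - A))     # complement of A within U: ['Patrick', 'Peter']
print(sorted(A & B))     # intersection A ∩ B: ['Paula']
print(sorted(A | B))     # union A ∪ B (shared elements appear once)
print(A <= U)            # subset test A ⊂ U: True
```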

### Venn diagrams and notation

A Venn diagram visually models defined events. Each event is drawn as a circle; events that have outcomes in common overlap in a region known as the intersection of the events.

## Probability Axioms

For a sample space S and probability function P, three axioms hold:

- P(A) ≥ 0 for every event A;
- P(S) = 1;
- if A_{1}, A_{2}, … are pairwise disjoint events, then P(A_{1} ∪ A_{2} ∪ …) = P(A_{1}) + P(A_{2}) + ….

## Calculating Probability

[edit | edit source]### Negation

Negation is a way of saying "not *A*", i.e. that the complement of A has occurred. *Note: the complement of an event A can be expressed as A′ or A^{c}.*

For example: "What is the probability that a six-sided die will **not** land on a one?" (five out of six, or *p* = 5/6 ≈ 0.833)

Or, more colloquially: the probability of "not X" plus the probability of "X" equals one, or 100%.
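The complement rule can be checked in one line of Python, using the die example above:

```python
# Complement rule: P(not A) = 1 - P(A).
p_one = 1 / 6               # probability a fair die lands on one
p_not_one = 1 - p_one       # probability it does not land on one
print(round(p_not_one, 3))  # 0.833
```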

Relative frequency describes the number of successes over the total number of trials. For example, if a coin is flipped 50 times and 29 of the flips are heads, then the relative frequency of heads is 29/50 = 0.58.
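The coin-flip calculation, as a sketch:

```python
# Relative frequency = successes / total number of trials.
heads = 29
flips = 50
relative_frequency = heads / flips
print(relative_frequency)  # 0.58
```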

The union of two events covers the outcomes in event A OR event B.

This is different from "and": "and" gives the intersection, whereas "or" gives the union of the events (both events put together).

For example, suppose two events are defined by shapes:

- Event A = {STAR, DIAMOND}
- Event B = {TRIANGLE, PENTAGON, STAR}

(A ∩ B) = (A and B) = A intersect B is only the STAR.

(A ∪ B) = (A or B) = A union B is everything: the TRIANGLE, PENTAGON, STAR, and DIAMOND.

Notice that both event A and event B have the STAR in common. However, when you list the union of the events, you list the STAR only one time!

Combining the events outright gives (STAR + DIAMOND) + (TRIANGLE + PENTAGON + STAR). But wait: STAR is listed two times, so one extra STAR must be SUBTRACTED from the list. Notice that it is the INTERSECTION that is listed TWICE, so you subtract the duplicate intersection.

### Conjunction

The conjunction "A and B" corresponds to the intersection A ∩ B. For **independent** events A and B, P(A ∩ B) = P(A) · P(B).

### Disjunction

The disjunction "A or B" corresponds to the union A ∪ B.

**Formula for the union of events: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)**

Example:

Let P(A) = 0.3, P(B) = 0.2, and P(A ∩ B) = 0.15. Find P(A ∪ B).

P(A ∪ B) = 0.3 + 0.2 - 0.15 = 0.35

Example:

Let P(A) = 0.3, P(B) = 0.2, and P(A ∩ B) = 0. Find P(A ∪ B).

Note: since the intersection of the events is the empty set, the events are DISJOINT, or MUTUALLY EXCLUSIVE.

P(A ∪ B) = 0.3 + 0.2 - 0 = 0.5
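Both worked examples can be reproduced with a short helper; exact fractions are used so the answers print cleanly:

```python
from fractions import Fraction

def union_probability(p_a, p_b, p_a_and_b):
    """P(A or B) = P(A) + P(B) - P(A and B)."""
    return p_a + p_b - p_a_and_b

# Example 1: overlapping events, P(A ∩ B) = 0.15.
print(union_probability(Fraction(3, 10), Fraction(2, 10), Fraction(15, 100)))  # 7/20, i.e. 0.35

# Example 2: disjoint (mutually exclusive) events, P(A ∩ B) = 0.
print(union_probability(Fraction(3, 10), Fraction(2, 10), Fraction(0)))        # 1/2, i.e. 0.5
```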

### Law of total probability

The law of total probability is[1] a theorem that, in its discrete case, states: if {B_{n} : n = 1, 2, 3, …} is a finite or countably infinite partition of a sample space (in other words, a set of pairwise disjoint events whose union is the entire sample space) and each event B_{n} is measurable, then for any event A of the same probability space:

P(A) = Σ_{n} P(A ∩ B_{n})

or, alternatively,[1]

P(A) = Σ_{n} P(A | B_{n}) P(B_{n}),

where, for any n for which P(B_{n}) = 0, the corresponding terms are simply omitted from the summation, since P(A ∩ B_{n}) = 0 whenever P(B_{n}) = 0.
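The second form of the law can be sketched numerically. The partition and the conditional probabilities below are hypothetical numbers chosen for illustration:

```python
from fractions import Fraction

# Hypothetical partition B1, B2, B3 of the sample space.
p_B = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]          # P(B_n)
p_A_given_B = [Fraction(1, 10), Fraction(2, 5), Fraction(1, 4)]  # P(A | B_n)

assert sum(p_B) == 1  # a partition's probabilities must sum to 1

# Law of total probability: P(A) = sum over n of P(A | B_n) * P(B_n).
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
print(p_A)  # 11/50, i.e. 0.22
```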

## Conditional Probability

What is the probability of one event given that another event occurs? For example, what is the probability of a mouse finding the end of the maze, given that it finds the room before the end of the maze?

This is represented as P(A | B), read "the probability of *A* given *B*," and is defined by P(A | B) = P(A ∩ B) / P(B) whenever P(B) > 0.

If *A* and *B* are **independent** of one another, such as with coin tosses or child births, then P(A | B) = P(A). Thus, the answer to "what is the probability that the next child a family bears will be a boy, given that the last child is a boy?" is simply the unconditional probability of a boy.

This can also be stacked, giving the probability of *A* with several "givens": P(A | B_{1}, B_{2}, B_{3}), or "the probability of *A* given that B_{1}, B_{2}, and B_{3} are true."
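The definition of conditional probability, and the independence case, can be sketched with the die and child-birth examples from this section:

```python
from fractions import Fraction

def conditional(p_a_and_b, p_b):
    """P(A | B) = P(A and B) / P(B), defined for P(B) > 0."""
    return p_a_and_b / p_b

# Die example: A = "roll a 2", B = "roll is even".
# P(A and B) = 1/6 and P(B) = 1/2, so P(A | B) = 1/3.
print(conditional(Fraction(1, 6), Fraction(1, 2)))  # 1/3

# Independence: P(A | B) = P(A). The sex of the next child does not
# depend on the last child, so P(boy | last was a boy) = P(boy).
p_boy = Fraction(1, 2)
print(conditional(p_boy * p_boy, p_boy))  # 1/2
```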
