Chapter 13. Probability


Conditional Probability

* If A and B are two events of the same sample space S, then the conditional probability of event A, given that event B has occurred, is given by
P ( A | B ) = \frac{P ( A ∩ B )}{P ( B )}, P ( B ) ≠ 0
* If A is an event associated with sample space S, then P ( S | A ) = P ( A | A ) = 1.
* If A and B are two events of the same sample space S, and E is an event of S
such that P ( E ) ≠ 0, then P ( ( A ∪ B ) | E ) = P ( A | E ) + P ( B | E ) − P ( ( A ∩ B ) | E ).
* If A and B are two events of the same sample space S, then P ( A′ | B ) = 1 − P ( A | B ).
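The definition above can be checked directly on a small sample space. A minimal Python sketch, using two fair dice (a hypothetical example, not from the text), computes P ( A | B ) = P ( A ∩ B ) / P ( B ) by counting outcomes:

```python
from fractions import Fraction

# Sample space: all ordered pairs from two fair six-sided dice.
S = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

A = {s for s in S if s[0] + s[1] == 8}   # event A: the sum is 8
B = {s for s in S if s[0] == 3}          # event B: the first die shows 3

def prob(event):
    """Probability of an event under the uniform measure on S."""
    return Fraction(len(event), len(S))

# P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = prob(A & B) / prob(B)
print(p_A_given_B)  # 1/6

# Complement rule: P(A'|B) = 1 − P(A|B)
p_Ac_given_B = prob((set(S) - A) & B) / prob(B)
print(p_Ac_given_B)  # 5/6
```

`Fraction` keeps the arithmetic exact, so the identities hold with no floating-point error.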

Multiplication Rule of Probability

* The simultaneous occurrence of events E and F in sample space S is denoted by E ∩ F or EF.
* The multiplication rule of probability for three events E, F and G in a
sample space: P ( E ∩ F ∩ G ) = P(E) · P ( F | E ) · P ( G | ( E ∩ F ) ) = P(E) · P ( F | E ) · P ( G | EF )
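A standard worked example of the multiplication rule (hypothetical, not from the text) is drawing three aces in a row from a 52-card deck without replacement; each factor is a conditional probability given the previous draws:

```python
from fractions import Fraction

# P(E ∩ F ∩ G) = P(E) · P(F|E) · P(G|E ∩ F):
# probability that three cards drawn without replacement are all aces.
p_E = Fraction(4, 52)           # E: first card is an ace
p_F_given_E = Fraction(3, 51)   # F given E: second card is an ace
p_G_given_EF = Fraction(2, 50)  # G given E ∩ F: third card is an ace

p_all_aces = p_E * p_F_given_E * p_G_given_EF
print(p_all_aces)  # 1/5525
```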

Independent Events

* Events E and F are said to be independent events, if the probability of the occurrence of one event is not affected by the occurrence of the other event.
* Events E and F are said to be independent events, if P( E ∩ F ) = P(E) · P(F)
* If E and F are two independent events associated with some experiment, then:
E and F’ are independent.
E’ and F are independent.
E’ and F’ are independent.
* If E and F are independent events associated with some experiment, then the probability of the occurrence of “at least one of E and F” is given by
P( E ∪ F ) = 1 − P( E’ ) P( F’ ).
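Both the product test for independence and the “at least one” formula can be verified on two fair coin tosses. A short Python sketch (assumed example):

```python
from fractions import Fraction

# Sample space: two independent fair coin tosses.
S = [(a, b) for a in "HT" for b in "HT"]
E = {s for s in S if s[0] == "H"}  # E: first toss is heads
F = {s for s in S if s[1] == "H"}  # F: second toss is heads

def prob(ev):
    return Fraction(len(ev), len(S))

# Independence test: P(E ∩ F) = P(E) · P(F)
print(prob(E & F) == prob(E) * prob(F))  # True

# P(at least one of E, F) = 1 − P(E′) · P(F′)
Ec, Fc = set(S) - E, set(S) - F
print(prob(E | F), 1 - prob(Ec) * prob(Fc))  # 3/4 3/4
```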

Bayes’ Theorem

* The set of events E_{1}, E_{2}, E_{3}, ……….., E_{n} represents a partition of sample space S,
if:
E_{i} ∩ E_{j} = Ø, i ≠ j, i, j = 1, 2, 3, ………, n
E_{1} ∪ E_{2} ∪ E_{3} ∪ …….. ∪ E_{n} = S
P(E_{i}) > 0 ∀ i = 1, 2, 3, ….., n
* Theorem of Total Probability: Let { E_{1}, E_{2}, ……., E_{n} } be a partition of sample
space S, and suppose that each of the events E_{1}, E_{2}, ……., E_{n} has non-zero
probability of occurrence. Let A be any event associated with S. Then
P(A) = P(E_{1}) P ( A | E_{1} ) + P(E_{2}) P ( A | E_{2} ) + ……. + P(E_{n}) P ( A | E_{n} )
= \sum_{j=1}^{n} P(E_{j}) P ( A | E_{j} )
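The theorem is just a weighted sum over the partition. A minimal sketch with hypothetical numbers: urn 1 holds 3 red and 2 black balls, urn 2 holds 1 red and 4 black; an urn is chosen at random and a ball is drawn.

```python
from fractions import Fraction

# Partition: E_1 = urn 1 chosen, E_2 = urn 2 chosen; A = a red ball is drawn.
p_urn = [Fraction(1, 2), Fraction(1, 2)]            # P(E_1), P(E_2)
p_red_given_urn = [Fraction(3, 5), Fraction(1, 5)]  # P(A|E_1), P(A|E_2)

# Total probability: P(A) = Σ_j P(E_j) · P(A|E_j)
p_red = sum(pe * pa for pe, pa in zip(p_urn, p_red_given_urn))
print(p_red)  # 2/5
```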

* Bayes’ Theorem: If E_{1}, E_{2}, ……….., E_{n} are n non-empty events that constitute a
partition of sample space S, i.e. E_{1}, E_{2}, ……….., E_{n} are pair-wise disjoint and E_{1}
∪ E_{2} ∪ ……. ∪ E_{n} = S, and A is any event of non-zero probability, then

P ( E_{i} | A ) = \frac{P ( E_{i} ) P ( A | E_{i} )}{\sum_{j=1}^{n} P ( E_{j} ) P ( A | E_{j} )} \forall i = 1, 2, 3, . . . , n

In Bayes’ Theorem, the events E_{1}, E_{2}, ……….., E_{n} are called hypotheses.
P(E_{i}) is called the a priori probability of the hypothesis E_{i}.
The conditional probability, P ( E_{i} | A ), is called the a posteriori probability of the hypothesis E_{i}.
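Continuing the (hypothetical) two-urn example, Bayes’ Theorem turns the prior probabilities of the hypotheses into posterior probabilities after observing a red ball:

```python
from fractions import Fraction

# Hypotheses: E_1 = urn 1 (3 red, 2 black), E_2 = urn 2 (1 red, 4 black).
priors = [Fraction(1, 2), Fraction(1, 2)]       # P(E_i): a priori probabilities
likelihoods = [Fraction(3, 5), Fraction(1, 5)]  # P(A|E_i), A = red ball drawn

# Denominator is the total probability of A over the whole partition.
evidence = sum(p * l for p, l in zip(priors, likelihoods))

# Bayes: P(E_i|A) = P(E_i) · P(A|E_i) / Σ_j P(E_j) · P(A|E_j)
posteriors = [p * l / evidence for p, l in zip(priors, likelihoods)]
print(posteriors)  # [Fraction(3, 4), Fraction(1, 4)]
```

Observing a red ball raises the probability that urn 1 was chosen from 1/2 to 3/4, and the posteriors sum to 1 as they must.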

Random Variable

A random variable is a real-valued function whose domain is the sample space of a random experiment.

Probability Distribution of a Random Variable

The probability distribution of random variable X is the system of numbers:

X      x_{1}   x_{2}   x_{3}   …   x_{n}
P(X)   p_{1}   p_{2}   p_{3}   …   p_{n}

where
p_{i} > 0, \sum_{i=1}^{n} p_{i} = 1, i = 1, 2, 3, ………, n

Mean of a Random Variable

X or x_{i}      x_{1}   x_{2}   x_{3}   …   x_{n}
P(X) or p_{i}   p_{1}   p_{2}   p_{3}   …   p_{n}

Mean of random variable or Expectation of X = E(X) = μ = \sum_{i=1}^{n} x_{i} p_{i}
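The mean is a probability-weighted sum. A minimal sketch for a hypothetical distribution, X = number of heads in two fair coin tosses:

```python
from fractions import Fraction

# X = number of heads in two fair coin tosses.
xs = [0, 1, 2]
ps = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]

# E(X) = Σ x_i p_i
mean = sum(x * p for x, p in zip(xs, ps))
print(mean)  # 1
```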

Variance of a Random Variable

X or x_{i}      x_{1}   x_{2}   x_{3}   …   x_{n}
P(X) or p_{i}   p_{1}   p_{2}   p_{3}   …   p_{n}

Variance of X = Var(X) = \sigma_{x}^{2} = \sum_{i=1}^{n} (x_{i} - μ)^{2} p( x_{i} )

Var(X) = E(X²) − [ E(X) ]²

where E(X) = \sum_{i=1}^{n} x_{i} p( x_{i} ) and E(X²) = \sum_{i=1}^{n} x_{i}^{2} p( x_{i} )
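Both variance formulas give the same value, which a short sketch can confirm on the same hypothetical distribution (X = number of heads in two fair coin tosses):

```python
from fractions import Fraction

xs = [0, 1, 2]
ps = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]

mu = sum(x * p for x, p in zip(xs, ps))  # E(X)

# Definition: Var(X) = Σ (x_i − μ)² p(x_i)
var_def = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))

# Shortcut: Var(X) = E(X²) − [E(X)]²
var_alt = sum(x * x * p for x, p in zip(xs, ps)) - mu ** 2

print(var_def, var_alt)  # 1/2 1/2
```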

Bernoulli Trials

* Independent trials having only two outcomes, namely success and failure, are called Bernoulli trials.
* The trials of a random experiment are said to be Bernoulli trials, if they satisfy the following conditions.
The number of trials should be finite.
The trials should be independent.
Each trial must have exactly two outcomes, success and failure.
The probability of success remains the same in each trial.

Binomial Distribution

P(X = x) = \binom{n}{x} q^{n-x} p^{x}, x = 0, 1, 2, ….., n, where n is the number of Bernoulli trials, and p and q are the probabilities of success and failure, respectively, in each trial.
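The formula can be evaluated directly with Python’s standard library; a minimal sketch for a hypothetical case, the probability of exactly 2 heads in 4 tosses of a fair coin:

```python
from fractions import Fraction
from math import comb

# Binomial distribution: P(X = x) = C(n, x) · p^x · q^(n−x)
n = 4                 # number of Bernoulli trials
p = Fraction(1, 2)    # probability of success (heads) in each trial
q = 1 - p             # probability of failure
x = 2                 # number of successes sought

p_x = comb(n, x) * p**x * q**(n - x)
print(p_x)  # 3/8
```

`math.comb(n, x)` computes the binomial coefficient \binom{n}{x}; summing P(X = x) over x = 0, …, n gives 1, as a distribution must.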
