PROBABILITY
1. SOME BASIC TERMS AND CONCEPTS
(a) An Experiment: An action or operation resulting in two or more outcomes is called an experiment.
(b) Sample Space : The set of all possible outcomes of an experiment is called the sample space, denoted by S. An element of \(S\) is called a sample point.
(c) Event : Any subset of the sample space is an event.
(d) Simple Event : An event is called a simple event if it is a singleton subset of the sample space \(S\).
(e) Compound Event : A compound event is the joint occurrence of two or more simple events.
(f) Equally Likely Events : A number of simple events are said to be equally likely if there is no reason for one event to occur in preference to any other event.
(g) Exhaustive Events : All the possible outcomes taken together in which an experiment can result are said to be exhaustive.
(h) Mutually Exclusive or Disjoint Events : If two events cannot occur simultaneously, then they are mutually exclusive.
If \(A\) and \(B\) are mutually exclusive, then \(A \cap B=\phi\).
(i) Complement of an Event : The complement of an event \(A\), denoted by \(\overline{A}\), \(A'\) or \(A^{C}\), is the set of all sample points of the sample space other than the sample points in \(A\).
2. MATHEMATICAL DEFINITION OF PROBABILITY
Let the outcomes of an experiment consist of \(n\) exhaustive, mutually exclusive and equally likely cases. Then the sample space \(S\) has \(n\) sample points. If an event \(A\) consists of \(m\) sample points \((0 \leq m \leq n)\), then the probability of event \(A\), denoted by \(P(A)\), is defined to be \(m/n\), i.e. \(P(A)=m/n\) (see the sketch after the properties below).
Let \(S=\{a_{1}, a_{2}, \ldots\ldots, a_{n}\}\) be the sample space
(a) \(P ( S )=\frac{ n }{ n }=1\) corresponding to the certain event.
(b) \(P (\phi)=\frac{0}{ n }=0\) corresponding to the null event \(\phi\) or impossible event.
(c) If \(A_{i}=\left\{a_{i}\right\}, i=1, \ldots\ldots, n\), then \(A_{i}\) is the event corresponding to the single sample point \(a_{i}\). Then \(P\left(A_{i}\right)=\frac{1}{n}\).
(d) \(0 \leq P ( A ) \leq 1\)
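A minimal Python sketch of this counting definition; the die-roll sample space and the event "roll is even" are illustrative choices, not part of the notes above.

```python
from fractions import Fraction

# Illustrative sample space: one roll of a fair die
S = {1, 2, 3, 4, 5, 6}

# Event A: the roll is even (a subset of S)
A = {s for s in S if s % 2 == 0}

# Classical definition: P(A) = m / n = favourable cases / total cases
P_A = Fraction(len(A), len(S))
print(P_A)  # 1/2
```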
3. ODDS AGAINST AND ODDS IN FAVOUR OF AN EVENT :
Let there be \(m+n\) equally likely, mutually exclusive and exhaustive cases, out of which an event \(A\) can occur in \(m\) cases and fail to occur in \(n\) cases. Then, by definition, the probability of occurrence of event \(A=P(A)=\dfrac{m}{m+n}\)
The probability of non-occurrence of event \(A=P\left(A^{\prime}\right)=\dfrac{n}{m+n}\)
\(\therefore P ( A ): P \left( A ^{\prime}\right)= m : n\)
Thus the odds in favour of occurrence of the event \(A\) are defined by \(m:n\), i.e. \(P(A):P\left(A^{\prime}\right)\); and the odds against the occurrence of the event \(A\) are defined by \(n:m\), i.e. \(P\left(A^{\prime}\right):P(A)\).
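A small sketch of the probability-odds conversion; the counts \(m\) and \(n\) are chosen arbitrarily for illustration.

```python
from fractions import Fraction

# Illustrative counts: A occurs in m = 3 of the m + n = 5 cases
m, n = 3, 2

P_A = Fraction(m, m + n)        # probability of occurrence, 3/5
P_not_A = Fraction(n, m + n)    # probability of non-occurrence, 2/5

# Odds in favour of A are m : n; odds against are n : m
print(P_A / P_not_A)  # 3/2, i.e. odds in favour are 3 : 2
```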
4. ADDITION THEOREM
(a) If \(A\) and \(B\) are any events in \(S\), then \(P ( A \cup B )= P ( A )+ P ( B )- P ( A \cap B )\)
Since the probability of an event is a nonnegative number, it follows that
\(P(A \cup B) \leq P(A)+P(B)\)
For three events \(A, B\) and \(C\) in \(S\) we have \(P(A \cup B \cup C)=P(A)+P(B)+P(C)-P(A \cap B)-P(B \cap C)-P(C \cap A)+P(A \cap B \cap C)\)
General form of addition theorem (Principle of Inclusion-Exclusion)
For \(n\) events \(A _{1}, A _{2}, A _{3}, \ldots \ldots A _{ n }\) in \(S\), we have
\(P\left(A_{1} \cup A_{2} \cup A_{3} \cup \ldots\ldots \cup A_{n}\right)\)
\(=\displaystyle \sum_{i=1}^{n} P\left(A_{i}\right)-\sum_{i<j} P\left(A_{i} \cap A_{j}\right)+\sum_{i<j<k} P\left(A_{i} \cap A_{j} \cap A_{k}\right)-\ldots+(-1)^{n-1} P\left(A_{1} \cap A_{2} \cap \ldots\ldots \cap A_{n}\right)\)
(b) If \(A\) and \(B\) are mutually exclusive, then \(P(A \cap B)=0\) so that \(P ( A \cup B )= P ( A )+ P ( B )\)
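A runnable check of inclusion-exclusion on a small case; the die-roll sample space and the three events are illustrative, not from the notes.

```python
from fractions import Fraction
from itertools import combinations

# Illustrative sample space: one roll of a fair die
S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(S))  # classical probability

events = [{1, 2, 3}, {2, 4, 6}, {3, 6}]  # A1, A2, A3

# Inclusion-exclusion: alternate signs over k-wise intersections
total = Fraction(0)
for k in range(1, len(events) + 1):
    for combo in combinations(events, k):
        total += (-1) ** (k - 1) * P(set.intersection(*combo))

# Both sides of the addition theorem agree
assert total == P(set.union(*events))
print(total)  # 5/6
```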
5. CONDITIONAL PROBABILITY :
If \(A\) and \(B\) are any events in \(S\), then the conditional probability of \(B\) relative to \(A\), i.e. the probability of occurrence of \(B\) when \(A\) has occurred, is given by
\(P(B / A)=\dfrac{P(B \cap A)}{P(A)}\), provided \(P(A) \neq 0\).
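A minimal sketch of this defining ratio; the events are chosen purely for illustration.

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}          # illustrative: one roll of a fair die
A = {2, 4, 6}                   # the roll is even
B = {4, 5, 6}                   # the roll exceeds 3
P = lambda E: Fraction(len(E), len(S))

# P(B/A) = P(B ∩ A) / P(A), defined only when P(A) != 0
P_B_given_A = P(B & A) / P(A)
print(P_B_given_A)  # 2/3
```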
6. MULTIPLICATION THEOREM
Independent events :
Two events \(A\) and \(B\) are said to be independent if the happening of one has no effect on the probability of happening of the other.
(a) When events are independent:
\(P ( A / B )= P ( A )\) and \(P ( B / A )= P ( B )\), then
\(P(A \cap B)=P(A) \cdot P(B)\) or \(P(A B)=P(A) \cdot P(B)\)
(b) When events are not independent
The probability of the simultaneous happening of two events \(A\) and \(B\) is equal to the probability of \(A\) multiplied by the conditional probability of \(B\) with respect to \(A\) (or the probability of \(B\) multiplied by the conditional probability of \(A\) with respect to \(B\)), i.e.
\(P(A \cap B)=P(A) \cdot P(B / A)\) or \(P(B) \cdot P(A / B)\) OR
\(P(A B)=P(A) \cdot P(B / A) \text { or } P(B) \cdot P(A / B)\)
(c) Probability of at least one of \(n\) independent events
If \(p_{1}, p_{2}, p_{3}, \ldots\ldots p_{n}\) are the probabilities of \(n\) independent events \(A_{1}, A_{2}, A_{3} \ldots\ldots A_{n}\), then the probability of happening of at least one of these events is
\(1-\left[\left(1- p _{1}\right)\left(1- p _{2}\right) \ldots \ldots\left(1- p _{ n }\right)\right]\)
\(\Rightarrow P\left(A_{1}+A_{2}+A_{3}+\ldots+A_{n}\right)=1-P\left(\overline{A}_{1}\right) P\left(\overline{A}_{2}\right) P\left(\overline{A}_{3}\right) \ldots\ldots P\left(\overline{A}_{n}\right)\)
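A quick numeric sketch of this complement trick; the three probabilities are arbitrary illustrative values.

```python
from math import prod

# Illustrative probabilities of n = 3 independent events
p = [0.5, 0.4, 0.2]

# P(at least one happens) = 1 - (1 - p1)(1 - p2)...(1 - pn)
at_least_one = 1 - prod(1 - pi for pi in p)
print(at_least_one)  # ≈ 0.76
```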
7. TOTAL PROBABILITY THEOREM :
Let \(B_{1}, B_{2}, B_{3}, \ldots\ldots B_{n}\) be \(n\) mutually exclusive & exhaustive events of an experiment and let \(A\) be an event which can occur with any of them; then the total probability of occurrence of event \(A\) is
\(P(A)=P\left(A B_{1}\right)+P\left(A B_{2}\right)+\ldots\ldots+P\left(A B_{n}\right)=\displaystyle \sum_{i=1}^{n} P\left(A B_{i}\right)\)
\(\Rightarrow P(A)=P\left(B_{1}\right) P\left(A / B_{1}\right)+P\left(B_{2}\right) P\left(A / B_{2}\right)+\ldots\ldots+P\left(B_{n}\right) P\left(A / B_{n}\right)=\displaystyle \sum_{i=1}^{n} P\left(B_{i}\right) P\left(A / B_{i}\right)\)
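A minimal sketch of the weighted sum; the priors \(P(B_i)\) and conditionals \(P(A/B_i)\) below are invented for illustration.

```python
# Illustrative data: three exhaustive, mutually exclusive causes B1..B3
P_B = [0.5, 0.3, 0.2]            # P(B1), P(B2), P(B3); they sum to 1
P_A_given_B = [0.1, 0.6, 0.4]    # P(A/B1), P(A/B2), P(A/B3)

# Total probability: P(A) = sum of P(Bi) * P(A/Bi)
P_A = sum(pb * pa for pb, pa in zip(P_B, P_A_given_B))
print(P_A)  # ≈ 0.31
```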
8. BAYES' THEOREM OR INVERSE PROBABILITY :
Let \(A_{1}, A_{2}, \ldots\ldots, A_{n}\) be \(n\) mutually exclusive and exhaustive events of the sample space \(S\) and let \(A\) be an event which can occur with any of these events; then \(P\left(\dfrac{A_{i}}{A}\right)=\dfrac{P\left(A_{i}\right) P\left(A / A_{i}\right)}{\displaystyle \sum_{k=1}^{n} P\left(A_{k}\right) P\left(A / A_{k}\right)}\)
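Continuing the invented numbers from the total-probability sketch above, Bayes' theorem reverses the conditioning.

```python
# Same invented numbers as the total-probability sketch
P_B = [0.5, 0.3, 0.2]            # priors P(Bi)
P_A_given_B = [0.1, 0.6, 0.4]    # likelihoods P(A/Bi)

P_A = sum(pb * pa for pb, pa in zip(P_B, P_A_given_B))

# Bayes: P(Bi/A) = P(Bi) * P(A/Bi) / sum_j P(Bj) * P(A/Bj)
posteriors = [pb * pa / P_A for pb, pa in zip(P_B, P_A_given_B)]
print(posteriors)  # ≈ [0.161, 0.581, 0.258]; the posteriors sum to 1
```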
9. BINOMIAL DISTRIBUTION FOR REPEATED TRIALS
Binomial Experiment : Any experiment which has only two outcomes is known as a binomial experiment.
Outcomes of such an experiment are known as success and failure.
Probability of success is denoted by \(p\) and probability of failure by \(q\)
\(\therefore p + q =1\)
If a binomial experiment is repeated \(n\) times, then
(a) Probability of exactly \(r\) successes in \(n\) trials \(={ }^{ n } C _{ r } p ^{ r } q ^{ n - r }\)
(b) Probability of at most \(r\) successes in \(n\) trials \(=\displaystyle \sum_{\lambda=0}^{r}{ }^{n} C_{\lambda} p^{\lambda} q^{n-\lambda}\)
(c) Probability of at least \(r\) successes in \(n\) trials \(=\displaystyle \sum_{\lambda=r}^{n}{ }^{n} C_{\lambda} p^{\lambda} q^{n-\lambda}\)
(d) Probability of the first success occurring at the \(r^{th}\) trial \(=p q^{r-1}\). The mean, the variance and the standard deviation of the binomial distribution are \(np\), \(npq\) and \(\sqrt{npq}\) respectively.
Note : \((p+q)^{n}={ }^{n} C_{0} q^{n}+{ }^{n} C_{1} p q^{n-1}+{ }^{n} C_{2} p^{2} q^{n-2}+\ldots+{ }^{n} C_{r} p^{r} q^{n-r}+\ldots+{ }^{n} C_{n} p^{n}=1\)
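A sketch of formulas (a)-(d); \(n = 10\) and \(p = 0.3\) are arbitrary illustrative parameters.

```python
from math import comb

n, p = 10, 0.3   # illustrative: 10 trials, success probability 0.3
q = 1 - p

def P_exactly(r):
    # (a) P(exactly r successes) = nCr * p^r * q^(n-r)
    return comb(n, r) * p**r * q**(n - r)

P_at_most_3 = sum(P_exactly(r) for r in range(0, 4))       # (b)
P_at_least_3 = sum(P_exactly(r) for r in range(3, n + 1))  # (c)
P_first_success_at_4 = p * q**3                            # (d), r = 4

print(P_exactly(3))   # ≈ 0.267
print(P_at_most_3, P_at_least_3, P_first_success_at_4)
# Mean = np = 3.0, variance = npq = 2.1, SD = sqrt(2.1)
```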
10. SOME IMPORTANT RESULTS
(a) Let \(A\) and \(B\) be two events, then
(i) \( P(A)+P(\bar{A})=1\)
(ii) \(P ( A + B )=1- P (\bar{ A } \bar{ B })\)
(iii) \(P(A / B)=\dfrac{P(A B)}{P(B)}\)
(iv) \(P ( A + B )= P ( AB )+ P (\overline{ A } B )+ P ( A \overline{ B })\)
(v) \(A \subset B \Rightarrow P(A) \leq P(B)\)
(vi) \(P(\bar{A} B)=P(B)-P(A B)\)
(vii) \(P(A B) \leq \min \{P(A), P(B)\} \leq \max \{P(A), P(B)\} \leq P(A+B) \leq P(A)+P(B)\)
(viii) \(P ( AB )= P ( A )+ P ( B )- P ( A + B )\)
(ix) \(P (\) Exactly one event \()= P ( A \overline{ B })+ P (\overline{ A } B )\)
\(= P ( A )+ P ( B )-2 P ( AB )= P ( A + B )- P ( AB )\)
(x) \(P\) (neither A nor \(B)=P(\bar{A} \bar{B})=1-P(A+B)\)
(xi) \(P(\bar{A}+\bar{B})=1-P(A B)\)
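These identities can be spot-checked mechanically; here is a sketch with two arbitrarily chosen events (in Python sets, \(\cup\) is `|`, \(\cap\) is `&`, and the complement is set difference from \(S\)).

```python
from fractions import Fraction

S = set(range(1, 7))              # illustrative: one roll of a die
A, B = {1, 2, 3}, {3, 4}          # two arbitrary events
P = lambda E: Fraction(len(E), len(S))

assert P(A) + P(S - A) == 1                                # (i)
assert P(A | B) == 1 - P((S - A) & (S - B))                # (ii), (x)
assert P((S - A) & B) == P(B) - P(A & B)                   # (vi)
assert P(A & B) == P(A) + P(B) - P(A | B)                  # (viii)
assert P(A) + P(B) - 2 * P(A & B) == P(A | B) - P(A & B)   # (ix)
print("all checked identities hold for this example")
```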
(b) Number of exhaustive cases of tossing \(n\) coins simultaneously (or of tossing a coin \(n\) times) \(=2^{n}\)
(c) Number of exhaustive cases of throwing \(n\) dice simultaneously (or throwing one die \(n\) times) \(=6^{n}\)
(d) Playing Cards :
(i) Total Cards : 52 (26 red, 26 black)
(ii) Four suits : Heart, Diamond, Spade, Club - 13 cards each
(iii) Court Cards : 12 (4 kings, 4 queens, 4 jacks)
(iv) Honour Cards: 16 (4 aces, 4 kings, 4 queens, 4 jacks)
(e) Probability regarding n letters and their envelopes:
If \(n\) letters are placed into \(n\) addressed envelopes at random, then
(i) Probability that all letters are in the right envelopes \(=\dfrac{1}{n !}\).
(ii) Probability that not all letters are in the right envelopes \(=1-\dfrac{1}{n !}\)
(iii) Probability that no letter is in the right envelope
\(=\dfrac{1}{2 !}-\dfrac{1}{3 !}+\dfrac{1}{4 !}-\ldots+(-1)^{n} \dfrac{1}{n !}\)
(iv) Probability that exactly \(r\) letters are in the right envelopes \(=\dfrac{1}{r !}\left[\dfrac{1}{2 !}-\dfrac{1}{3 !}+\dfrac{1}{4 !}-\ldots+(-1)^{n-r} \dfrac{1}{(n-r) !}\right]\)
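A sketch of (iii) and (iv) via the derangement series, using exact fractions; \(n = 4\) is an arbitrary illustrative size.

```python
from fractions import Fraction
from math import factorial

def P_no_letter_right(n):
    # D_n / n! = 1 - 1/1! + 1/2! - ... + (-1)^n / n!
    # (the first two terms cancel, giving the series starting at 1/2!)
    return sum(Fraction((-1) ** k, factorial(k)) for k in range(n + 1))

def P_exactly_r_right(n, r):
    # (1/r!) * [1/2! - 1/3! + ... + (-1)^(n-r) / (n-r)!]
    return Fraction(1, factorial(r)) * P_no_letter_right(n - r)

n = 4
print(P_no_letter_right(n))   # 3/8
# Sanity check: the probabilities over r = 0..n sum to 1
assert sum(P_exactly_r_right(n, r) for r in range(n + 1)) == 1
```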
11. PROBABILITY DISTRIBUTION :
(a) A Probability Distribution spells out how a total probability of 1 is distributed over several values of a random variable.
(b) Mean of any probability distribution of a random variable is given by:
\(\mu=\dfrac{\sum p_{i} x_{i}}{\sum p_{i}}=\sum p_{i} x_{i}\) (since \(\sum p_{i}=1\))
(c) Variance of a random variable is given by \(\sigma^{2}=\sum\left(x_{i}-\mu\right)^{2} \cdot p_{i}\), or equivalently
\(\sigma^{2}=\sum p_{i} x_{i}^{2}-\mu^{2}\). (Note that Standard Deviation \((SD)=+\sqrt{\sigma^{2}}\).)
(d) The probability distribution for a binomial variate ' \(X\) ' is given by \( P ( X = r )\)\(={ }^{ n } C _{ r } p ^{ r } q ^{ n - r } \) where: \(p =\) probability of success in a single trial, \(q =\) probability of failure in a single trial and \(p + q =1\).
(e) Mean of Binomial Probability Distribution (BPD) \(= np\); variance of \(BPD = npq\)
(f) If \(p\) represents a person's chance of success in any venture and \(M\) the sum of money which he will receive in case of success, then his expectation or probable value \(=pM\)
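A sketch of the mean and variance formulas on an invented distribution (it happens to be the binomial with \(n=3\), \(p=1/2\), so \(\mu=np\) and \(\sigma^{2}=npq\) can be checked against it).

```python
from fractions import Fraction

# Illustrative distribution: X = number of heads in 3 fair coin tosses
x = [0, 1, 2, 3]
p = [Fraction(1, 8), Fraction(3, 8), Fraction(3, 8), Fraction(1, 8)]
assert sum(p) == 1               # total probability is distributed over x

mu = sum(pi * xi for pi, xi in zip(p, x))               # mean
var = sum(pi * xi**2 for pi, xi in zip(p, x)) - mu**2   # sigma^2
print(mu, var)   # 3/2 3/4, matching np = 3/2 and npq = 3/4
```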