Week 11 - Probability
When we talk about random variables (RVs) in probability, they come in two types: discrete and continuous.
For discrete random variables we use a probability mass function (PMF) written as: \[ f_X(x) = Pr(X = x) \]
This means: "What is the probability that the random variable $X$ is exactly equal to the value $x$?"
Example: Tossing a fair die. 🎲 → $Pr(X=4)=\dfrac{1}{6}.$
To find the probability that $X$ lies in a range (like $2 ≤ X ≤ 4$), we add up the individual probabilities: \[ Pr(2\leq X \leq 4) = f_X(2) + f_X(3) + f_X(4). \]
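The range calculation above can be sketched in a few lines; this is a minimal illustration using exact fractions for the fair-die PMF:

```python
from fractions import Fraction

# PMF of a fair die: f_X(x) = 1/6 for x = 1, ..., 6
f = {x: Fraction(1, 6) for x in range(1, 7)}

# Pr(2 <= X <= 4) = f(2) + f(3) + f(4)
p = f[2] + f[3] + f[4]
print(p)  # 1/2
```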
For continuous random variables, we use a probability density function (PDF), denoted also by $f_X(x).$
Unlike discrete variables, we cannot calculate probabilities at exact points:
For all real numbers $a$: $\;Pr(X = a) = 0.$
Instead, we calculate probabilities over intervals using integration: \[ Pr(a\lt X\lt b) = \int_a^bf_X(x)\,dx \]
This integral represents the area under the curve of the PDF between $a$ and $b,$ which gives the probability that $X$ lies within that range.
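As a sketch of the "area under the curve" idea, the integral can be approximated numerically with a midpoint Riemann sum; here the standard normal density is used as an example PDF (not taken from the notes above):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Density of N(mu, sigma^2)
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def prob_interval(a, b, pdf, n=100_000):
    # Midpoint Riemann sum approximating the integral of the PDF over (a, b)
    h = (b - a) / n
    return sum(pdf(a + (i + 0.5) * h) for i in range(n)) * h

p = prob_interval(-1.0, 1.0, normal_pdf)
print(round(p, 4))  # ~0.6827, the familiar "68%" of mass within one sigma
```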
Random Variable Type | Function Name | Notation | Probability for Range |
---|---|---|---|
Discrete | PMF (mass) | \( f_X(x) \) | \( \ds \Pr(X \in A) = \sum_{x \in A} f_X(x) \) |
Continuous | PDF (density) | \( f_X(x) \) | \( \ds\Pr(a < X < b) = \int_a^b f_X(x)\,dx \) |
The probability that any one of the six sides of a fair die 🎲 lands uppermost when thrown is 1/6. This can be represented mathematically as: \[ P(X = x) = \frac{1}{6}, \hspace{0.3 cm} x = 1, 2, \ldots, 6. \]
The cumulative distribution function (CDF) of a random variable is the probability that the random variable takes a value less than or equal to a specified value, $x$: \[ F(x) = Pr(X \leq x) \]
$X$ | 1 | 2 | 3 | 4 | 5 | 6 |
---|---|---|---|---|---|---|
$f_X(x)$ | $\dfrac{1}{6}$ | $\dfrac{1}{6}$ | $\dfrac{1}{6}$ | $\dfrac{1}{6}$ | $\dfrac{1}{6}$ | $\dfrac{1}{6}$ |
$F(x)$ | $\dfrac{1}{6}$ | $\dfrac{2}{6}$ | $\dfrac{3}{6}$ | $\dfrac{4}{6}$ | $\dfrac{5}{6}$ | $\dfrac{6}{6}$ |
👆 Throwing a die 🎲
$F(x) = Pr(X \leq x)$
👆 Throwing a die 🎲
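The CDF row of the table is just the running sum of the PMF row; a quick sketch of that relationship:

```python
from fractions import Fraction
from itertools import accumulate

# The CDF is the running sum of the PMF: F(x) = sum of f_X(t) over t <= x
pmf = [Fraction(1, 6)] * 6          # f_X(x) for x = 1, ..., 6
cdf = list(accumulate(pmf))         # F(1), F(2), ..., F(6)

print(cdf[3])  # F(4) = 4/6 = 2/3
```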
Let $X$ denote the random variable that measures the sum of the sides that come up when rolling two fair dice 🎲🎲 simultaneously.
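For the two-dice exercise, the PMF of the sum can be tabulated by counting the 36 equally likely outcome pairs; a minimal sketch:

```python
from fractions import Fraction
from collections import Counter

# X = sum of two fair dice: count each sum over the 36 equally likely pairs
counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
pmf = {s: Fraction(c, 36) for s, c in counts.items()}

print(pmf[7])  # 6 of the 36 pairs sum to 7, so Pr(X = 7) = 1/6
```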
The continuous random variable $Y$ is uniformly distributed on the interval $(1,2).$
Suppose we have two events: $A$ and $B.$ The probability of both occurring simultaneously is
1. $P(A \text{ and } B) = P\left(A \,|\, B\right)P(B)$
2. $P(A \text{ and } B) = P\left(B \,|\, A\right)P(A)$
This implies $\,P\left(A \,|\, B\right)P(B) = P\left(B \,|\, A\right)P(A)$
⭐️ $P\left(A \,|\, B\right)= \dfrac{P\left(B \,|\, A\right)P(A)}{P(B)}$ ⭐️
Since $P(B) = P\left(B \,|\, A\right)P(A) + P\left(B \,|\, A'\right)P(A'),$ then
$P\left(A \,|\, B\right)= \dfrac{P\left(B \,|\, A\right)P(A)}{P\left(B \,|\, A\right)P(A) + P\left(B \,|\, A'\right)P(A')}$
Example 👉 Medical Diagnosis
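A typical medical-diagnosis calculation applies the formula above directly. The numbers below (prevalence, sensitivity, false-positive rate) are assumed for illustration only; the notes do not supply them:

```python
# Illustrative Bayes' theorem calculation for a diagnostic test.
# All three probabilities are ASSUMED values for demonstration.
p_disease = 0.01              # P(A): prevalence of the disease
p_pos_given_disease = 0.95    # P(B|A): test sensitivity
p_pos_given_healthy = 0.05    # P(B|A'): false-positive rate

# Total probability: P(B) = P(B|A)P(A) + P(B|A')P(A')
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Bayes' theorem: P(A|B) = P(B|A)P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # ~0.161: a positive test is far from certain
```

Even with a sensitive test, the low prevalence keeps $P(A \,|\, B)$ small, which is the classic point of this example.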
$ \large f_X(x; \mu, \sigma) =\ds \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{1}{2} \left( \frac{x - \mu}{\sigma} \right)^2}, $
$x \in \R,\, \mu \in \R, \sigma > 0,$
$ \large N\left(\mu, \sigma^2\right):\quad f_X(x) =\ds \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{1}{2} \left( \frac{x - \mu}{\sigma} \right)^2}$
We write $X\sim N\left(\mu, \sigma^2\right)$:
$X$ is normally distributed with mean $\mu$ and variance $\sigma^2.$
👈 Standard normal curve, $Z\sim N(0,1),$ with shaded area $P(Z\gt a)$
Example 1: $Z \sim N(0,1).$ Find $P(Z\gt 1.52)$
Example 2: $Z \sim N(0,1).$ Find $P(0 \lt Z \lt 1.52)$
Exercise 1: Find $P( Z \lt -1.93)$
Exercise 2: Find $P( -1.52 \lt Z \lt 1.52)$
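Instead of a $Z$-table, these probabilities can be checked numerically: the standard normal CDF can be written in terms of the error function, $\Phi(z) = \tfrac{1}{2}\left(1 + \operatorname{erf}(z/\sqrt{2})\right)$, which Python's `math` module provides:

```python
import math

def Phi(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(round(1 - Phi(1.52), 4))           # Example 1: P(Z > 1.52)
print(round(Phi(1.52) - Phi(0), 4))      # Example 2: P(0 < Z < 1.52)
print(round(Phi(-1.93), 4))              # Exercise 1: P(Z < -1.93)
print(round(Phi(1.52) - Phi(-1.52), 4))  # Exercise 2: P(-1.52 < Z < 1.52)
```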
Many university students do some part-time work to supplement their allowances. In a study on students' wages earned from part-time work, it was found that their hourly wages are normally distributed with mean $\mu = \$ 16.20$ and standard deviation $\sigma = \$3.$ Find the proportion of students who do part-time work and earn more than \$25.00 per hour.
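The standard approach is to standardise and use the normal CDF: $P(X > 25) = 1 - \Phi\!\left(\frac{25 - 16.20}{3}\right).$ A quick numerical check, using the error-function form of $\Phi$:

```python
import math

# P(X > 25) for X ~ N(mu = 16.20, sigma^2 = 3^2): standardise, then use Phi
mu, sigma = 16.20, 3.0
z = (25.0 - mu) / sigma                         # z = 8.8 / 3 ~ 2.93
p = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))  # P(Z > z)
print(round(p, 4))  # ~0.0017, i.e. roughly 0.17% of students
```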
Binomial distribution: $P(X=k) = \,^nC_k \; p^{\,k} (1-p)^{n-k}, \quad k = 0, 1, \ldots, n.$
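The binomial PMF translates directly into code; `math.comb` gives $^nC_k$. The fair-die scenario below (number of sixes in 3 rolls) is an illustrative choice, not from the notes:

```python
import math
from fractions import Fraction

def binom_pmf(n, k, p):
    # P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Example: probability of exactly one six in n = 3 rolls of a fair die, p = 1/6
print(binom_pmf(3, 1, Fraction(1, 6)))  # 25/72
```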
Normal distribution: \[ N\left(\mu, \sigma^2\right):\; f_X(x) =\ds \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{1}{2} \left( \frac{x - \mu}{\sigma} \right)^2} \]
Poisson distribution: \[ P\left(X=k\right) =\ds \frac{e^{-\lambda }\lambda^k}{k!}, \;k = 0, 1, \ldots, \; \lambda \gt 0 \]
Exponential distribution: \[ f_X(x) =\ds \left\{ \begin{array}{rl} \lambda e^{-\lambda x}, & x\geq 0\\ 0, & \text{otherwise} \end{array} \right. \]
\[ P\left(X=k\right) =\ds \frac{e^{-\lambda }\lambda^k}{k!} \]
If $X_1, \ldots, X_n$ are independent with $X_i \sim \text{Poisson}(\lambda_i),$ then $X =\ds \sum_{i=1}^{n}X_i $ $\ds \sim \text{Poisson}\left(\sum_{i=1}^{n}\lambda_i\right)$
The manager of a bank branch supervises three tellers. The number of customers each teller can serve in a 10-minute period follows a Poisson distribution, but with different rates: Teller A serves customers at a rate of 2 per 10 minutes, Teller B at a rate of 3 per 10 minutes, and Teller C at a rate of 4 per 10 minutes.
Let \( X_A \sim \text{Poisson}(2) ,\) \( X_B \sim \text{Poisson}(3),\) and \( X_C \sim \text{Poisson}(4),\) all independent.
The total number of customers served is:
$ X = X_A + X_B + X_C $ $\ds \sim \text{Poisson}(2 + 3 + 4) = \text{Poisson}(9)$
\( X_A \sim \text{Poisson}(2) ,\)
\( X_B \sim \text{Poisson}(3),\)
\( X_C \sim \text{Poisson}(4) ,\) independent.
The expected value of a Poisson distribution
is its rate parameter \( \lambda \).
Thus
$ E(X) $ $= 2 + 3 + 4 =9$
We want to compute \( P(X > 5) \)
where \( X \sim \text{Poisson}(9) .\)
Use the complement rule:
$ \quad P(X > 5) = 1 - P(X \leq 5) $ $= 1 - \ds \sum_{k=0}^{5}P\left(X=k\right) $
$ \qquad \qquad = 1 - \left[ \ds \frac{e^{-9}9^{0}}{0!} + \frac{e^{-9}9^{1}}{1!} + \frac{e^{-9}9^{2}}{2!} + \frac{e^{-9}9^{3}}{3!} + \frac{e^{-9}9^{4}}{4!} + \frac{e^{-9}9^{5}}{5!}\right] $
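The sum above is easy to evaluate numerically; a minimal sketch of the complement-rule computation:

```python
import math

def poisson_pmf(k, lam):
    # P(X = k) = e^(-lambda) * lambda^k / k!
    return math.exp(-lam) * lam**k / math.factorial(k)

# Complement rule: P(X > 5) = 1 - P(X <= 5) for X ~ Poisson(9)
p = 1 - sum(poisson_pmf(k, 9) for k in range(6))
print(round(p, 4))  # ~0.8843
```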
\[ f_{X}(x)=\ds \left\{ \begin{array}{rl} \lambda e^{-\lambda x}, & x\geq 0\\ 0, & \text{otherwise} \end{array} \right. \]
$\quad P(X\leq x)=$ $\ds F_X(x) =\ds \left\{ \begin{array}{rl} 1-e^{-\lambda x}, & x\geq 0\\ 0, & \text{otherwise} \end{array} \right.$
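The closed-form CDF can be checked against a direct numerical integration of the PDF; the rate $\lambda = 2$ below is an assumed value for illustration:

```python
import math

lam = 2.0  # assumed rate parameter, for illustration only

def exp_pdf(x):
    # Exponential PDF: lambda * e^(-lambda x) for x >= 0, else 0
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def exp_cdf(x):
    # Closed form: F_X(x) = 1 - e^(-lambda x) for x >= 0, else 0
    return 1 - math.exp(-lam * x) if x >= 0 else 0.0

# Compare the closed form with a midpoint Riemann sum of the PDF on [0, x]
x, n = 1.5, 100_000
h = x / n
numeric = sum(exp_pdf((i + 0.5) * h) for i in range(n)) * h
print(round(exp_cdf(x), 6), round(numeric, 6))  # the two values agree closely
```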