# Stat 202: Lecture 11 (covers pp. 138-155)

Nathan VanHoudnos
10/17/2014

### Agenda

2. Checkpoint #12 results
3. Lecture 11 (covers pp. 138-155)


### Question 2 Checkpoint #12

The random variable X, representing the number of accidents in a certain intersection in a week, has the following probability distribution:

x      |    0|    1|    2|    3|    4|    5
-------|-----|-----|-----|-----|-----|-----
P(X=x) | 0.20| 0.30| 0.20| 0.15| 0.10| 0.05


By the third day of a particular week, 2 accidents have already occurred in the intersection. What is the probability that there will be less than a total of 4 accidents during that week?


“Two accidents have already occurred” implies that at least two accidents will occur this week. We condition on $$X \ge 2$$.

The phrase “less than a total of 4” implies $$X < 4$$.

Therefore we want $$P(X < 4|X \ge 2)$$.

By definition $$P(X < 4|X \ge 2)$$ is

$P(X < 4|X \ge 2) = \frac{P( X < 4 \text{ and } X \ge 2 )}{P(X \ge 2)}$

The only counts less than 4 and greater than or equal to 2 are 2 and 3:

$P( X < 4 \text{ and } X \ge 2 ) = P(X =2 ) + P(X = 3)$


\begin{aligned} P(X < 4|X \ge 2) & = \frac{P(X =2 ) + P(X = 3)}{P(X \ge 2)} \\ & = \frac{.2 + .15}{.20 + .15 +.10 +.05} \\ & = \frac{.35}{.5} = .70 \end{aligned}
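As a quick check, the conditional probability can be computed directly from the pmf in the table (a minimal Python sketch):

```python
# pmf of X = number of accidents in a week, from the table above
pmf = {0: 0.20, 1: 0.30, 2: 0.20, 3: 0.15, 4: 0.10, 5: 0.05}

# P(X < 4 and X >= 2): only x = 2 and x = 3 satisfy both conditions
numerator = sum(p for x, p in pmf.items() if 2 <= x < 4)

# P(X >= 2)
denominator = sum(p for x, p in pmf.items() if x >= 2)

cond_prob = numerator / denominator
print(round(cond_prob, 2))  # prints 0.7
```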

### Agenda

2. Checkpoint #12 results
3. Lecture 11 (covers pp. 128-155)
• Checkpoint 13: Expectation and variance rules (finish today; slides marked with R are review)
• Checkpoint 15: Normal random variables

### (R) Expected value rules

Let $$Y = X + 6$$ be a new random variable.

Is $$E[Y] = E[X] + 6$$?

### $$\text{(R) Let } Y = X + a \quad E[Y] = ?$$

\begin{aligned} E[Y] & = \sum y \cdot P(Y=y) \\ & = \sum (x + a) \cdot P(X + a = x + a) \\ & = \sum (x + a) \cdot P(X = x) \\ & = \left( \sum x \cdot P(X=x) \right ) + \left( \sum a \cdot P(X=x) \right) \\ & = E[X] + a \cdot \left(\sum P(X=x) \right) \\ & = E[X] + a \end{aligned}

### (R) Expected value rules

Let $$Y = 4X$$ be a new random variable.

Is $$E[Y] = 4 \cdot E[X]$$?

### $$\text{(R) Let } Y = b X \quad E[Y] = ?$$

\begin{aligned} E[Y] & = \sum y \cdot P(Y=y) \\ & = \sum (x b ) \cdot P(X b = x b) \\ & = \sum (x b) \cdot P(X = x) \\ & = b \cdot \left( \sum x \cdot P(X=x) \right ) \\ & = b \cdot E[X] \end{aligned}

### $$\text{(R) Let } Y = a + b X \quad E[Y] = ?$$

We can combine these rules:

\begin{aligned} E[Y] & = E[a + b X] \\ & = a + E[ b X] \\ & = a + b \cdot E[X] \end{aligned}
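The combined rule is easy to verify numerically. Below is a sketch with a hypothetical pmf (uniform on 1 through 5, so $$E[X] = 3$$); the direct computation of $$E[a + bX]$$ matches $$a + b \cdot E[X]$$:

```python
# Hypothetical pmf: X uniform on {1, ..., 5}, so E[X] = 3
pmf = {1: 0.2, 2: 0.2, 3: 0.2, 4: 0.2, 5: 0.2}

def expectation(pmf):
    """E[X] = sum of x * P(X = x) over the support."""
    return sum(x * p for x, p in pmf.items())

a, b = 10, 1.5

# Direct computation: transform each outcome, then take the expectation
lhs = sum((a + b * x) * p for x, p in pmf.items())

# Rule: E[a + bX] = a + b * E[X]
rhs = a + b * expectation(pmf)

print(lhs, rhs)  # the two agree (up to float rounding)
```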

### An example

At StatsPie, the number of toppings on the typical pizza is a random variable X having a mean value of 3. The cost of a pizza is \$10 plus \$1.50 per topping. How much does the average customer spend on a pizza?

\begin{aligned} E[X] & = 3 & Y & = 10 + 1.5 X \end{aligned}

\begin{aligned} E[Y] & = E[10 + 1.5 X] \\ & = 10 + 1.5 \cdot E[X] \\ & = 10 + 1.5 \cdot 3 = 14.50 \end{aligned}

implying the average pizza costs \$14.50.

### Variance

Let the variance of the random variable $$X$$ be defined as

$\text{Var}[X] = E\left[ \left(X - E[X]\right)^2 \right]$

i.e. the expected value of the squared distance from the mean. Using the rules for expected values we derive the following identity:

$\text{Var}[X] = E[X^2] ~ - ~ \left( E[X] \right)^2$

which is easier to calculate by hand.

### Example

Let $$X$$ be the number of times a college student changes majors:

x        |   0|   1|   2|   3|   4|   5
---------|----|----|----|----|----|----
P(X = x) | .28| .37| .23| .09| .02| .01

\begin{aligned} E[X] & = \sum_{x \in S} x \cdot P(X = x) \\ & = ~ 0 \times .28 + 1 \times .37 + 2 \times .23 \\ & ~ + \! 3 \times .09 + 4 \times .02 + 5 \times .01 \\ & = 1.23 \end{aligned}

\begin{aligned} E[X^2] & = \sum_{x \in S} x^2 \cdot P(X = x) \\ & = ~ 0^2 \times .28 + 1^2 \times .37 + 2^2 \times .23 \\ & ~ + \! 3^2 \times .09 + 4^2 \times .02 + 5^2 \times .01 \\ & = 2.67 \end{aligned}

\begin{aligned} \text{Var}[X] & = E[X^2] - \left( E[X] \right)^2 \\ & = 2.67 - (1.23)^2 \\ & = 1.16 \end{aligned}

implying that college students change their majors on average 1.23 times with a variance of 1.16 “times-squared”.

### "Times-squared" is a weird unit

Let the standard deviation be the square root of the variance:

$\text{sd}[X] = \sqrt{\text{Var}[X]}$

Example:

\begin{aligned} \text{sd}[X] & = \sqrt{\text{Var}[X]} \\ & = \sqrt{1.16} \\ & = 1.077 \end{aligned}

implying that college students change their majors on average 1.23 times with a standard deviation of 1.08 times.

### Rules for standard deviation

Using the rules for expectations, we can show that:

$\text{Var}[a + bX] = b^2 \cdot \text{Var}[X]$

Therefore

$\text{sd}[a + bX] = |b| \cdot \text{sd}[X]$

### Example

At StatsPie, the number of toppings on the typical pizza has a mean value of 3 with a standard deviation of ½. The cost of a pizza is \$10 plus \$1.50 per topping. What are the mean and standard deviation of the distribution of pizza cost?

\begin{aligned} E[X] & = 3 & \text{sd}[X] & = 0.5 \\ Y & = 10 + 1.5 X \\ E[Y] & = ? & \text{sd}[Y] & = ? \end{aligned}

Recall that $$E[Y] = 14.50$$. Find the standard deviation:

\begin{aligned} \text{sd}[Y] & = \text{sd}[10 + 1.5 X] \\ & = 1.5 \cdot \text{sd}[X] \\ & = 1.5 \cdot 0.5 = .75 \end{aligned}

implying that the pizza typically costs an average of \$14.50 with a standard deviation of \$0.75.
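The mean, variance, and standard deviation for the change-of-majors pmf can be reproduced in a few lines (a Python sketch, using the table values):

```python
# pmf of X = number of times a student changes majors, from the table above
pmf = {0: 0.28, 1: 0.37, 2: 0.23, 3: 0.09, 4: 0.02, 5: 0.01}

mean = sum(x * p for x, p in pmf.items())              # E[X]
second_moment = sum(x**2 * p for x, p in pmf.items())  # E[X^2]

variance = second_moment - mean**2  # Var[X] = E[X^2] - (E[X])^2
sd = variance ** 0.5                # sd[X] = sqrt(Var[X])

print(round(mean, 2), round(variance, 2), round(sd, 2))  # 1.23 1.16 1.08
```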

Let $$X$$ and $$Y$$ be random variables and define $$Z = X + Y$$ and $$Q = X - Y$$.

It can be shown that, in all cases:

\begin{aligned} E[Z] & = E[X] + E[Y] \\ E[Q] & = E[X] - E[Y] \end{aligned}

And, if X and Y are independent, that

\begin{aligned} \text{Var}[Z] & = \text{Var}[X] + \text{Var}[Y]\\ \text{Var}[Q] & = \text{Var}[X] + \text{Var}[Y] \end{aligned}

### Standard deviations do not add

If X and Y are independent, then

\begin{aligned} \text{Var}[Z] & = \text{Var}[X] + \text{Var}[Y]\\ \text{Var}[Q] & = \text{Var}[X] + \text{Var}[Y] \end{aligned}

Further note that, if X and Y are independent,

\begin{aligned} \text{sd}[Z] & = \sqrt{\text{Var}[X] + \text{Var}[Y]}\\ \text{sd}[Q] & = \sqrt{\text{Var}[X] + \text{Var}[Y]} \end{aligned}
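A small numeric sketch of why standard deviations do not add (the variances below are hypothetical, chosen only for illustration):

```python
# Hypothetical variances of two independent random variables X and Y
var_x, var_y = 1.16, 0.25

# Variances add for independent X and Y ...
var_z = var_x + var_y

# ... but standard deviations do not: sd[Z] != sd[X] + sd[Y]
sd_z = var_z ** 0.5
sd_x, sd_y = var_x ** 0.5, var_y ** 0.5

print(round(sd_z, 3))         # sqrt(1.41), about 1.187
print(round(sd_x + sd_y, 3))  # strictly larger than sd_z
```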

### What if X and Y are dependent?

Not covered in Stat 202.

Let $$X$$ and $$Y$$ be random variables and define $$Z = aX + bY + c$$.

$\text{Var}[Z] = a^2 \cdot \text{Var}[X] + b^2 \cdot \text{Var}[Y] + 2ab \cdot \text{Cov}[X,Y]$

where

$\text{Cov}[X,Y] = E\big[ \left( X - E[X] \right) \cdot \left( Y - E[Y] \right) \big]$
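Although not covered in Stat 202, the general formula is easy to check numerically. The sketch below uses a hypothetical joint pmf for two dependent 0/1 random variables and verifies that $$\text{Var}[aX + bY + c]$$ matches $$a^2\text{Var}[X] + b^2\text{Var}[Y] + 2ab\,\text{Cov}[X,Y]$$:

```python
# Hypothetical joint pmf P(X = x, Y = y) for dependent X and Y
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def E(f):
    """Expectation of f(x, y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - mx) ** 2)
var_y = E(lambda x, y: (y - my) ** 2)
cov = E(lambda x, y: (x - mx) * (y - my))

# Var[aX + bY + c] = a^2 Var[X] + b^2 Var[Y] + 2ab Cov[X, Y]
a, b, c = 2, -1, 5
lhs = E(lambda x, y: (a * x + b * y + c - (a * mx + b * my + c)) ** 2)
rhs = a**2 * var_x + b**2 * var_y + 2 * a * b * cov

print(round(cov, 2))  # 0.15: X and Y are positively associated
```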

### Agenda

2. Checkpoint #12 results
3. Lecture 11 (covers pp. 128-155)
• Checkpoint 13: Expectation and variance rules
• Checkpoint 15: Normal random variables

(Figure: IQ binned into 2 categories)

(Figure: IQ binned into 3 categories)

### Transition to continuous RVs

(Figures: IQ binned into more and more categories, "and lots more… and infinitely more…", motivating continuous random variables)