A researcher measures the lengths of movies and finds that the mean length of a film is 115 minutes with a standard deviation of 11 minutes. What is the probability that a randomly chosen movie is longer than 120 minutes?
Solution: Given that mean = 115 and sd = 11,
P(X > 120) = P((X - mean)/sd > (120 - 115)/11)
= P(Z > 0.4545)
≈ P(Z > 0.45)
= 1 - P(Z < 0.45)
= 1 - 0.6736
= 0.3264
Z   | 0.00   | 0.01   | 0.02   | 0.03   | 0.04   | 0.05   | 0.06   | 0.07   | 0.08   | 0.09
0.0 | 0.5000 | 0.5040 | 0.5080 | 0.5120 | 0.5160 | 0.5199 | 0.5239 | 0.5279 | 0.5319 | 0.5359
0.1 | 0.5398 | 0.5438 | 0.5478 | 0.5517 | 0.5557 | 0.5596 | 0.5636 | 0.5675 | 0.5714 | 0.5753
0.2 | 0.5793 | 0.5832 | 0.5871 | 0.5910 | 0.5948 | 0.5987 | 0.6026 | 0.6064 | 0.6103 | 0.6141
0.3 | 0.6179 | 0.6217 | 0.6255 | 0.6293 | 0.6331 | 0.6368 | 0.6406 | 0.6443 | 0.6480 | 0.6517
0.4 | 0.6554 | 0.6591 | 0.6628 | 0.6664 | 0.6700 | 0.6736 | 0.6772 | 0.6808 | 0.6844 | 0.6879
0.5 | 0.6915 | 0.6950 | 0.6985 | 0.7019 | 0.7054 | 0.7088 | 0.7123 | 0.7157 | 0.7190 | 0.7224
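The table lookup above rounds Z to two decimal places. As a sketch, the same probability can be computed without a table using Python's standard library (the `normal_cdf` helper below is an illustration, not a standard function); the exact Z of 5/11 ≈ 0.4545 gives an answer slightly different from the table-rounded one.

```python
from math import erf, sqrt

def normal_cdf(x, mean=0.0, sd=1.0):
    # Normal CDF via the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2
    z = (x - mean) / sd
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# P(X > 120) for X ~ N(mean=115, sd=11), using the exact z = 5/11
p = 1.0 - normal_cdf(120, mean=115, sd=11)
print(round(p, 4))  # ≈ 0.3247, close to the table answer of 0.3264
```

The small gap between 0.3247 and 0.3264 comes entirely from rounding Z = 0.4545 down to 0.45 before the table lookup.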