A weather forecaster predicts that the May rainfall in a local area will be between 3 and 4 inches but has no idea where within the interval the amount will be. Let x be the amount of May rainfall in the local area, and assume that x is uniformly distributed over the interval 3 to 4 inches.
What is the probability that the observed May rainfall will fall within two standard deviations of the mean? Within one standard deviation of the mean? (Round all intermediate and final answers to 4 decimal places.)
Here a = 3 and b = 4.
µ = mean = (a + b)/2 = (3 + 4)/2 = 3.5
variance = (b − a)²/12 = (4 − 3)²/12 = 0.0833
σ = std dev = √variance = √(1/12) = 0.2887
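As a quick numerical check, here is a minimal Python sketch (standard library only; the variable names are my own) that evaluates these same formulas:

    import math

    # Uniform(a, b) parameters from the problem statement
    a, b = 3.0, 4.0

    mu = (a + b) / 2           # mean of a uniform distribution
    var = (b - a) ** 2 / 12    # variance of a uniform distribution
    sigma = math.sqrt(var)     # standard deviation

    print(round(mu, 4), round(var, 4), round(sigma, 4))   # 3.5 0.0833 0.2887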
a) µ ± 2σ = 3.5 ± 2(0.2887) gives the interval (2.9226, 4.0774). Since X can only take values between a = 3 and b = 4, the interval must be clipped to [3, 4] before applying the uniform probability formula:
P(µ − 2σ ≤ X ≤ µ + 2σ) = P(3 ≤ X ≤ 4) = (4 − 3)/(4 − 3) = 1.0000
b) µ ± 1σ = 3.5 ± 0.2887 gives the interval (3.2113, 3.7887), which lies entirely inside [3, 4], so no clipping is needed:
P(µ − σ ≤ X ≤ µ + σ) = (x2 − x1)/(b − a) = (3.7887 − 3.2113)/(4 − 3) = 0.5774
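The same interval-clipping logic can be verified with a short Python helper (uniform_prob is a hypothetical name of my own, not a library function):

    def uniform_prob(x1, x2, a=3.0, b=4.0):
        # P(x1 <= X <= x2) for X ~ Uniform(a, b); clip the interval to [a, b]
        lo, hi = max(x1, a), min(x2, b)
        return max(hi - lo, 0.0) / (b - a)

    mu, sigma = 3.5, 0.2887
    print(round(uniform_prob(mu - 2 * sigma, mu + 2 * sigma), 4))   # a) 1.0
    print(round(uniform_prob(mu - sigma, mu + sigma), 4))           # b) 0.5774

Clipping is what keeps the answer to part a) at 1.0000: without it, the raw formula (x2 − x1)/(b − a) would return a value greater than 1, which is impossible for a probability.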