Suppose that a time study analyst takes a sample of size 60 observations. The process mean is 100 seconds, and the standard deviation is 3 seconds. If the max error can be no more than ± 1 second, how confident can the analyst be in the results?
Given:
Mean = 100 seconds
Std Dev = 3 seconds
Sample Size = 60
Maximum allowable error = ±1 second

Lower point = 100 - 1 = 99
Upper point = 100 + 1 = 101

z at Lower Point = (Lower Point - Mean) / (Std Dev / sqrt(Sample Size)) = (99 - 100) / (3 / sqrt(60)) = -2.5820
z at Upper Point = (Upper Point - Mean) / (Std Dev / sqrt(Sample Size)) = (101 - 100) / (3 / sqrt(60)) = 2.5820

From the standard normal table:
Probability at z = -2.5820 is 0.0049
Probability at z = 2.5820 is 0.9951

Confidence level = P(-2.5820 < Z < 2.5820) = 0.9951 - 0.0049 = 0.9902 = 99.02%

So the analyst can be about 99% confident that the sample mean is within ±1 second of the true process mean.
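The steps above can be checked with a short Python sketch. It computes the standard error, the z-score for the ±1 second allowable error, and the two-sided confidence level via the error function (since P(-z < Z < z) = erf(z / sqrt(2))); the variable names are just illustrative.

```python
import math

mean = 100.0      # process mean (seconds)
std_dev = 3.0     # process standard deviation (seconds)
n = 60            # sample size
max_error = 1.0   # maximum allowable error (seconds)

# Standard error of the sample mean
se = std_dev / math.sqrt(n)

# z-score corresponding to the allowable error
z = max_error / se            # about 2.5820

# Two-sided confidence level: P(-z < Z < z) = erf(z / sqrt(2))
confidence = math.erf(z / math.sqrt(2))

print(f"z = {z:.4f}")
print(f"confidence = {confidence:.4%}")
```

Using erf avoids rounding the table probabilities to four decimals; the result agrees with the hand calculation of about 99.02%.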