I answered this question, apparently incorrectly. My professor responded with: "There is no SPSS output here; be sure to use the one-sample t-test walkthrough and to state what your conclusion is from the output. The formula for minimum sample size from the week 7 notes is n >= (t* s / M) ^ 2, so try to use that to find the number of months needed."
Can someone please answer this with the above formula so I can get an idea of what he is looking for in the future???
Consider the data below of inches of rainfall per month for a region in the Northwestern United States:
Plains
April 25.3
May 17.1
June 18.9
July 17.3
August 16.8
Using SPSS, find a 90% confidence interval for the average rainfall per month in this region. (If you have no specific test value in mind, set the test value to 0; SPSS will then report the confidence interval for how far your mean is from 0.) Using the formula from the Week 7 notes and hand calculations, how many months' worth of data would we need to look at to be sure that our margin of error is less than 3 inches?
Please attach your Word file and, in a written analysis, explain what your conclusion is and why.
The 90% CI for the average rainfall per month in this region is (15.6754, 22.4846).
That is, we are 90% confident that the true mean monthly rainfall lies between 15.6754 inches and 22.4846 inches.
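If you want to double-check the SPSS output, the same interval comes straight out of Analyze > Compare Means > One-Sample T Test with the test value set to 0, as the question describes. Here is a small sketch that reproduces the numbers by hand (Python with scipy is my own choice here, not part of the assignment):

```python
# Sketch: reproduce the 90% CI by hand (assumes Python with scipy installed).
from math import sqrt
from scipy import stats

rainfall = [25.3, 17.1, 18.9, 17.3, 16.8]   # inches per month, April through August

n = len(rainfall)
mean = sum(rainfall) / n
s = sqrt(sum((x - mean) ** 2 for x in rainfall) / (n - 1))   # sample standard deviation

t_crit = stats.t.ppf(0.95, df=n - 1)        # two-sided 90% CI -> 0.95 quantile, 4 df
margin = t_crit * s / sqrt(n)               # margin of error M = t* s / sqrt(n)

print(f"mean = {mean:.4f}, s = {s:.5f}, t* = {t_crit:.6f}")
print(f"90% CI: ({mean - margin:.4f}, {mean + margin:.4f})")   # about (15.6754, 22.4846)
```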
The margin of error M is defined as M = t* s / sqrt(n), where t* is the critical t value and s is the sample standard deviation.
Here s = 3.57099 and t* = t(4, 0.05) = 2.131847 (4 degrees of freedom, 90% confidence). We want to find n such that M <= 3.
Now M <= 3 implies n >= (t* s / M) ^ 2, so n >= (2.131847 * 3.57099 / 3) ^ 2 = 6.439421. Rounding up to the next whole month gives n = 7.
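The sample-size step in the same sketch is just the week 7 formula with the result rounded up:

```python
# Sketch: minimum number of months so the margin of error M is at most 3 inches.
# Uses t* and s from the pilot sample above, per the week 7 formula n >= (t* s / M)^2.
from math import ceil

t_crit = 2.131847   # t critical value, 4 df, 90% confidence (from the pilot sample)
s = 3.57099         # sample standard deviation of the pilot data
M = 3.0             # desired margin of error, in inches

n_required = (t_crit * s / M) ** 2
print(n_required)         # about 6.44
print(ceil(n_required))   # round UP -> 7 months
```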