A research council wants to estimate the mean length of time (in minutes) that the average U.S. adult spends watching television using digital video recorders (DVRs) each day. To determine the estimate, the research council takes a random sample of 35 U.S. adults and obtains the following times, in minutes.
24  27  26  29  33  21  18
24  23  34  17  15  19  23
25  29  36  19  18  22  16
45  32  12  24  35  14  40
30  19  14  28  32  15  39
From past studies, the research council has found that the population standard deviation is 4.3 minutes and that the population of times is normally distributed.
Construct a 90% confidence interval for the population mean.
Construct a 99% confidence interval for the population mean.
Interpret the results and compare the widths of the confidence intervals.
Test the claim that the mean time spent watching DVRs is 20 minutes each day at the 0.05 significance level.
You may use a TI-84 calculator or any software you prefer to find the confidence intervals and to carry out the hypothesis test.
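For reference, here is a minimal Python sketch (standard library only, not the solver's original software) that computes both z-intervals from the data above, assuming the known population standard deviation of 4.3 minutes is used:

    # z-intervals for the mean with sigma known, using only the standard library
    from statistics import NormalDist, mean

    times = [24, 27, 26, 29, 33, 21, 18, 24, 23, 34, 17, 15, 19, 23, 25,
             29, 36, 19, 18, 22, 16, 45, 32, 12, 24, 35, 14, 40, 30, 19,
             14, 28, 32, 15, 39]

    sigma = 4.3                    # known population standard deviation
    n = len(times)                 # 35
    xbar = mean(times)             # sample mean
    se = sigma / n ** 0.5          # standard error of the mean

    for conf in (0.90, 0.99):
        z = NormalDist().inv_cdf(0.5 + conf / 2)   # critical value z_(alpha/2)
        moe = z * se                               # margin of error
        print(f"{conf:.0%} CI: ({xbar - moe:.2f}, {xbar + moe:.2f})")

Each interval is the sample mean plus or minus the margin of error z_(alpha/2) * sigma / sqrt(n).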
Using the z-interval and z-test (population standard deviation known, sigma = 4.3 minutes), the sample mean is 877/35 = 25.06 minutes and the standard error is 4.3/sqrt(35) = 0.727 minutes.
The 90% confidence interval is (23.86, 26.25).
The 99% confidence interval is (23.18, 26.93).
With 90% (respectively 99%) confidence, the interval contains the population mean daily DVR viewing time; the 99% interval is wider than the 90% interval because greater confidence requires a larger margin of error.
For the test of H0: mu = 20 against Ha: mu is not 20, the test statistic is z = (25.06 - 20)/0.727 = 6.96 and the p-value is far below 0.05, so reject the null hypothesis: the claim that the mean time is 20 minutes each day is not supported.
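A similar sketch, again only an assumed approach using the Python standard library rather than the solver's original software, carries out the two-sided z-test of H0: mu = 20 at alpha = 0.05:

    # two-sided z-test of H0: mu = 20 with sigma known, standard library only
    from statistics import NormalDist, mean

    times = [24, 27, 26, 29, 33, 21, 18, 24, 23, 34, 17, 15, 19, 23, 25,
             29, 36, 19, 18, 22, 16, 45, 32, 12, 24, 35, 14, 40, 30, 19,
             14, 28, 32, 15, 39]

    mu0, sigma, alpha = 20, 4.3, 0.05
    n = len(times)
    se = sigma / n ** 0.5
    z = (mean(times) - mu0) / se                 # test statistic
    p = 2 * (1 - NormalDist().cdf(abs(z)))       # two-sided p-value
    print(f"z = {z:.2f}, p-value = {p:.4g}")
    print("Reject H0" if p < alpha else "Fail to reject H0")

With this data the sketch reports z of about 6.96 and a p-value far below 0.05, consistent with rejecting the null hypothesis.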