A research council wants to estimate the mean amount of time (in minutes) that U.S. adults spend watching television using digital video recorders (DVRs) each day. To obtain the estimate, the research council takes a random sample of 35 U.S. adults and records the following times in minutes.
24  27  26  29  33  21  18
24  23  34  17  15  19  23
25  29  36  19  18  22  16
45  32  12  24  35  14  40
30  19  14  28  32  15  39
From past studies, the research council has found that the population standard deviation is 4.3 minutes and that the population of times is normally distributed.
Construct a 90% confidence interval for the population mean.
Construct a 99% confidence interval for the population mean.
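Because the population standard deviation is known and the population is normal, both intervals take the z-interval form x̄ ± z·σ/√n, where z is the critical value for the chosen confidence level. Below is a minimal Python sketch of one way to compute both intervals from the data above (it assumes SciPy is available; the variable names are illustrative, not part of the problem):

```python
import math
from scipy import stats

# Sample times in minutes from the problem (n = 35)
times = [24, 27, 26, 29, 33, 21, 18, 24, 23, 34, 17, 15, 19, 23,
         25, 29, 36, 19, 18, 22, 16, 45, 32, 12, 24, 35, 14, 40,
         30, 19, 14, 28, 32, 15, 39]

sigma = 4.3                        # known population standard deviation
n = len(times)
x_bar = sum(times) / n             # sample mean
se = sigma / math.sqrt(n)          # standard error of the mean

for conf in (0.90, 0.99):
    z = stats.norm.ppf(1 - (1 - conf) / 2)   # critical value z_(alpha/2)
    margin = z * se                           # margin of error
    print(f"{conf:.0%} CI: ({x_bar - margin:.2f}, {x_bar + margin:.2f})")
```

Printing both intervals side by side also makes the width comparison asked for below immediate: the 99% interval uses a larger critical value, so it is wider than the 90% interval.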
Interpret the results and compare the widths of the confidence intervals.
Test the claim that the mean time spent watching DVRs each day is 20 minutes, using a significance level of 0.05.
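Since σ is known, one natural choice here is a two-tailed one-sample z-test of H0: μ = 20 against Ha: μ ≠ 20. A minimal sketch under that assumption (again using SciPy; names are illustrative):

```python
import math
from scipy import stats

# Same sample as above; H0: mu = 20 vs Ha: mu != 20 (two-tailed)
times = [24, 27, 26, 29, 33, 21, 18, 24, 23, 34, 17, 15, 19, 23,
         25, 29, 36, 19, 18, 22, 16, 45, 32, 12, 24, 35, 14, 40,
         30, 19, 14, 28, 32, 15, 39]
sigma, mu_0, alpha = 4.3, 20, 0.05

x_bar = sum(times) / len(times)
se = sigma / math.sqrt(len(times))
z_stat = (x_bar - mu_0) / se                  # standardized test statistic
p_value = 2 * stats.norm.sf(abs(z_stat))      # two-tailed p-value
print(f"z = {z_stat:.2f}, p-value = {p_value:.3g}")
print("Reject H0" if p_value < alpha else "Fail to reject H0")
```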