A company conducts surveys each year about the use of various
media, such as television and video viewing. A representative
sample of adult Americans was surveyed in 2010, and the mean number
of minutes per week spent watching "time-shifted" television
(watching television shows that were recorded and played back at a
later time) for the people in this sample was 572 minutes. An
independently selected representative sample of adults was surveyed
in 2011, and the mean time spent watching time-shifted television
per week for this sample was 643 minutes. Suppose that the sample
size in each year was 1000 and that the sample standard deviations
were 60 minutes for the 2010 sample and 80 minutes for the 2011
sample. Estimate the difference in the mean time spent watching
time-shifted television in 2010 and the mean time spent in 2011
using a 99% confidence interval. (Use μ_2010 − μ_2011. Round your
answers to two decimal places.)
______ to ______ minutes
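As a minimal sketch (not part of the original question), the interval can be computed from the summary statistics with a large-sample two-sample z interval; with 1000 observations per group, the z critical value is essentially the same as a t value with Welch degrees of freedom. The variable names below are illustrative only.

```python
from math import sqrt
from scipy import stats

# Summary statistics given in the problem
n1, xbar1, s1 = 1000, 572.0, 60.0   # 2010 sample
n2, xbar2, s2 = 1000, 643.0, 80.0   # 2011 sample

# Point estimate of mu_2010 - mu_2011 and its standard error
diff = xbar1 - xbar2
se = sqrt(s1**2 / n1 + s2**2 / n2)

# 99% critical value from the standard normal distribution
z_star = stats.norm.ppf(0.995)

lower = diff - z_star * se
upper = diff + z_star * se
print(f"{lower:.2f} to {upper:.2f} minutes")
```

Under these assumptions the standard error is sqrt(60²/1000 + 80²/1000) = sqrt(10) ≈ 3.16, so the interval comes out to roughly −79.15 to −62.85 minutes.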
Interpret the interval in context.
We are 99% confident that the true mean time spent watching television in 2011 is between these two values.
We are 99% confident that the true mean time spent watching television in 2010 is between these two values.
There is a 99% chance that the true mean time spent watching television in 2010 is directly in the middle of these two values.
We are 99% confident that the true difference in mean time spent watching television in 2010 and 2011 is between these two values.
There is a 99% chance that the true difference in mean time spent watching television in 2010 and 2011 is directly in the middle of these two values.