The American Time Use Survey measures the amount of time that people spend on various activities. A 2016 random sample of 2,190 households found that the average person watches 2.73 hours of TV per day (TRUE!!) with a standard deviation of 1.64 hours. Show your work.
a. Build a 90% confidence interval for average daily TV viewing.
b. Briefly explain what this confidence interval means.
sample size = n = 2190
sample mean = x-bar = 2.73 hours of TV per day
sample standard deviation = s = 1.64 hours
Step 1: Margin of error
The z critical value for a 90% confidence interval is 1.645, since P(-1.645 < z < 1.645) = 0.90.
Margin of error = z * s / sqrt(n) = 1.645 * 1.64 / sqrt(2190) = 1.645 * 0.0350 = 0.058 hours (approximately)
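A minimal Python sketch of this step (scipy is assumed here; the original answer works only by hand) that reproduces the critical value and the margin of error:

    from math import sqrt
    from scipy import stats

    n = 2190        # sample size
    xbar = 2.73     # sample mean (hours of TV per day)
    s = 1.64        # sample standard deviation (hours)

    # two-sided 90% confidence: z such that P(-z < Z < z) = 0.90
    z = stats.norm.ppf(0.95)        # about 1.645
    margin = z * s / sqrt(n)        # about 0.058 hours
    print(round(z, 3), round(margin, 3))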
Step 2: Confidence interval = mean +/- margin of error = 2.73 +/- 0.058
The 90% confidence interval for average daily TV viewing is 2.672 to 2.788 hours.
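Continuing the same sketch (using the xbar and margin variables defined above), the endpoints follow directly from mean +/- margin of error:

    lower = xbar - margin           # about 2.672 hours
    upper = xbar + margin           # about 2.788 hours
    print(round(lower, 3), "to", round(upper, 3))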
(b) We are 90% confident that the interval from 2.672 to 2.788 hours contains the true, unknown population mean of daily TV viewing time. In other words, if we repeatedly drew random samples of this size and built an interval from each, about 90% of those intervals would capture the true mean.