Suppose an online television and movie streaming service has determined that viewers who watched the entire first season of one of its original series take an average of 5.7801 days, with a standard deviation of 4.0205 days. One of their analysts wants to determine, at a significance level of 0.05, whether new customers who begin watching the series within a week of the start of their subscription finish watching the season more quickly. She selects a simple random sample of new customers who began watching the first season of the series during the first week of their subscription and calculates summary statistics, where μ is the mean amount of time it takes for new customers to watch the first season of the series.

Test of μ = 5.7801 vs μ < 5.7801
The assumed standard deviation = 4.0205

Sample size (n): 1550
Sample mean (x̄): 5.5797
Standard error (SE): 0.10212

Complete the analysis by calculating the value of the one-sample z-statistic and the P-value, and making the decision. First, calculate z to at least two decimal places. Then use software or a table of standard normal critical values to determine the P-value, precise to at least four decimal places.
The test statistic is

z = (x̄ − μ₀) / (σ/√n)
  = (5.5797 − 5.7801) / (4.0205/√1550)
  = −0.2004 / 0.10212
  ≈ −1.96

P-value = P(Z < −1.96) = 0.0250

Decision: since the P-value (0.0250) is less than the significance level α = 0.05, reject the null hypothesis. There is sufficient evidence that new customers who begin the series within the first week of their subscription finish the season more quickly than 5.7801 days on average.
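As a check, the z-statistic and left-tailed P-value can be computed with a short Python sketch using only the standard library's statistics.NormalDist; the input figures are the ones given in the problem.

```python
from math import sqrt
from statistics import NormalDist

mu0 = 5.7801    # hypothesized population mean (days)
sigma = 4.0205  # assumed population standard deviation
n = 1550        # sample size
xbar = 5.5797   # sample mean

se = sigma / sqrt(n)     # standard error ≈ 0.10212
z = (xbar - mu0) / se    # one-sample z-statistic ≈ −1.96
p = NormalDist().cdf(z)  # left-tailed P-value: P(Z < z)

print(f"SE = {se:.5f}, z = {z:.2f}, P-value = {p:.4f}")
```

Note that the unrounded z (about −1.962) gives a P-value of roughly 0.0249; rounding z to −1.96 first, as the table-based solution does, gives 0.0250.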