A new operating system is installed in every workstation at a
large company. The claim of the operating system manufacturer is
that the time to shut down and turn on the machine will be much
faster. To test this claim, an employee selects 32 machines and measures the combined shut-down and restart time of each machine before and after the new operating system has been installed. The mean of the differences (before − after) is 23.4 seconds with a standard deviation of 30 seconds. Complete parts a through d.
a) What is the standard error of the mean difference?
SE(d̄) = 5.30
(Round to two decimal places as needed.)
b) How many degrees of freedom does the t-statistic have?
df = 31
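As a quick check on parts a and b (not part of the original answer, just a sketch in Python using only the standard library):

import math

n = 32       # machines sampled
s = 30.0     # standard deviation of the (before - after) differences, in seconds

# a) standard error of the mean difference: SE(d-bar) = s / sqrt(n)
se = s / math.sqrt(n)
print(f"SE = {se:.2f}")   # SE = 5.30

# b) degrees of freedom for the paired t-statistic: n - 1
df = n - 1
print(f"df = {df}")       # df = 31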
c) What is the 95% confidence interval for the mean
difference?
d) What do you conclude at α = 0.05 and α = 0.10?
(C) t critical = T.INV.2T(alpha, df)
Setting alpha = 1 − confidence level = 1 − 0.95 = 0.05
and df = 31, we get
t critical = 2.040
Confidence interval = d̄ ± t critical × SE(d̄)
= 23.4 ± 2.040 × 5.303
≈ 23.4 ± 10.816
= (12.584, 34.216)
(Pay attention to decimals while rounding off; I have rounded to 3 decimals.)
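If you'd rather compute this outside Excel, here is a minimal Python sketch (SciPy assumed; stats.t.ppf plays the role of T.INV.2T):

from scipy import stats

n, d_bar, s = 32, 23.4, 30.0
se = s / n ** 0.5                              # 5.3033...

# Two-tailed critical value, equivalent to Excel's T.INV.2T(0.05, 31)
t_crit = stats.t.ppf(1 - 0.05 / 2, df=n - 1)   # 2.0395...

lo, hi = d_bar - t_crit * se, d_bar + t_crit * se
print(f"95% CI: ({lo:.3f}, {hi:.3f})")         # 95% CI: (12.584, 34.216)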
(D) The 95% confidence interval does not include 0, so at α = 0.05 we reject the null hypothesis of no mean difference. The 90% confidence interval is narrower than the 95% interval, so it also excludes 0, and the conclusion is the same at α = 0.10: the new operating system produces a significant change (a reduction) in mean shut-down and restart time.
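Equivalently, you can compute the paired t-statistic and its two-tailed p-value from the summary statistics; a sketch under the same assumptions (SciPy):

from scipy import stats

n, d_bar, s = 32, 23.4, 30.0
t_stat = d_bar / (s / n ** 0.5)                  # about 4.41
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 1)  # two-tailed p-value, well below 0.01

for alpha in (0.05, 0.10):
    print(f"alpha = {alpha}: reject H0? {p_value < alpha}")   # True at both levels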