Observations are made on the time required to check out customers at a supermarket. In a sample of 36, aisle M takes an average of six minutes per customer with a standard deviation of three minutes. In the same sample of 36, aisle J takes an average of eight minutes with a standard deviation of five minutes. Is the difference in average times due solely to chance, using a probability of .05 of picking samples with extreme values?
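
One way to approach this is a two-sample z-test for the difference of means, assuming the two aisles' times are independent samples (the problem does not state the sampling design explicitly, so that independence is an assumption). A minimal sketch in Python:

```python
import math

# Sample statistics from the problem (aisle M and aisle J, n = 36 each)
n = 36
mean_m, sd_m = 6.0, 3.0   # minutes
mean_j, sd_j = 8.0, 5.0   # minutes

# Standard error of the difference of two independent sample means
se = math.sqrt(sd_m**2 / n + sd_j**2 / n)

# Observed z statistic for the difference in average checkout times
z = (mean_j - mean_m) / se

# Two-tailed p-value from the standard normal CDF
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"z = {z:.3f}, p = {p_value:.4f}")
```

Under these assumptions the statistic comes out to z ≈ 2.06, which exceeds the two-tailed critical value of 1.96 at the .05 level (p ≈ 0.04), so the difference would not be attributed to chance alone.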