In an article in the Journal of Management, Joseph Martocchio studied and estimated the costs of employee absences. Based on a sample of 176 blue-collar workers, Martocchio estimated that the mean amount of paid time lost during a three-month period was 1.4 days per employee, with a standard deviation of 1.4 days. He also estimated that the mean amount of unpaid time lost during a three-month period was 1.4 days per employee, with a standard deviation of 1.6 days. Suppose we randomly select a sample of 100 blue-collar workers. Based on Martocchio's estimates:
(a) What is the probability that the average amount of paid time lost during a three-month period for the 100 blue-collar workers will exceed 1.5 days? (Use the rounded mean and standard error to compute the rounded z-score used to find the probability. Round the mean to 1 decimal place, standard deviations to 2 decimal places, the z-value to 2 decimal places, and probabilities to 4 decimal places.) Find μx̄, σx̄, and P(x̄ > 1.5).
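A minimal sketch of the part (a) calculation in Python, assuming scipy is available (scipy.stats.norm gives the standard normal CDF); the variable names are illustrative:

```python
from scipy.stats import norm

# Part (a): paid time lost, using the sampling distribution of the mean
mu = 1.4                            # estimated population mean (days)
sigma = 1.4                         # estimated population standard deviation (days)
n = 100                             # sample size

se = round(sigma / n ** 0.5, 2)     # standard error of x-bar: 1.4 / 10 = 0.14
z = round((1.5 - mu) / se, 2)       # z-score: 0.10 / 0.14 ≈ 0.71
prob = round(1 - norm.cdf(z), 4)    # upper-tail probability P(x-bar > 1.5)

print(se, z, prob)                  # expected: 0.14 0.71 0.2389
```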
(b) What is the probability that the average amount of unpaid time lost during a three-month period for the 100 blue-collar workers will exceed 1.5 days? (Use the rounded mean and standard error to compute the rounded z-score used to find the probability. Round standard deviations to 2 decimal places, the z-value to 2 decimal places, and probabilities to 4 decimal places.) Find μx̄, σx̄, and P(x̄ > 1.5).
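Part (b) follows the same pattern with σ = 1.6, but here the z-score works out to exactly 0.625, so this sketch uses Decimal arithmetic to keep the round-half-up convention that hand calculations usually follow (plain float rounding can behave unexpectedly on such ties):

```python
from decimal import Decimal, ROUND_HALF_UP
from scipy.stats import norm

# Part (b): unpaid time lost; same setup as part (a), different sigma
mu    = Decimal("1.4")              # estimated population mean (days)
sigma = Decimal("1.6")              # estimated population standard deviation (days)
n     = Decimal("100")              # sample size

# Standard error of x-bar: 1.6 / 10 = 0.16
se = (sigma / n.sqrt()).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# z = (1.5 - 1.4) / 0.16 = 0.625, which rounds half-up to 0.63
z = ((Decimal("1.5") - mu) / se).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

prob = round(1 - norm.cdf(float(z)), 4)     # P(x-bar > 1.5) = P(Z > 0.63)

print(se, z, prob)                          # expected: 0.16 0.63 0.2643
```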
(c) Suppose we randomly select a sample of 100 blue-collar workers, and suppose the sample mean amount of unpaid time lost during a three-month period actually exceeds 1.5 days. Would it be reasonable to conclude that the mean amount of unpaid time lost has increased above the previously estimated 1.4 days?
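For part (c), one way to frame the reasoning is to treat the tail probability from part (b) as the chance of observing x̄ > 1.5 if the unpaid mean were still 1.4 days. A hedged sketch of that decision logic follows; the 5% cutoff is just a conventional rule of thumb, not part of the problem statement:

```python
# Part (c): is x-bar > 1.5 strong evidence that the unpaid mean rose above 1.4?
# If the mean were still 1.4, P(x-bar > 1.5) is the probability from part (b).
p_tail = 0.2643          # value computed in the part (b) sketch above

# Common (but arbitrary) rule of thumb: call the result "unusual" only if it
# would occur less than 5% of the time under the previously estimated mean.
if p_tail < 0.05:
    print("x-bar > 1.5 would be rare if the mean were still 1.4 -> evidence of an increase")
else:
    print("x-bar > 1.5 is not rare if the mean were still 1.4 -> weak evidence of an increase")
```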