In baseball, is there a linear correlation between batting average and home run percentage? Let x represent the batting average of a professional baseball player, and let y represent the player's home run percentage (number of home runs per 100 times at bat). A random sample of n = 7 professional baseball players gave the following information.
x | 0.253 | 0.263 | 0.286 | 0.263 | 0.268 | 0.339 | 0.299 |
y | 1.1 | 3.3 | 5.5 | 3.8 | 3.5 | 7.3 | 5.0 |
(a) Make a scatter diagram of the data.
Then visualize the line you think best fits the data.
(b) Use a calculator to verify that Σx = 1.971,
Σx² ≈ 0.560, Σy = 29.5,
Σy² = 147.33, and Σxy ≈ 8.626.
Compute r. (Round to 3 decimal places.)
As x increases, does the value of r imply that
y should tend to increase or decrease? Explain your
answer.
Given our value of r, y should tend to remain constant as x increases.
Given our value of r, y should tend to increase as x increases.
Given our value of r, y should tend to decrease as x increases.
Given our value of r, we cannot draw any conclusions about the behavior of y as x increases.
Solution:
(a) First enter the given data in Excel, then follow these steps:
select the data ----> Insert ----> Charts ----> Scatter Chart
(b) The given values check out: Σx = 1.971, Σx² ≈ 0.560, Σy = 29.5, Σy² = 147.33, Σxy ≈ 8.626, and n = 7.
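These sums can be double-checked with a short Python snippet (variable names here are illustrative) that computes each sum directly from the table:

```python
# Raw data from the table: batting average (x) and home run percentage (y)
x = [0.253, 0.263, 0.286, 0.263, 0.268, 0.339, 0.299]
y = [1.1, 3.3, 5.5, 3.8, 3.5, 7.3, 5.0]

sum_x = sum(x)                              # Σx
sum_x2 = sum(v * v for v in x)              # Σx²
sum_y = sum(y)                              # Σy
sum_y2 = sum(v * v for v in y)              # Σy²
sum_xy = sum(a * b for a, b in zip(x, y))   # Σxy

print(round(sum_x, 3), round(sum_x2, 3), round(sum_y, 1),
      round(sum_y2, 2), round(sum_xy, 3))
```

Rounded to the precision stated in the problem, the printed sums match the given values.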
Formula for the correlation coefficient:

r = [n·Σxy − (Σx)(Σy)] / ( √[n·Σx² − (Σx)²] · √[n·Σy² − (Σy)²] )

Substituting the values above (carrying the unrounded sums Σx² = 0.560289 and Σxy = 8.6263 to avoid rounding error):

r = [7(8.6263) − (1.971)(29.5)] / ( √[7(0.560289) − (1.971)²] · √[7(147.33) − (29.5)²] )

r = 0.915
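The same calculation can be scripted; this sketch applies the formula with Python's math module, working from the raw data so the sums carry full precision:

```python
import math

# Raw data: batting average (x) and home run percentage (y)
x = [0.253, 0.263, 0.286, 0.263, 0.268, 0.339, 0.299]
y = [1.1, 3.3, 5.5, 3.8, 3.5, 7.3, 5.0]
n = len(x)

sum_x, sum_y = sum(x), sum(y)
sum_x2 = sum(v * v for v in x)
sum_y2 = sum(v * v for v in y)
sum_xy = sum(a * b for a, b in zip(x, y))

# Pearson correlation coefficient from the summary sums
numerator = n * sum_xy - sum_x * sum_y
denominator = (math.sqrt(n * sum_x2 - sum_x ** 2)
               * math.sqrt(n * sum_y2 - sum_y ** 2))
r = numerator / denominator
print(round(r, 3))  # 0.915
```

Using the rounded sums (e.g. Σx² = 0.560) in the formula would shift the result noticeably, which is why the script recomputes the sums from the data.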
Interpretation: since r = 0.915 is positive (and close to 1), y should tend to increase as x increases.