In baseball, is there a linear correlation between batting average and home run percentage? Let x represent the batting average of a professional baseball player, and let y represent the player's home run percentage (number of home runs per 100 times at bat). A random sample of n = 7 professional baseball players gave the following information.
x | 0.235 | 0.249 | 0.286 | 0.263 | 0.268 | 0.339 | 0.299
y | 1.3   | 3.2   | 5.5   | 3.8   | 3.5   | 7.3   | 5.0
(a) Make a scatter diagram of the data.
Then visualize the line you think best fits the data.
(b) Use a calculator to verify that Σx = 1.939, Σx² = 0.544, Σy = 29.6, Σy² = 147.16, and Σxy = 8.582.
Compute r. (Round to 3 decimal places.) As x increases, does the value of r imply that y should tend to increase or decrease? Explain your answer.
Given our value of r, we cannot draw any conclusions about the behavior of y as x increases.
Given our value of r, y should tend to remain constant as x increases.
Given our value of r, y should tend to increase as x increases.
Given our value of r, y should tend to decrease as x increases.
a) The scatter diagram shows the seven points climbing from the lower left to the upper right in a roughly straight-line pattern, so a line with positive slope fits the data well.
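If you want to reproduce the scatter diagram, here is a minimal sketch, assuming numpy and matplotlib are available; the dashed least-squares line is drawn only as a visual guide for "the line you think best fits the data":

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.array([0.235, 0.249, 0.286, 0.263, 0.268, 0.339, 0.299])
y = np.array([1.3, 3.2, 5.5, 3.8, 3.5, 7.3, 5.0])

plt.scatter(x, y)

# Least-squares line (degree-1 polynomial fit), for visual reference only
slope, intercept = np.polyfit(x, y, 1)
xs = np.linspace(x.min(), x.max(), 100)
plt.plot(xs, slope * xs + intercept, "--")

plt.xlabel("Batting average (x)")
plt.ylabel("Home run percentage (y)")
plt.title("Batting average vs. home run percentage")
plt.show()
```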
b) Using the computational formula for the sample correlation coefficient,

r = [nΣxy − (Σx)(Σy)] / √([nΣx² − (Σx)²][nΣy² − (Σy)²]),

with n = 7 and the sums carried at full precision (Σx² = 0.544337, Σxy = 8.5824):

r = [7(8.5824) − (1.939)(29.6)] / √([7(0.544337) − (1.939)²][7(147.16) − (29.6)²])
  = 2.6824 / √(0.050638 × 153.96)
  ≈ 2.6824 / 2.7922
  ≈ 0.961

Since r is positive and close to 1, the correct choice is: given our value of r, y should tend to increase as x increases.
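To verify both the sums from part (b) and the value of r without a calculator, here is a minimal pure-Python sketch (standard library only):

```python
# Recompute the sums from part (b) and the correlation coefficient r.
from math import sqrt

x = [0.235, 0.249, 0.286, 0.263, 0.268, 0.339, 0.299]
y = [1.3, 3.2, 5.5, 3.8, 3.5, 7.3, 5.0]
n = len(x)                                    # n = 7

sum_x = sum(x)                                # 1.939
sum_y = sum(y)                                # 29.6
sum_x2 = sum(xi**2 for xi in x)               # 0.544337 (≈ 0.544)
sum_y2 = sum(yi**2 for yi in y)               # 147.16
sum_xy = sum(xi * yi for xi, yi in zip(x, y)) # 8.5824 (≈ 8.582)

# r = [nΣxy − (Σx)(Σy)] / √([nΣx² − (Σx)²][nΣy² − (Σy)²])
num = n * sum_xy - sum_x * sum_y
den = sqrt((n * sum_x2 - sum_x**2) * (n * sum_y2 - sum_y**2))
r = num / den
print(round(r, 3))                            # 0.961
```

Note that keeping full precision in Σx² and Σxy matters here: substituting the rounded values 0.544 and 8.582 into the formula shifts r noticeably, so carry the extra digits until the final rounding step.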