A baseball pitcher throws a ball horizontally at a speed of 30.2 m/s. A catcher is 19.0 m away from the pitcher. Find the magnitude, in meters, of the vertical distance that the ball drops as it moves from the pitcher to the catcher. Ignore air resistance.
To determine the vertical distance, we first need the time the ball is in the air. Use the ball's horizontal velocity and the horizontal distance in the constant-velocity equation.
d = v * t
19 = 30.2 * t
t = 19 ÷ 30.2
This is approximately 0.629 seconds. Use this time in the following kinematic equation to determine the vertical distance that the ball drops.
d = vi * t + 0.5 * a * t²
vi is the ball's initial vertical velocity, which is 0 m/s because the ball is thrown horizontally. a = 9.8 m/s².
d = 0.5 * 9.8 * (19/30.2)²
d = 1.939 meters
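The two steps above (time of flight from the horizontal motion, then the drop from free fall) can be sketched as a quick check in Python, using the values given in the problem and g = 9.8 m/s²:

```python
# Projectile-drop check: horizontal throw at constant speed,
# vertical free fall from rest. Air resistance is ignored.
v_horizontal = 30.2   # pitcher's horizontal throw speed, m/s
distance = 19.0       # horizontal distance to the catcher, m
g = 9.8               # acceleration due to gravity, m/s^2

# Time of flight from d = v * t (horizontal motion, constant velocity)
t = distance / v_horizontal

# Vertical drop from d = vi*t + 0.5*a*t^2 with vi = 0
drop = 0.5 * g * t ** 2

print(round(t, 3))     # time in seconds -> 0.629
print(round(drop, 3))  # vertical drop in meters -> 1.939
```

Running this reproduces the values in the solution: about 0.629 s in the air and a drop of about 1.939 m.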