A projectile is shot from the edge of a cliff 100 m above ground with an initial speed of 50.0 m/s at an angle of 26° below the horizontal. How much time does it take the projectile to hit the ground?
Initial velocity of the projectile, V = 50.0 m/s
Angle of projection, theta = 26 deg, below the horizontal.
So, vertical component of the initial velocity, Vy = V*sin(theta) = 50.0*sin(26 deg) = 21.92 m/s
Height of the cliff, h = 100 m
Suppose the projectile takes time t seconds to hit the ground.
Taking downward as the positive direction (so Vy and g are both positive), the vertical displacement is
h = Vy*t + (1/2)*g*t^2
Putting in the values:
100 = 21.92t + 0.5*9.8*t^2
=> 4.9t^2 + 21.92t - 100 = 0
So,
t = [-21.92 + sqrt(21.92^2 + 4*4.9*100)] / (2*4.9) = [-21.92 + 49.4] / 9.8 = 2.80 s
The other root can be discarded, as it is negative.
Therefore, the projectile takes 2.80 s to hit the ground. (Answer)
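As a quick numerical check of the arithmetic above, here is a minimal Python sketch (it assumes g = 9.8 m/s^2, as used in the working):

import math

# Known quantities (downward taken as positive)
v = 50.0                     # initial speed, m/s
theta = math.radians(26.0)   # launch angle below the horizontal
h = 100.0                    # height of the cliff, m
g = 9.8                      # gravitational acceleration, m/s^2

vy = v * math.sin(theta)     # downward component of the initial velocity

# Solve (1/2)*g*t^2 + vy*t - h = 0 with the quadratic formula,
# keeping only the positive root.
a, b, c = 0.5 * g, vy, -h
t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

print(f"Vy = {vy:.2f} m/s, t = {t:.2f} s")   # Vy ~ 21.92 m/s, t ~ 2.80 s

Running it reproduces Vy = 21.92 m/s and t = 2.80 s, matching the hand calculation.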