A person drove 224 miles from North Carolina to Washington, D.C., in 3.5 hours, for an average speed of 64 mi/hr. Prove that the car's speedometer must have registered exactly 64 mi/hr at some point in time during the trip. (You may use the Intermediate Value Theorem or argue the point logically.)
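One way to sketch the argument, assuming the position function \(s(t)\) is continuous on \([0, 3.5]\) and differentiable on \((0, 3.5)\) (so the Mean Value Theorem applies):

```latex
\text{Let } s(t) \text{ be the car's position in miles at time } t \text{ hours, with } s(0)=0,\ s(3.5)=224.
\text{By the Mean Value Theorem, there exists } c \in (0, 3.5) \text{ such that}
s'(c) \;=\; \frac{s(3.5) - s(0)}{3.5 - 0} \;=\; \frac{224}{3.5} \;=\; 64 \ \text{mi/hr}.
```

Since \(s'(c)\) is the instantaneous speed at time \(c\), the speedometer reads 64 mi/hr at that moment. Equivalently, via the Intermediate Value Theorem: if the (continuous) speed never equaled 64, it would stay strictly above or strictly below 64 for the whole trip, making the total distance strictly greater or strictly less than 224 miles, a contradiction.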