Suppose you implemented a quadratic-time (that is, O(n^2)) algorithm for a problem P. On a test run, your algorithm takes 50 seconds on inputs of size 1000. Your friend found a clever algorithm that solves the same problem in time O(n^(3/2)). However, the faster algorithm takes 150 seconds on inputs of size 1000. How could this be? If you need to solve a problem of size 4000, which algorithm should you use? What about inputs of size 10,000? Explain your answers (assume low-order terms are negligible).
Say the quadratic algorithm takes time c * n^2
and the other one takes b * n^(3/2).
The constants b and c depend on the implementation, so the algorithm with the lower-order growth rate can still take more time on small inputs.
Plugging in the measured times at n = 1000:
b * 1000^(3/2) = 150, which gives b ≈ 0.0047,
and c * 1000^2 = 50, which gives c = 0.00005.
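As a quick check of this arithmetic, here is a minimal Python sketch; the model (c * n^2 versus b * n^(3/2)) comes from the problem, while the variable names are my own:

```python
# Fit the constants from the measured runtimes at n = 1000.
n = 1000
t_quad = 50.0   # measured seconds, quadratic algorithm
t_fast = 150.0  # measured seconds, O(n^(3/2)) algorithm

c = t_quad / n**2    # T_quad(n) = c * n^2
b = t_fast / n**1.5  # T_fast(n) = b * n^(3/2)

print(c)  # 5e-05
print(b)  # 0.004743416490252569, i.e. about 0.0047
```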
Hence for inputs of size 4000:
the quadratic algorithm takes 0.00005 * (4000^2) = 800 sec,
and the other one takes 0.0047 * (4000^(1.5)) ≈ 1189 sec (with the unrounded constant, exactly 150 * 8 = 1200 sec, since the input grows by a factor of 4 and 4^(1.5) = 8).
For inputs of size 4000 the quadratic algorithm is preferred, as it takes less time.
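The same prediction can be made without solving for the constants at all, by scaling the measured n = 1000 times by each algorithm's growth ratio; a small sketch under the same assumptions:

```python
# Predict the n = 4000 runtimes by scaling the n = 1000 measurements.
ratio = 4000 / 1000                # the input is 4x larger

t_quad_4000 = 50.0 * ratio**2      # 50 * 16 = 800 s
t_fast_4000 = 150.0 * ratio**1.5   # 150 * 8 = 1200 s

print(t_quad_4000, t_fast_4000)    # 800.0 1200.0 -> quadratic wins at n = 4000
```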
Now for inputs of size 10,000:
the quadratic algorithm takes 0.00005 * (10000^2) = 5000 sec,
and the other one takes 0.0047 * (10000^(1.5)) = 4700 sec (about 4743 sec with the unrounded constant).
So for inputs of size 10,000 the O(n^(3/2)) algorithm takes less time than the quadratic one, and it is preferred.
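To see exactly where the advantage flips, set c * n^2 = b * n^(3/2) and solve for n; a short sketch, again using constants derived from the problem's measurements:

```python
# Break-even point: c * n^2 = b * n^(3/2)  =>  sqrt(n) = b / c  =>  n = (b / c)^2
n0 = 1000
c = 50.0 / n0**2     # quadratic constant from the measurement
b = 150.0 / n0**1.5  # n^(3/2) constant from the measurement

crossover = (b / c) ** 2
print(crossover)  # ~9000.0: below this size the quadratic algorithm is faster,
                  # above it the O(n^(3/2)) algorithm wins
```

This matches the two cases above: 4000 is below the break-even point of 9000, and 10,000 is above it.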