Consider the following scenario. You have two solutions to solve a given problem. Option A’s time efficiency is in O(n log n), while Option B’s time efficiency is in O(n2). You therefore hypothesize that Option A is always better than Option B. Yet when you test the two competing methods, you find that Option A is only faster if n is at least 100 (n >= 100). Option B is faster if n is less than 100 (n < 100). How is this possible?
This is possible because Big-O notation hides constant factors. T_A(n) = O(n*log(n)) only guarantees that, for some constant c1 and sufficiently large n,
T_A(n) <= c1*n*log(n)
and likewise T_B(n) = O(n^2) means that for some constant c2,
T_B(n) <= c2*n^2
Nothing forces c1 to be small. If c1 is much larger than c2, then for small n we have c1*n*log(n) > c2*n^2, so Option B is faster there, while for large n the extra factor of n in n^2 dominates and Option A wins. The observed crossover at n = 100 means
c1*100*log(100) = c2*100^2
which gives the ratio c1/c2 = 100/log(100). With that ratio, c1*n*log(n) > c2*n^2 for 2 <= n < 100 (Option B faster) and c1*n*log(n) < c2*n^2 for n > 100 (Option A faster), exactly as observed in the tests.
[Figure: plot of c1*n*log(n) and c2*n^2, with the crossover point n = 100 marked with an asterisk.]
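To make this concrete, here is a small sketch of the two running-time models. The constants are illustrative assumptions: c2 is fixed at 1 and c1 = 100/log2(100) is chosen so that the two curves cross exactly at n = 100.

```python
import math

# Hypothetical running-time models (constants are assumptions for illustration).
C2 = 1.0
C1 = 100 / math.log2(100)  # chosen so the curves cross at n = 100

def time_a(n):
    # Option A: c1 * n * log(n) model
    return C1 * n * math.log2(n)

def time_b(n):
    # Option B: c2 * n^2 model
    return C2 * n * n

# Option B is faster below the crossover, Option A is faster above it.
for n in [10, 50, 99, 101, 200]:
    faster = "A" if time_a(n) < time_b(n) else "B"
    print(f"n={n:4d}  A={time_a(n):10.1f}  B={time_b(n):10.1f}  faster: {faster}")
```

Running this shows Option B winning for n below 100 and Option A winning above it, even though Option A has the better asymptotic growth rate.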