User:Gkhan/Asymptotic Notation
The two most important aspects of any algorithm (except perhaps correctness) are speed and memory consumption. How do you measure these? The most obvious way, perhaps, is to determine a function of the size of the input. For instance, an algorithm might be found to use, say, 3n² + 4n units of time on an input of size n (in this discussion the main focus will be on speed, but the same applies to memory consumption). Is this a good way to determine the speed of an algorithm? Yes and no. When two algorithms are very similar in time consumption, a clear-cut function might help to understand which algorithm is faster under which conditions. But in the majority of cases it is just too cumbersome to figure out an exact speed, especially in higher-level languages. Conditions also vary wildly from system to system, so there is never one definitive function. So what do you use? What is really interesting here is not what the function is exactly, but how it grows. What does this mean? Consider, for example, the two functions

    f(n) = n³ + 50n² + 100n
    g(n) = n³ + 4n
They look quite different, but how do they behave? Let's look at a few plots of the functions at different scales (f(n) is in red, g(n) in blue).
In the first, very limited plot the curves appear somewhat different. In the second they start to behave in roughly the same way, in the third there is only a very small difference, and in the last they are virtually identical. In fact, both approach n³, the dominant term. When n gets large, the other terms become minuscule in comparison to n³.
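Since the claim is only about growth, it is easy to check numerically. Below is a minimal Python sketch, using the illustrative f and g defined above, that prints the ratio of each function to the dominant term n³; both ratios head toward 1 as n grows.

    def f(n):
        # Illustrative function from the text: n^3 plus lower-order terms.
        return n**3 + 50 * n**2 + 100 * n

    def g(n):
        # Second illustrative function, also dominated by n^3.
        return n**3 + 4 * n

    for n in (10, 100, 1000, 10000):
        # Ratio of each function to the dominant term n^3;
        # both columns approach 1.0 as n grows.
        print(n, f(n) / n**3, g(n) / n**3)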
As you can see, if you improve an algorithm's non-dominant terms, it doesn't change much. What really matters is the dominant term. This is why we adopt a new notation for this. We say that

    f(n) = O(n³)

We ignore the low-order terms. We can say that

    g(n) = O(n³)

as well: for the purpose of growth, the two functions are considered the same.
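Big-O notation also has a precise meaning. One standard definition is:

    f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ \text{such that}\ f(n) \le c \cdot g(n) \ \text{for all}\ n \ge n_0

In words: beyond some point, f(n) never exceeds a fixed constant multiple of g(n).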
This gives us a way to easily compare algorithms with each other. Bubblesort sorts in O(n²) time in the worst case. Mergesort sorts in O(n log n) time. Therefore, for large inputs, mergesort is faster than bubblesort.
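To make that comparison concrete, here is a minimal Python sketch that implements both sorts and counts element comparisons. The implementations are generic textbook versions, not taken from this page; the exact counts depend on the input, but the n² versus n log n gap shows up quickly.

    import random

    def bubblesort(a):
        # Returns (sorted copy, number of element comparisons).
        a = list(a)
        comparisons = 0
        for end in range(len(a) - 1, 0, -1):
            for i in range(end):
                comparisons += 1
                if a[i] > a[i + 1]:
                    a[i], a[i + 1] = a[i + 1], a[i]
        return a, comparisons

    def mergesort(a):
        # Returns (sorted copy, number of element comparisons).
        if len(a) <= 1:
            return list(a), 0
        mid = len(a) // 2
        left, cl = mergesort(a[:mid])
        right, cr = mergesort(a[mid:])
        merged, comparisons = [], cl + cr
        i = j = 0
        while i < len(left) and j < len(right):
            comparisons += 1
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged, comparisons

    for n in (100, 1000, 5000):
        data = [random.random() for _ in range(n)]
        _, bc = bubblesort(data)
        _, mc = mergesort(data)
        print(f"n={n}: bubblesort {bc} comparisons, mergesort {mc} comparisons")

Already at a few thousand elements the bubblesort count is a couple of orders of magnitude larger, which is exactly the n² versus n log n gap described above.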
See "Big O notation" http://en.wikipedia.org/wiki/Big_O_notation