Post date: Oct 21, 2013 5:53:42 AM
Order of Growth: Whenever we work with algorithms, the most critical factor we consider is their efficiency, so we are interested in analyzing how the running time increases as the input size increases. When two algorithms are compared with respect to their behavior for large input sizes, the measure is known as the order of growth. The order of growth can be estimated by taking into account the dominant term of the running-time expression: the dominant term overtakes the values of the other terms when the input size is large.
- Let us consider T(n) = an + b where a > 0. If the input size n is multiplied by k, then the dominant term of T(n), i.e. an, is also multiplied by k: T(kn) = k(an) + b. We say that T(n) has a linear order of growth.
- Let us consider T(n) = an² + bn + c where a > 0. If the input size n is multiplied by k, this produces T(kn) = (k²a)n² + (kb)n + c; the dominant term is multiplied by k², so we say that T(n) has a quadratic order of growth. The same reasoning extends to cubic, logarithmic, and other orders of growth.
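The scaling behavior above can be checked numerically. The sketch below uses arbitrary coefficients a, b, c chosen purely for illustration: multiplying n by k multiplies a linear T(n) by roughly k, and a quadratic T(n) by roughly k².

```python
# Illustration: scaling the input size n by a factor k scales the dominant
# term by k (linear case) or k^2 (quadratic case).
# The coefficients a, b, c are arbitrary example values.

def t_linear(n, a=3, b=5):
    return a * n + b

def t_quadratic(n, a=3, b=5, c=7):
    return a * n**2 + b * n + c

n, k = 1000, 10

# Linear: T(kn) / T(n) approaches k for large n
print(t_linear(k * n) / t_linear(n))        # close to k = 10

# Quadratic: T(kn) / T(n) approaches k^2 for large n
print(t_quadratic(k * n) / t_quadratic(n))  # close to k^2 = 100
```

The ratios are not exactly 10 and 100 because the lower-order terms (b, c) still contribute a little, but their influence shrinks as n grows, which is exactly what "dominant term" means.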
Thus, for efficiency on large problems, the analysis is made for large values of the input size, where the dominant term plays the most important role. Analysis conducted for large values of the input size is called asymptotic analysis.
In asymptotic analysis, an algorithm A1 is considered better than an algorithm A2 if the order of growth of the running time of A1 is lower than that of A2. Asymptotic efficiency is therefore concerned with how the running time of an algorithm increases with the size of the input in the limit, as the input size increases without bound.
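A lower order of growth wins even against a smaller constant factor, once the input is large enough. A minimal sketch, using hypothetical running-time functions (the constants 100 and 1 are invented for illustration):

```python
# Hypothetical running times of two algorithms:
# A1 has a large constant factor but linear order of growth,
# A2 has a small constant factor but quadratic order of growth.

def a1(n):  # linear order of growth
    return 100 * n

def a2(n):  # quadratic order of growth
    return n * n

# For small n, A2 looks better; past the crossover point (n = 100 here),
# the lower order of growth of A1 dominates.
for n in (10, 100, 1000, 10000):
    faster = "A1" if a1(n) < a2(n) else "A2 (or tied)"
    print(f"n={n}: A1 takes {a1(n)}, A2 takes {a2(n)} -> faster: {faster}")
```

This is why asymptotic analysis focuses on the limit as n grows without bound: constant factors decide small inputs, but order of growth decides large ones.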
Asymptotic Notations: For the efficiency analysis of every algorithm, we have three basic cases:
- Best Case
- Average Case
- Worst Case
And there are 5 notations used to represent them: