29/07/2011

Time Complexity

The time complexity of an algorithm is a function that describes the amount of time the algorithm takes in terms of the size of its input. "Time" here means the processing time consumed while the algorithm runs, so the counts of all the steps it executes are added up to give the total running time. This total may be a constant or some function of n, the input size.
There are certain asymptotic notations with which we express these complexities; I will deal with them in my next post. Here I will focus on the common types of time complexity.
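As a rough sketch (Python, with illustrative names), counting the steps of a simple summation loop shows how the total comes out as a function of n:

def sum_array(arr):
    total = 0            # 1 step
    for x in arr:        # the loop body runs n times
        total += x       # 1 step per iteration
    return total         # 1 step
# Roughly n + 2 steps in all, so the running time is a linear function of n.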
Since different algorithms may take different amounts of time even on inputs of the same size, the most commonly used measure is the worst-case time complexity, denoted T(n): the maximum amount of time taken on any input of size n. Time complexities are classified by the nature of the function T(n). For example, an algorithm's running time may grow linearly with the input size, or quadratically, cubically, or even exponentially. We express this as O(g(n)), read as "big-O of g(n)". Suppose two algorithms are written for the same problem, one of time complexity O(n) and the other of O(n^2), and the input size is made 4 times larger: the quadratic algorithm will take about 4^2 = 16 times the time it took previously, while the linear algorithm will take only about 4 times as long. Hence the linear algorithm is the better one.
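Here is a minimal sketch (Python, step counts rather than a real benchmark, function names are my own) of that comparison, quadrupling the input size for a linear and a quadratic routine:

def linear_steps(n):
    # O(n): one pass over the input
    return sum(1 for _ in range(n))

def quadratic_steps(n):
    # O(n^2): a nested pass over the input
    return sum(1 for _ in range(n) for _ in range(n))

for n in (100, 400):                    # input size made 4 times larger
    print(n, linear_steps(n), quadratic_steps(n))
# linear_steps:    100 -> 400         (4 times the work)
# quadratic_steps: 10,000 -> 160,000  (16 times the work)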

Generally, the different time complexities that we come across are linear [O(n)], quadratic [O(n^2)], cubic [O(n^3)], logarithmic [O(lg n)] and exponential [O(2^n)]. Here "lg n" means "log n to the base 2". It is evident that:
Time Complexities of different functions
O(lg n) < O(n) < O(n lg n) < O(n^2) < O(n^3) < O(2^n) for all n > 1.
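To see this ordering concretely, a small sketch (Python, purely illustrative) can print the value of each growth function for a few sample input sizes:

import math

# For each sample n, print lg n, n, n lg n, n^2, n^3 and 2^n.
# The gaps between the columns widen rapidly as n grows, which is
# exactly the ordering stated above.
for n in (4, 16, 64):
    print(n,
          math.log2(n),       # lg n
          n,                  # n
          n * math.log2(n),   # n lg n
          n ** 2,             # n^2
          n ** 3,             # n^3
          2 ** n)             # 2^n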
