Aug 11, 2013

Run-time complexity - Analysis of loops

It's the very first thing one should understand before studying algorithms: what is the running-time complexity of a block of code, and how do we find it?
For beginners it's very important to understand the running-time complexity of simple loops, since it is a commonly asked question in preliminary interview rounds. I came across such questions in my own interviews, so I documented them here for my reference and for others too :)

We will proceed with some sample code, its associated complexity, and short analysis notes.
  • O(1) or constant time complexity: Constant time complexity means the time taken to execute a particular piece of code is independent of the size of the input. In other words, we can place a constant upper bound on how long the program will take to run, unaffected by any of the input parameters. "Time complexity of a function (or set of statements) is considered as O(1) if it doesn't contain a loop, recursion, or a call to any other non-constant-time function. Please note that a loop or recursion that runs a constant number of times is also considered O(1)." Let's consider the following snippet:
     for (int i = 1; i <= c; i++) {
         // some operation of constant complexity, e.g. a swap
     }
    Analysis: Since the loop runs a constant number of times, the complexity of this for loop is O(1). Note that a swap operation itself has complexity O(1). This is also a good question to trip someone up with, especially in an interview.
  • Order of O(n) complexity or linear time complexity: Linear time complexity means the time taken to execute the block of code is on the order of the size of the input. The time complexity of a loop is considered O(n) if the loop variable is incremented or decremented by a constant amount. Let's consider the following code:
     for (int i = 1; i <= n; i += c) {
         // some O(1) expressions
         // Here n is the input size and c is the constant
         // by which i is incremented each time.
     }
    Analysis: Since the loop runs on the order of n times (about n/c iterations, and c is a constant), the running time of this loop is O(n). Note that if, instead of incrementing by a constant, we decrement by the same constant, the running time is still O(n). After this modification the for loop looks like this:
      for (int i = n; i > 0; i -= c) {  // i is decremented by c
          // some O(1) expressions
      }
  • Order of O(n^2) time complexity / nested-loop complexity: The time complexity of nested loops is proportional to the number of times the innermost statement is executed. Let's consider:
     for (int i = 1; i <= n; i += c) {
         for (int j = 1; j <= n; j += c) {
             // innermost statement with respect to the outer loop
             // some O(1) expressions
         }
     }
    Analysis: Here the innermost statement executes (n/c) × (n/c) times, which is order O(n^2). (In the triangular variant where the inner loop starts at j = i, the innermost statement runs n + (n-1) + (n-2) + ... + 1 = n(n+1)/2 times, which is still O(n^2).) Selection sort and insertion sort have O(n^2) time complexity.
    Please note: In the snippet above, the loop variable is incremented by a constant value. This matters; let's see below why.
  • Order of O(log n) time complexity: Now we change the update step in the for loop from adding a constant to multiplying by one. The run-time complexity of the loop improves from O(n) to O(log n). Let's consider:
     for (int i = 1; i <= n; i *= c) {
         // some O(1) expressions
     }
    Analysis: Since the loop variable is multiplied by a constant on each iteration, the running time is O(log n). How does that happen? Let's take a simple example, assuming c = 2 for simplicity. Seen in reverse, the problem size is cut down by a constant factor on each iteration, so the algorithm must terminate after O(log n) iterations: after O(log n) divisions by a constant, the problem size shrinks to 0 or 1. If n = 16, how many times can you divide n by two before you get a number less than or equal to one? Four times. If n = 128? Seven times.
    So what is happening? For 16, log2 16 = 4; for 128, log2 128 = 7; and so on. We can conclude that by dividing by 2 each time we halve the remaining size, so instead of n iterations the loop takes only O(log n) iterations.
    Reference link: An algorithm to have O(log n) complexity
  • Order of O(log log n) time complexity: Now let's add one more twist to the loops. What happens if, instead of multiplying or dividing by a constant, we take the square root of the loop variable, or raise it to a constant power, so that it changes doubly exponentially? Let's consider this code:
     for (int i = 2; i <= n; i = pow(i, c)) {
         // some O(1) expressions
     }

    Analysis: The run-time complexity is O(log log n). With c = 2 the loop variable takes the values 2, 4, 16, 256, ..., i.e. 2^(2^k) after k iterations, so the loop stops once 2^(2^k) exceeds n, which happens after about log log n iterations. The link below best explains how this happens: An algorithm to have O(log log n) complexity

Reference :  geeksforgeeks

Thanks and happy learning!!

Location: Coimbatore, Tamil Nadu, India