2 Replies - 14437 Views - Last Post: 29 December 2011 - 09:57 PM

#1 Vi Sky

  • New D.I.C Head

Reputation: 0
  • Posts: 5
  • Joined: 29-December 11

How to find time complexity of an algorithm

Posted 29 December 2011 - 08:21 PM

Hi everyone.

I'm currently a student studying computer science and engineering, and we are learning about algorithms, but I'm having a hard time and can't clearly understand how to find the time complexity of an algorithm.



here i have one example:


begin
Input: n(pos. Integer)
Output: y(pos. Integer)
Other: x,z(pos. Integer)

x:= 2*n
y := 0;
    while x>0 do
       y:=y+1;
          z:=0;
    while z<2 do
    x := x - 1;
    z := z+1;
end





Can someone tell me how to find the time complexity of this code, or of any other algorithm?


Thanks! Help is appreciated.


Replies To: How to find time complexity of an algorithm

#2 r.stiltskin

  • D.I.C Lover
  • member icon

Reputation: 1833
  • Posts: 4,927
  • Joined: 27-December 05

Re: How to find time complexity of an algorithm

Posted 29 December 2011 - 09:34 PM

Well, I don't think a forum is the right place to look for a thorough understanding of complexity. Many volumes have been written on the subject. I suggest you get your hands on "Introduction to Algorithms" by Cormen & Leiserson for a good introduction to this topic.

As for the example you posted -- there seems to be an error. If the input n is greater than 0, the first while loop will never terminate. It only makes sense if the second while loop is enclosed within the first one. If that's the case, you have to analyze how many operations will be performed in relation to the size of the input n. That entails considering how many operations (comparisons, arithmetic operations, and assignments) are performed by the algorithm. When loops are involved you have to consider the number of operations in each iteration of each loop, multiplied by the number of iterations.

#3 Karel-Lodewijk

  • D.I.C Addict
  • member icon

Reputation: 449
  • Posts: 849
  • Joined: 17-March 11

Re: How to find time complexity of an algorithm

Posted 29 December 2011 - 09:57 PM

There is no one method that fits all, but to summarize it a little:

First you should decide what can be done in constant time and what the dependent variables are. For example, when we are talking about multiplication algorithms, we calculate complexity as a function of the number of digits. If you want to be really formal, you could define the complexity in terms of a Turing machine. But usually we say, somewhat informally, that a constant set of machine instructions takes constant time, and the dependent variable of the complexity is the size of the input.

Complexity often depends on characteristics of the input other than its size. For example, if a sorting algorithm gets data that is already sorted, it might perform much better. Therefore, unless otherwise specified, we talk about the worst-case complexity, but a best-case and certainly an average-case complexity are also sometimes mentioned if they differ.

Also, computer scientists usually use big O notation, which gives an upper bound on the complexity. Big O notation drops constant factors and lower-order terms. Some books also use big Theta for a tight bound and big Omega for a lower bound, which work the same way; others don't make that distinction.

How you go from there really depends on the code. For loops are simple enough: the loop headers basically tell you how many times the loop executes. However, when the loop variables depend on the loop variables of other loops, it might not be so simple; in that case counting the operations is a good approach. For example:

for (int i = 0; i < n; i++) {
    // O(1)
}



This will clearly just run n times. When for loops are nested but the loop variables are not dependent on each other, you can just multiply.

When code runs after other code, the complexities should be added, not multiplied; with loops this is no different.

for (int i = 0; i < n; i++) {
   // O(1)
}
for (int j = 0; j < n; j++) {
   // O(1)
}



This is two loops, one after the other; each runs n times. So together this takes 2n time, which is O(n).

When the loops are nested, but the loop variables independent, like this:

for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        // O(1)
    }
}



Then their complexities should be multiplied. After all, the inner loop runs n times each time the outer loop runs once. Since the outer loop also runs n times, the inner body executes n*n times, which is O(n^2).

But sometimes it gets a little trickier, take this example:

for (int i = 0; i < n; i++) {
    for (int j = i; j < n; j++) {
         // O(1)
    }
}



The inner loop runs n times on the first pass, but only n-1 times on the second pass, and so forth.

If you run into difficulties with things like this, just count the number of operations. In this case:

n + (n-1) + (n-2) + ... + 1 = 1 + 2 + ... + n

Usually you'll get some kind of mathematical series, often one that is well known or that can be bounded by a well-known series. I won't go into the mathematics here. In this case it is the sum of the first n natural numbers, which is (n*(n+1))/2.

So in our example:

O((n(n+1))/2) = O((1/2) * (n^2 + n)) = O(n^2). We dropped the 1/2 constant and the n term because n is of a lower order than n^2.

For complex calculations, this counting of operations is sometimes done on an algorithm-wide scale; we then talk about amortized analysis.


For other kinds of loops it might not be immediately obvious, but there is usually one loop variable or a limited set of them. If your algorithm is recursive, you might want to look into something called the Master Theorem.

Note that calculating the complexity from the code is not always the best way. Think about the algorithm at a more abstract level before or after you try to implement it.

Well, that was a very basic primer on complexity.

As for your piece of code:

x:= 2*n
y := 0;
    while x>0 do
       y:=y+1;
          z:=0;
    while z<2 do
    x := x - 1;
    z := z+1;
end



I'll assume here that n is the dependent variable. Then we have two loops:

x:= 2*n
while (x > 0) {
   z = 0
   while (z < 2) {
      x--
      z++
   }
}



Notice that the inner loop executes exactly twice, in other words a constant number of times, so we only have the outer loop to worry about. If you look closely, it runs n times: x starts at 2*n and decreases by 2 on every outer iteration, which gives n iterations, or O(n).
