# Big O notation/Time complexity/Algorithm analysis


## 7 Replies - 29873 Views - Last Post: 28 February 2010 - 06:45 PM

### #1 nebulinda

Reputation: 0
• Posts: 11
• Joined: 21-April 09

# Big O notation/Time complexity/Algorithm analysis

Posted 11 September 2009 - 10:14 AM

I'm in a 200 level CS class called Data Structures and Algorithms, and I'm having a hard time understanding how to find the time complexity of an algorithm. I don't get how you look at an algorithm and decide if it's n, or log(n), or what.

I can usually tell that if something is just assignments, that's constant, and a loop is n, and a nested loop is n^2, n^3, etc.

But then there's something like this:
```
sum = 0;
for (int j = 1; j < n; j *= 2) {
    sum++;
}
```

Apparently that's O(log(n)), but I have no idea where that comes from. My professor seems to expect that we can just look at an algorithm and know what its time complexity is.

And we had a quiz question recently that said, "For the typical algorithm that you use to perform calculations by hand, determine the running time in terms of n to a) add two n-digit numbers, and b) multiply two n-digit numbers."

I said that adding would be constant, and multiplying would be n. But the right answer is that adding is n, and multiplying is n^2, and I don't understand why.

Is there some method by which I can figure this out?

Help appreciated, thanks!

This post has been edited by nebulinda: 11 September 2009 - 02:33 PM


## Replies To: Big O notation/Time complexity/Algorithm analysis

### #2 William_Wilson

• lost in compilation

Reputation: 207
• Posts: 4,812
• Joined: 23-December 05

## Re: Big O notation/Time complexity/Algorithm analysis

Posted 11 September 2009 - 11:28 AM

It takes practice, and you need to really understand how code works and, more importantly, how logarithmic functions/loops work. Anything that is big O of a log is usually the result of some binary-halving approach, like an efficient search of a binary tree. (At least this is the best way I found to visualize it.)

In the example you gave, the values looped over run from 1 to n. That much I assume you could already determine, which means a worst case of n if we were to visit each value one by one, i.e. a linear search. However, the loop steps by j *= 2, which doubles the value of j on each iteration.
*This is where the binary tree part comes in: the loop's progress maps directly onto the path down a balanced binary tree.

A short example:
```
        o
      /   \
     o     o
    / \   / \
   o   o o   o
```

In this diagram, n would be 7. To traverse linearly would take 7 iterations, but with j *= 2 we make a left-or-right decision at each branch, cutting the tree to be searched in half with each iteration; as far as the example goes, that is 3 steps in the worst case. Thus it runs in O(log n), which is MUCH faster than linear.

I hope that clears it up a little; if not, feel free to ask more questions.

### #3 mojo666

Reputation: 408
• Posts: 882
• Joined: 27-June 09

## Re: Big O notation/Time complexity/Algorithm analysis

Posted 11 September 2009 - 01:02 PM

log(n) growth means you need to double (or triple, quadruple, etc.) the input size in order to increase the run time by 1. If log(n) = t, then log(2n) = t + 1 (for a base-2 log). In the example you gave, doubling n results in only one additional loop iteration; doubling it again adds one more.

For the quiz problem, try to code at least the looping parts of the algorithm.

```
// digits stored least-significant first
int n1[n], n2[n], result[n + 1];
int carry = 0;
for (int i = 0; i < n; i++) {
    int s = n1[i] + n2[i] + carry;
    result[i] = s % 10;   // this column's digit
    carry = s / 10;       // carry into the next column
}
result[n] = carry;
// you add each of the n digit pairs once: this is O(n)
```

```
// digits stored least-significant first
int n1[n], n2[n], result[2 * n];  // product can have up to 2n digits
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        result[i + j] += n1[i] * n2[j];
// (plus one O(n) pass afterwards to propagate carries)
// One nested loop, since each digit of n1 meets each digit of n2: O(n^2)
```

This post has been edited by mojo666: 11 September 2009 - 01:03 PM


## Re: Big O notation/Time complexity/Algorithm analysis

Posted 15 September 2009 - 05:50 PM

How many times would you have to multiply 1 by 2 to get a number equal to or greater than n? If n is a power of 2, you would have to perform the multiplication log n times; otherwise you'd multiply it log n + 1 times. Either way, that's O(log n).


## Re: Big O notation/Time complexity/Algorithm analysis

Posted 28 February 2010 - 05:57 PM

> nebulinda, on 11 September 2009 - 09:14 AM, said:
>
> I'm in a 200 level CS class called Data Structures and Algorithms, and I'm having a hard time understanding how to find the time complexity of an algorithm. [...] Is there some method by which I can figure this out?

### #6 erik.price

• D.I.C Lover

Reputation: 486
• Posts: 2,690
• Joined: 18-December 08

## Re: Big O notation/Time complexity/Algorithm analysis

Posted 28 February 2010 - 05:59 PM

...why did you reply to an old post just to quote someone else?


## Re: Big O notation/Time complexity/Algorithm analysis

Posted 28 February 2010 - 06:16 PM

> nebulinda, on 11 September 2009 - 09:14 AM, said:
>
> I'm in a 200 level CS class called Data Structures and Algorithms, and I'm having a hard time understanding how to find the time complexity of an algorithm. [...] Is there some method by which I can figure this out?

You could try using my Big O Analyser for .NET to find the relationship...

http://www.codeproje...gOAnalyzer.aspx

### #8 erik.price

• D.I.C Lover

Reputation: 486
• Posts: 2,690
• Joined: 18-December 08

## Re: Big O notation/Time complexity/Algorithm analysis

Posted 28 February 2010 - 06:45 PM