9 Replies - 2637 Views - Last Post: 14 September 2009 - 01:29 PM

#1 needhelp83

  • New D.I.C Head

Reputation: 0
  • Posts: 5
  • Joined: 11-September 09

Time Complexity

Posted 11 September 2009 - 04:14 PM

Alright, I didn't get this one quite right on the homework, but I'm trying to figure out exactly how to solve it. The answer was n log n, but how do you total this up when going through the loops?



i = n                            1
while i >= 1                     n
    j = i                        1
    while j <= n                 log n?
        <body of the j loop>     Theta(1)
        j = j * 2                1
    end j while
    i = i - 1                    1
end i while




Now when I total it up and disregard the 1's, I'm left with n. I assume the inner loop is log n, but I don't fully understand why. Any help?
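If it helps to see the loops actually run, here is a direct C++ translation of the pseudocode (just a sketch; the counters bodyRuns and maxInnerRuns are added for illustration and aren't part of the original problem):

#include <cstdio>

int main() {
    const long long n = 1000;        // pick any n to experiment with
    long long bodyRuns = 0;          // total executions of the inner body
    long long maxInnerRuns = 0;      // most inner iterations seen for one value of i

    for (long long i = n; i >= 1; --i) {        // outer loop: runs n times
        long long innerRuns = 0;
        for (long long j = i; j <= n; j *= 2) { // inner loop: j doubles until it passes n
            ++bodyRuns;                         // <body of the j loop>, Theta(1) work
            ++innerRuns;
        }
        if (innerRuns > maxInnerRuns) maxInnerRuns = innerRuns;
    }

    // The outer loop runs n times, and no single inner pass runs more than
    // about log2(n) + 1 times, which is where the n log n bound comes from.
    std::printf("n = %lld  total inner-body runs = %lld  max runs in one inner pass = %lld\n",
                n, bodyRuns, maxInnerRuns);
    return 0;
}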


Replies To: Time Complexity

#2 Guest_Neumann*



Re: Time Complexity

Posted 11 September 2009 - 04:57 PM

I don't think that completely writing out the solution, including all the mathematical lingo, will be of much help here, so I'll be brief. Yes, the first while loop is O(n) because it iterates n times. In the inner loop, j gets multiplied by 2 until it becomes greater than n. The initial value of j is i, which ranges from n down to 1, so j will be multiplied at least once and at most about log n times: when it starts at n it is multiplied once, and when it starts at 1 it is multiplied about log n times. Therefore the inner loop is O(log n).
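A quick way to sanity-check those two extremes is to count the doublings directly (a small C++ sketch; the helper name countDoublings is just for illustration):

#include <cstdio>

// Counts how many times the inner body runs when j starts at `start`
// and keeps doubling until it exceeds n.
long long countDoublings(long long start, long long n) {
    long long runs = 0;
    for (long long j = start; j <= n; j *= 2)
        ++runs;
    return runs;
}

int main() {
    const long long n = 1024;
    std::printf("j starts at n: %lld run(s)\n", countDoublings(n, n)); // exactly 1
    std::printf("j starts at 1: %lld run(s)\n", countDoublings(1, n)); // floor(log2 n) + 1 = 11
    return 0;
}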

#3 needhelp83

Re: Time Complexity

Posted 11 September 2009 - 06:38 PM

Neumann, on 11 Sep, 2009 - 03:57 PM, said:

I don't think that completely writing out the solution, including all the mathematical lingo, will be of much help here, so I'll be brief. Yes, the first while loop is O(n) because it iterates n times. In the inner loop, j gets multiplied by 2 until it becomes greater than n. The initial value of j is i, which ranges from n down to 1, so j will be multiplied at least once and at most about log n times: when it starts at n it is multiplied once, and when it starts at 1 it is multiplied about log n times. Therefore the inner loop is O(log n).


Awesome, now with that said I am assuming you multiply the outer loop by the inner loop to give me n log n and I just drop the 1's?

#4 KYA

  • g++ jameson.cpp -o beverage

Reputation: 3116
  • Posts: 19,154
  • Joined: 14-September 07

Re: Time Complexity

Posted 12 September 2009 - 04:47 PM

Yup, read me for the simple rules of Big O

You drop constants.

#5 needhelp83

Re: Time Complexity

Posted 13 September 2009 - 02:19 PM

KYA, on 12 Sep, 2009 - 03:47 PM, said:

Yup, read me for the simple rules of Big O

You drop constants.


Thanks KYA, this link was an amazing help!

#6 Guest_Neumann*



Re: Time Complexity

Posted 14 September 2009 - 12:36 PM

Sorry, I was away for a couple of days...

needhelp83, on 11 Sep, 2009 - 05:38 PM, said:

Awesome, now with that said I am assuming you multiply the outer loop by the inner loop to give me n log n

No, I'd have to use some math to explain exactly what happens - it involves a ceiling function (http://en.wikipedia.org/wiki/Floor_and_ceiling_functions) and a simple inequality. You can think of it that way for now, but be aware that that's not how it was actually calculated.
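For what it's worth, here is one way such a bound can be written down (just a sketch of the inequality - not necessarily the exact calculation Neumann has in mind). For any starting value of j between 1 and n, the inner loop runs at most ceil(log2 n) + 1 times, so:

total work <= n * (ceil(log2 n) + 1) = n * ceil(log2 n) + n, which is O(n log n)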

Quote

and I just drop the 1's?

No, some of those 1's stay - they are your basic operations, and you have to calculate how fast the count of those 1's grows as n grows. That's the first part of analyzing an algorithm - determining the basic operation. The other 1's, like the ones outside your loops, may be ignored - they don't contribute much of anything.

#7 KYA

Re: Time Complexity

Posted 14 September 2009 - 12:41 PM

Yeah, it depends on context, but generally the rule of thumb is to drop constants, e.g.:

2n -> n
n + 1 -> n
etc.



#8 Guest_Neumann*



Re: Time Complexity

Posted 14 September 2009 - 01:02 PM

Well, those constants can be dropped after you're done with all the calculations, in order to make the Big-O look "simpler". But while calculating the Big-O, what is usually being done is overestimating the number of operations performed, because that's the nature of Big-O. So,

n + 1 < n + n = 2n (for n greater than 1)

But such constants drop off at the end of the calculation anyway. Most of the time it's just a matter of formality. Let me add that the parts of the answer with a lesser big-O can be dropped too, so

O(n log n + n) = O(n log n) + O(n) = O(n log n). In fact, that's what happened in this case. My final answer was

1/2 * n * log n + n, which can be considered O(n log n).
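To spell out why the lone n can be dropped there (a quick sketch): for n >= 2 we have log n >= 1, so n <= n log n, and therefore

n log n + n <= n log n + n log n = 2 * n log n (for n >= 2)

which is why O(n log n + n) = O(n log n); the leftover constant 2 gets dropped at the end, just like before.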

#9 KYA

Re: Time Complexity

Posted 14 September 2009 - 01:21 PM

Right, which isn't a case of dropping a constant, but rather of keeping the term with the largest growth rate.

#10 Guest_Neumann*



Re: Time Complexity

Posted 14 September 2009 - 01:29 PM

No, it wasn't a case of a constant - that's why I said "too". I was simply making a general point: no operations should be dropped while calculating the Big-O.
