2 Replies - 1645 Views - Last Post: 24 April 2014 - 09:57 AM

#1 idontkn0w
basic big Oh question, need some clarification

Posted 24 April 2014 - 06:34 AM

Hey, I attached the question and answer to the post.
Attached Image
Basically I understand everything outside of the black and red boxes.

Black box: why are these two classes different (O(n log n) and O(n))? Is there a quick way to determine whether 5n + 2n log n + 3 is O(n log n) or O(n)?

Red box: how can you tell from looking at the function that it lies between these two classes?

Many thanks in advance. I have exams coming up, so any help is highly appreciated. :)

Replies To: basic big Oh question, need some clarification

#2 macosxnerd101

Re: basic big Oh question, need some clarification

Posted 24 April 2014 - 07:01 AM

Note that O(n) is a subset of O(n log n). By the definition of Big-O, it is enough that n <= n log(n) for all sufficiently large n; dividing through by n, this is equivalent to 1 <= log(n), which holds for all n >= 2 (with base-2 logarithms) because log(n) is an increasing function.
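
Spelled out with the definition (a sketch of the standard containment argument, in LaTeX notation, using base-2 logarithms so that log_2(2) = 1):

    % If f is bounded by c*n past n_0, the same constant bounds it by c*n*log_2(n)
    % once n is also at least 2, since log_2(n) >= 1 there.
    f(n) \in O(n)
      \iff \exists\, c > 0,\ n_0 \ \text{such that}\ f(n) \le c\,n \ \text{for all } n \ge n_0
      \implies f(n) \le c\,n \le c\,n \log_2 n \ \text{for all } n \ge \max(n_0, 2)
      \implies f(n) \in O(n \log n)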

So in order to determine how f(n) compares to g(n), we take L = lim(n -> infinity) f(n)/g(n). If L = 0, then f(n) = o(g(n)), which implies that f(n) = O(g(n)). Note that little-o is a strict inequality, while Big-O is a weak inequality.

If 0 < L < infinity, then f(n) = Theta(g(n)), so f(n) is both O(g(n)) and Omega(g(n)). If L = infinity, then f(n) = omega(g(n)), which implies f(n) = Omega(g(n)). Again, little-omega is the strict inequality and Big-Omega is the weak inequality.

So here we take lim(n -> infinity) (5n + 2n log n + 3)/(n log n) = lim(n -> infinity) [5/log(n) + 2 + 3/(n log n)] = 2. The ratio converges to a positive constant, so 5n + 2n log n + 3 = Theta(n log n), which is why it belongs to the O(n log n) class rather than O(n).
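
If you want to sanity-check that limit numerically, here is a small sketch (not from the original post; it just evaluates the ratio at increasing n, with base-2 logarithms, though the base only changes constants):

    import math

    def f(n):
        # The black-box function: f(n) = 5n + 2 n log n + 3
        return 5 * n + 2 * n * math.log2(n) + 3

    # The ratio f(n) / (n log n) should settle toward 2 as n grows.
    for n in [10, 10**3, 10**6, 10**9]:
        print(n, f(n) / (n * math.log2(n)))
    # Printed ratios: roughly 3.6, 2.5, 2.25, 2.17 -- drifting down toward 2,
    # consistent with f(n) = Theta(n log n).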

For the red box question, we again note log(n) is an increasing function. So we consider f(n) = n log(n) log(n), i.e., n (log n)^2. Compare this to g(n) = n log n, and we have f(n)/g(n) = log(n). So as we take the limit as n goes to infinity, f(n)/g(n) tends to infinity. This implies that f(n) dominates g(n), so n log n is a lower bound for f(n).

Now if we consider lim(n -> infinity) f(n)/n^2, the limit tends to 0; we are essentially comparing (log n)^2 to n. So by the definition of Big-O, f(n) = O(n^2). The phrase "between" implies that there is a tighter bound for f(n); however, O(n^2) is still a valid upper bound.
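
Here is a similar numeric sketch of those two limits (again not from the original post), showing that the ratio against n log n blows up while the ratio against n^2 shrinks toward 0:

    import math

    def f(n):
        # The red-box function: f(n) = n * (log n)^2
        return n * math.log2(n) ** 2

    for n in [10, 10**3, 10**6, 10**9]:
        lower = f(n) / (n * math.log2(n))  # equals log2(n): grows without bound
        upper = f(n) / n**2                # equals (log2 n)^2 / n: tends to 0
        print(n, lower, upper)
    # f(n) eventually dominates n log n, and n^2 eventually dominates f(n).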

#3 Dogstopper

Re: basic big Oh question, need some clarification

Posted 24 April 2014 - 09:57 AM

Thinking about it less mathematically, you can also look at which terms contribute more and, piecewise, which one dominates. So for (5n + 2nlogn + 3), we can look at 5n, 2nlogn, and 3. Intuitively, the 2nlogn term is going to grow the fastest, since it eventually exceeds the other two. Then a VALID upper bound would be O(nlogn), which literally means: there exist a constant c and an n_0 such that for all n > n_0, c*nlogn is at least as large as the given function. More simply, it upper bounds it (one concrete choice of c and n_0 is sketched below).
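
For instance, the constants below are not from the post, just one pair that happens to work if we take base-2 logarithms (so log2(n) >= 1 once n >= 2, and each term is at most its own coefficient times n log2(n)):

    import math

    def f(n):
        # The black-box function: f(n) = 5n + 2 n log n + 3
        return 5 * n + 2 * n * math.log2(n) + 3

    # One concrete witness for f(n) = O(n log n): c = 10, n_0 = 2.
    # For n >= 2: 5n <= 5 n log2(n), 2 n log2(n) <= 2 n log2(n), 3 <= 3 n log2(n),
    # so f(n) <= (5 + 2 + 3) * n log2(n) = 10 n log2(n).
    c, n_0 = 10, 2
    assert all(f(n) <= c * n * math.log2(n) for n in range(n_0, 100_000))
    print("f(n) <= 10 * n * log2(n) for every n checked")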

You can take a similar approach for the other one in the black box.

The red one, as Mac said, can have an upper bound of n^2, but that is not the BEST upper bound. In addition, we can intuitively see that nlogn is ALWAYS < n(logn)^2 for large values of n (divide both by nlogn and you are just comparing 1 to logn). Therefore, nlogn is a strict lower bound (though again, there may be tighter bounds). Thus, it is safe to say that the runtime is BETWEEN O(nlogn) and O(n^2), though we have to do more analysis to find out exactly where it lies; a quick numeric comparison is below.
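
A rough look at the raw values (a sketch along the same intuitive lines, using base-2 logs; the base only shifts constants):

    import math

    # Compare n log n, n (log n)^2, and n^2 at a few sizes of n.
    for n in [64, 1024, 10**6]:
        lg = math.log2(n)
        print(n, n * lg, n * lg**2, n**2)
    # Each row comes out in increasing order once n is big enough,
    # matching the picture of n (log n)^2 sitting between n log n and n^2.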

Again, for your formal proof, you should use limits, as mac said. But sometimes I find that the intuition coming first helps orchestrate how the proof is written.
