5 Replies - 10640 Views - Last Post: 24 April 2010 - 02:23 PM

#1 Guest_Tony*



little oh notation

Posted 24 April 2010 - 11:26 AM

Dear all,

I'm doing a self-study of asymptotic notation and basic computer science, but I'm having some trouble understanding this definition of little-o notation:

"Intuitively, in o-notation, the function f(n) becomes insignificant relative to g(n) as n approaches infinity, that is:

lim x-> infinity f(n)/g(n) = 0"


I understand what little-o notation tells us about an algorithm, but I don't understand what the above limit of the functions tells me.

Little-o notation tells us that for any positive constant c, the function f(n) is bounded from above by cg(n), and the bound is not asymptotically tight. That much I do understand.


Replies To: little oh notation

#2 Penzyak

  • New D.I.C Head

Reputation: 3
  • Posts: 26
  • Joined: 22-April 10

Re: little oh notation

Posted 24 April 2010 - 11:52 AM

In calculus (I don't know whether you've taken it or not), when the denominator grows faster than the numerator, the fraction approaches 0 as x goes to infinity. So let's say:

f(x) = x
g(x) = x^2

Now consider f(x)/g(x) at x = 1, x = 2, x = 3, etc. You will notice that this fraction gets smaller and smaller every time. As x goes to infinity, the fraction approaches 0. This is another way of saying that f(x) becomes insignificant relative to g(x) at large values of x. Yet another way to say this is that f(x) grows much more slowly than g(x) and eventually gets dominated by it.
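You can check that ratio numerically; here is a quick Python sketch (the function names are just for illustration):

```python
# Ratio f(x)/g(x) for f(x) = x and g(x) = x^2.
# It works out to 1/x, so it shrinks toward 0 as x grows.
def f(x):
    return x

def g(x):
    return x ** 2

ratios = [f(x) / g(x) for x in (1, 2, 3, 10, 100, 1000)]
print(ratios)  # each value is 1/x, steadily approaching 0
```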

P.S. I read that you said "little o notation tells us for any constant c that the function f(n) bound from above by cg(n)". This is true. However, since that property is usually associated with Big-O, let me give an example of the difference between the two notations:

Say f(x) = x and g(x) = x. Then f(x) is bounded from above by 2*g(x), which means that f(x) belongs to O(g(x)). However, f(x)/g(x) as x approaches infinity is not 0; it is 1. Therefore, f(x) does not belong to o(g(x)). So basically, little-o only includes functions that are strictly dominated (think of the > sign), whereas Big-O can also include functions that have the same asymptotic growth (think of the >= sign).
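That contrast can be sketched in Python by evaluating the ratio at a large x (the helper name `ratio_at` is mine, not anything standard):

```python
# Estimate lim f(x)/g(x) by sampling the ratio at a large x.
# Ratio tends to a positive constant: f is O(g) but not o(g).
# Ratio tends to 0: f is o(g).
def ratio_at(f, g, x):
    return f(x) / g(x)

f = lambda x: x
same_growth = lambda x: x           # same asymptotic growth as f
strictly_faster = lambda x: x ** 2  # strictly dominates f

print(ratio_at(f, same_growth, 10 ** 6))      # 1.0  -> O but not o
print(ratio_at(f, strictly_faster, 10 ** 6))  # 1e-06 -> o
```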

This post has been edited by Penzyak: 24 April 2010 - 12:07 PM


#3 Guest_Tony*



Re: little oh notation

Posted 24 April 2010 - 12:49 PM

@Penzyak: thank you very much for your explanation.

So if I understand it correctly, little-o tells us that g(x) dominates f(x) and f(x) is always smaller than g(x). The fraction basically shows us that g(x) is the dominant function.

Now when you have little-omega of g(x), which states that:
lim n -> infinity f(n)/g(n) = infinity

f(n) dominates here, since the ratio goes to infinity as n approaches infinity. So f(x) in this case is always greater than g(x), and cg(x) is therefore an asymptotic lower bound of f(x). It holds for all constants c.

Does my understanding sound correct?

#4 Penzyak

Re: little oh notation

Posted 24 April 2010 - 01:59 PM

Your understanding is almost correct. For little-o, f(x) does not have to be smaller than g(x) for all x. It has to be smaller only after a certain value of x. For example:

let f(x) = 5000x and g(x) = x^2

f(x)/g(x) approaches 0 as x approaches infinity, so f(x) is little-o of g(x). However, at x = 1, f(x) is greater than g(x). Only after x becomes greater than 5000 will g(x) be bigger than f(x).

What little-o really tells us is that g(x) always grows at a faster rate than f(x). For example, look how much f(x) grew between x = 1 and x = 2:

f(1) = 5000
f(2) = 10000 (f(x) became twice as big)

Now look at g(x) on the same interval:

g(1) = 1
g(2) = 4 - g(x) became four times bigger

This gap in growth rates only widens at bigger values of x. Now, since g(x) increases at a faster rate and we take x to infinity, at some point g(x) will become larger than f(x). But the crossover point itself is not what little-o is concerned with; it's all about the rate of growth.
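Both claims can be checked numerically; here is a small Python sketch of this specific example:

```python
# f(x) = 5000x is o(x^2): f wins at small x, but g overtakes
# once x passes 5000, and the ratio f/g = 5000/x shrinks to 0.
def f(x):
    return 5000 * x

def g(x):
    return x ** 2

print(f(1) > g(1))              # True: f starts out larger
print(f(5001) > g(5001))        # False: g has overtaken by x = 5001
print(f(10 ** 7) / g(10 ** 7))  # 0.0005, heading toward 0
```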

Now, little-omega is the same thing, except it goes the other way around. So just swap f(x) with g(x) in the discussion above.
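With the roles swapped, a Python sketch of little-omega looks like this: f(x) = x^2 is little-omega of g(x) = x, because the ratio grows without bound.

```python
# Little-omega: f(x)/g(x) -> infinity. Here the ratio equals x,
# so it blows up as x grows.
f = lambda x: x ** 2
g = lambda x: x

for x in (10, 1000, 10 ** 6):
    print(f(x) / g(x))  # prints 10.0, 1000.0, 1000000.0
```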

#5 Guest_Tony*



Re: little oh notation

Posted 24 April 2010 - 02:13 PM

Thanks again for this information, very helpful!!! :)

Now, o(g(n)) is obviously, as you said, a measure of growth. So if we're talking in terms of algorithms now, what little-o is telling us is that when n becomes infinitely huge, g(n) will be far larger than f(n), and so in terms of performance (time taken) the algorithm would take a LONG time to finish.

Am I thinking right here?

So algorithms with o(g(n)) where g(n) = n^2 would take a very long time as the input size got very big, and f(n) would become 'insignificant' in comparison to g(n).

As the growth is exponential.

#6 Penzyak

Re: little oh notation

Posted 24 April 2010 - 02:23 PM

Yes, pretty much. Though if you study algorithms you will mostly deal with Big-O; little-o is more of a mathematical tool. Also, the growth in your example is quadratic, not exponential. Exponential is when you have something like 2^x. But you have the right understanding.
