Having learned a bit about algorithm analysis in my discrete structures class, I've been wondering how other functions are affected by Big O. Could you have O(1/x^2)? Or even better, O(1/n!)? The once-infamous runtime that everyone avoided would suddenly become the one everyone sought to achieve (as if O(1) could get any better).

It caught my attention and I thought it deserved a post on this site, so my questions are:

Can you have O(1 / some complexity), e.g. n^2, b^n, n^b, etc.?

And if so, what kind of loop would produce this? I know that doubling or halving leads to log n or n log n time, but I'm not sure about 1/anything.
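For what it's worth, here is a small sketch (my own illustration, in Python) of the closest thing to such a loop: one whose iteration count shrinks like C/n as n grows. The work inside the loop does go down, but the total running time still bottoms out at a constant, which is why O(1/n) never shows up as a real complexity class.

```python
# A loop whose iteration count shrinks as n grows: roughly C/n iterations.
# Even so, total work can never drop below a constant (reading n itself,
# setting up the loop), so the best class reachable is O(1), not O(1/n).

def shrinking_loop(n, C=1000):
    steps = 0
    for _ in range(max(0, C // n)):   # ~C/n iterations, 0 once n > C
        steps += 1                    # one unit of "work"
    return steps

print(shrinking_loop(10))    # 100 iterations
print(shrinking_loop(100))   # 10 iterations
print(shrinking_loop(2000))  # 0 iterations -- but the call still costs O(1)
```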

## 2 Replies - 1680 Views - Last Post: 25 February 2007 - 06:00 AM

## Replies To: analysis of algorithms

### #2

## Re: analysis of algorithms

Posted 20 February 2007 - 01:26 AM

mattman059, on 19 Feb, 2007 - 09:53 PM, said:

Having learned a bit about algorithm analysis in my discrete structures class, I've been wondering how other functions are affected by Big O. Could you have O(1/x^2)? Or even better, O(1/n!)? The once-infamous runtime that everyone avoided would suddenly become the one everyone sought to achieve (as if O(1) could get any better).

It caught my attention and I thought it deserved a post on this site, so my questions are:

Can you have O(1 / some complexity), e.g. n^2, b^n, n^b, etc.?

And if so, what kind of loop would produce this? I know that doubling or halving leads to log n or n log n time, but I'm not sure about 1/anything.


Hi

If you think about an ordinary sequential (single-threaded) program, no algorithm can meet the criteria for O(1/n) or smaller timings. Running time is a non-decreasing function of n, simply because more input means at least as many iterations to process it.

If you used a threaded solution, however, there is potential for n/2, n/3, or n/4 timings, depending on the total processing power and how well the parallelism operates. (Note these are constant-factor improvements: O(n/4) is still O(n).) Some older CPUs and OSes may not be capable of delivering true multi-threaded execution.

You should also consider the effect of increasing the load (n) in multithreaded programs. There is a point at which the machine is running too many threads and simply bogs down in context switching and memory swapping, taking you from, say, n/4 to something closer to n^2 in wall-clock time very quickly.

Finally, there is a practical limit to how many independent threads a process can usefully run (a common rule of thumb is around 16). I have run 101 threads attached to one process, and there was a clear level of latency (around 4 seconds) visible in the logs afterwards. My threads also required synchronisation, because access to the logs was restricted to one thread at a time.
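As an illustrative sketch (not from the post, and with made-up task timings), here is how a thread pool divides wall-clock time for I/O-bound work by roughly the worker count. Sleeps stand in for network or disk calls; as noted above, this is a constant-factor win, not a change of Big-O class.

```python
# Illustrative only: splitting I/O-bound work (sleeps standing in for
# network calls) across a thread pool cuts wall-clock time by roughly the
# worker count. This is a constant-factor win: O(n/4) is still O(n).
import time
from concurrent.futures import ThreadPoolExecutor

def task(_):
    time.sleep(0.1)   # pretend this is one I/O-bound unit of work
    return 1

N = 8

start = time.perf_counter()
sequential_results = [task(i) for i in range(N)]
sequential = time.perf_counter() - start      # roughly N * 0.1 s

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel_results = list(pool.map(task, range(N)))
parallel = time.perf_counter() - start        # roughly (N / 4) * 0.1 s

print(f"sequential: {sequential:.2f}s, 4 threads: {parallel:.2f}s")
```

With CPU-bound work in CPython the picture changes (the GIL serialises the threads), which echoes the point above about parallelism depending on how well the machine can actually run threads concurrently.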

hope this helps.

### #3

## Re: analysis of algorithms

Posted 25 February 2007 - 06:00 AM

Although my background is in mathematics and I should LOVE Big-O, I really don't. I think that's because engineers love math where you can throw out unimportant numbers, and I hate to throw out numbers. I don't know; it's a personal prejudice I should really get over.

Anyway, O(1/n) would essentially mean that the process gets FASTER the longer the input is. There are some algorithms that "seem" to offer such a feat, such as searching through a text document of size M: the longer our search string (length N), the faster a Boyer–Moore search will go. When this is brought up for the first time, there is always at least one student who thinks this means a complexity like O(1/N). I was one of those lucky dunces.
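A quick sketch (my own illustration, using the simplified Boyer–Moore–Horspool variant rather than full Boyer–Moore) shows the effect: a longer pattern allows bigger skips per mismatch, so fewer alignments get inspected. But the number of alignments still grows with the text length, which is why the search stays O(text length) rather than O(1/N).

```python
# Boyer-Moore-Horspool search (a simplified Boyer-Moore): on a mismatch,
# the window skips ahead by up to the pattern length. Longer pattern =>
# bigger skips => fewer alignments, but still ~n/m of them for text length n.

def horspool(text, pattern):
    m, n = len(pattern), len(text)
    # bad-character table: default skip is the full pattern length
    skip = {ch: m for ch in set(text)}
    for i in range(m - 1):
        skip[pattern[i]] = m - 1 - i
    alignments = 0            # window positions actually examined
    i = m - 1                 # index of the window's last character in text
    while i < n:
        alignments += 1
        j = 0
        while j < m and text[i - j] == pattern[m - 1 - j]:
            j += 1
        if j == m:
            return i - m + 1, alignments
        i += skip[text[i]]
    return -1, alignments

text = "x" * 30 + "needle"
pos_long, al_long = horspool(text, "needle")   # long pattern: skips of 6
pos_short, al_short = horspool(text, "dle")    # short pattern: skips of 3
print(pos_long, al_long)    # 30 6
print(pos_short, al_short)  # 33 12
```

Halving the pattern length roughly doubles the alignments examined, so the speedup from a longer pattern is a divisor of n, never something that shrinks as the text grows.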

I know that this is an O(n) algorithm, but I can't explain exactly why.

This is why I hate Big-O (though I did ace all of my tests on it in school, then promptly forgot all I knew about using recurrence relations to calculate Big-O)

