Hi guys, why, in the definition of T(n), do we calculate the worst case of the algorithm, as if only the worst case is important? My question is: why do we look specifically at the worst case of an algorithm? Also, how does calculating only the worst-case time complexity affect our life as algorithmists?

Thanks!

# Data structure and algorithm

4 Replies - 448 Views - Last Post: 29 December 2018 - 03:23 PM

**Replies To:** Data structure and algorithm

### #2

## Re: Data structure and algorithm

Posted 28 December 2018 - 06:29 PM

From a theoretical standpoint, when classifying the difficulty of a problem, we need to understand the worst cases. Can an algorithm solve the problem in polynomial time for every case? Or just a few cases? This is the motivation.

This does not make sense.

Quote

Also, how does calculating only the worst-case time complexity affect our life as algorithmists?

This does not make sense.

### #3

## Re: Data structure and algorithm

Posted 28 December 2018 - 11:00 PM

Worst-case analysis gives you a basis for a performance guarantee: it will always be at least this performant, never worse. This is a much more useful claim to make than the one offered by best-case analysis, which is "it might sometimes be this performant, but never better".

Also, analyzing for the worst case gives you insight into how the worst-case scenarios can be constructed, which helps you avoid those cases. Consider some algorithm which depends on feeding a list of elements into a binary tree, perhaps for quick retrieval. It is indeed possible to find an element in a binary tree in O(log N) time, which is very nice - but you won't get this performance if the tree becomes unbalanced. How can a tree become unbalanced in this scenario? Well, someone might present you with a sorted list. This will turn your binary tree into a linked list, which finds in O(N) time.
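A minimal sketch of this (a hypothetical naive BST, not code from the thread) shows the degeneration directly: inserting a sorted list into a non-balancing tree sends every key to the right child, so the tree's height equals the number of keys.

```python
class Node:
    """Simple BST node (illustrative example only)."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    # Naive, non-balancing insertion: smaller keys go left, others right.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def height(root):
    # Number of nodes on the longest root-to-leaf path.
    if root is None:
        return 0
    return 1 + max(height(root.left), height(root.right))

# Feed a sorted list into the tree: every key goes to the right child,
# so the "tree" is really a linked list.
root = None
for key in range(1, 101):
    root = insert(root, key)

print(height(root))  # prints 100: depth equals N, so a find costs O(N)
```

With 100 sorted keys the height comes out as 100, which is exactly the linked-list worst case described above.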

Once you become aware of this possibility, it doesn't take a lot of creativity to simply shuffle the list you're given before feeding it into the tree. This makes the worst-case situation vanishingly unlikely and more or less guarantees the log N performance for that part of the algorithm, with very little effort - certainly, much easier than building a balancing tree, if you had to construct your tree by hand.
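The shuffle defence is one line on top of the same hypothetical BST code: randomizing the insertion order makes the sorted-input worst case vanishingly unlikely, and a random insertion order gives expected height O(log N).

```python
import random

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    # Same naive, non-balancing insertion as before.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def height(root):
    if root is None:
        return 0
    return 1 + max(height(root.left), height(root.right))

keys = list(range(1, 101))   # the adversarial sorted input from before
random.shuffle(keys)         # one line of defence: randomize the order

root = None
for key in keys:
    root = insert(root, key)

# Expected height is O(log N) now; for N = 100 the tree is typically
# around 10 to 15 levels deep instead of 100.
print(height(root))
```

This is much less work than implementing a self-balancing tree by hand, at the cost of a probabilistic rather than absolute guarantee.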

Fortunately, of course, you don't really need to implement this sort of data structure by hand very often, but then this is an example.


### #4

## Re: Data structure and algorithm

Posted 29 December 2018 - 02:06 AM

macosxnerd101, on 28 December 2018 - 06:29 PM, said:

From a theoretical standpoint, when classifying the difficulty of a problem, we need to understand the worst cases. Can an algorithm solve the problem in polynomial time for every case? Or just a few cases? This is the motivation.

So you're gauging the "every case" versus "few cases" distinction by how large my worst-case complexity is? The greater it is, the more cases it covers... right?

jon.kiparsky, on 28 December 2018 - 11:00 PM, said:

Worst-case analysis gives you a basis for a performance guarantee: it will always be at least this performant, never worse. This is a much more useful claim to make than the one offered by best-case analysis, which is "it might sometimes be this performant, but never better".

Also, analyzing for the worst case gives you insight into how the worst-case scenarios can be constructed, which helps you avoid those cases. Consider some algorithm which depends on feeding a list of elements into a binary tree, perhaps for quick retrieval. It is indeed possible to find an element in a binary tree in O(log N) time, which is very nice - but you won't get this performance if the tree becomes unbalanced. How can a tree become unbalanced in this scenario? Well, someone might present you with a sorted list. This will turn your binary tree into a linked list, which finds in O(N) time.

Once you become aware of this possibility, it doesn't take a lot of creativity to simply shuffle the list you're given before feeding it into the tree. This makes the worst-case situation vanishingly unlikely and more or less guarantees the log N performance for that part of the algorithm, with very little effort - certainly, much easier than building a balancing tree, if you had to construct your tree by hand.

Fortunately, of course, you don't really need to implement this sort of data structure by hand very often, but then this is an example.


I didn't get your point well; can you please simplify it? Thanks!

Do you mean that once I have insight into the worst case, I can choose which algorithm to use?!

This post has been edited by **macosxnerd101**: 29 December 2018 - 10:48 AM

Reason for edit:: Fixed quote tag

### #5

## Re: Data structure and algorithm

Posted 29 December 2018 - 03:23 PM

Quote

didn't get your point well, can you please simplify it more?! thanks !

I do try to make things as clear as possible the first time. If I could think of a better way to say it, I would have said it that way the first time. I would be happy to enlarge on any points that were not clear, if you have a more specific question. Failing that, all I can say is I mean that understanding the worst-case performance gives you insights on how to avoid that worst-case situation.

If the example didn't make sense, then I strongly suggest - nay, I implore - that you hie yourself hence to a serious course on data structures and algorithms and learn, learn, learn. Be diligent and studious for a semester and all will become abundantly clear.

Note that this does not mean "go watch some videos on youtube". This is worse than useless: in the case of algorithms it is really true that "a little knowledge is a dangerous thing" and that you should "drink deep, or taste not the Pierian spring." Why? Well, as the poet says, "there shallow draughts intoxicate the brain, and drinking largely sobers us again".
