4 Replies - 444 Views - Last Post: 29 December 2018 - 03:23 PM

#1 RyanMco

  • D.I.C Head

Reputation: -9
  • Posts: 93
  • Joined: 27-December 18

Data structure and algorithm

Posted 28 December 2018 - 05:59 PM

Hi guys; in the definition of T(n), why do we calculate the worst case of the algorithm, as though the worst case is all that's important? My question is: why do we look specifically at just the worst case of an algorithm? Also, how does calculating only the worst-case time complexity carry over to our day-to-day work designing algorithms?

Thanks!
Is This A Good Question/Topic? 0

Replies To: Data structure and algorithm

#2 macosxnerd101

  • Games, Graphs, and Auctions

Reputation: 12491
  • Posts: 45,626
  • Joined: 27-December 08

Re: Data structure and algorithm

Posted 28 December 2018 - 06:29 PM

From a theoretical standpoint, when classifying the difficulty of a problem, we need to understand the worst cases. Can an algorithm solve the problem in polynomial time for every case? Or just a few cases? This is the motivation.
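For a concrete (if toy) illustration in Python - the function below is something I'm sketching for this post, not anybody's library API - consider plain linear search:

def linear_search(items, target):
    # Best case: target is items[0] -- one comparison, O(1).
    # Worst case: target is last, or absent -- N comparisons, O(N).
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

The best case tells you almost nothing, since any search looks fast when the answer happens to sit in the first slot. The O(N) worst case is the bound that holds for every input, so that is what classifies the algorithm.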

Quote

Also, how does calculating only the worst-case time complexity carry over to our day-to-day work designing algorithms?


This does not make sense.
Was This Post Helpful? 0

#3 jon.kiparsky

  • Beginner

Reputation: 11374
  • Posts: 19,406
  • Joined: 19-March 11

Re: Data structure and algorithm

Posted 28 December 2018 - 11:00 PM

Worst-case analysis gives you a basis for a performance guarantee: the algorithm will always be at least this performant, never worse. That is a much more useful claim than the one offered by best-case analysis, which is "it might sometimes be this performant, but never better".
Also, analyzing the worst case gives you insight into how worst-case scenarios can be constructed, which helps you avoid those cases. Consider some algorithm that depends on feeding a list of elements into a binary tree, perhaps for quick retrieval. It is indeed possible to find an element in a binary tree in O(log N) time, which is very nice - but you won't get this performance if the tree becomes unbalanced. How can the tree become unbalanced in this scenario? Well, someone might present you with a sorted list. That will turn your binary tree into a linked list, where a find takes O(N) time.
Once you become aware of this possibility, it doesn't take a lot of creativity to simply shuffle the list you're given before feeding it into the tree. This makes the worst-case situation vanishingly unlikely and more or less guarantees the O(log N) performance for that part of the algorithm, with very little effort - certainly much less than building a self-balancing tree, if you had to construct your tree by hand.

Fortunately, you don't really need to implement this sort of data structure by hand very often, but this is just an example.
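To make that concrete, here is a minimal Python sketch (the Node class and the insert/height helpers are names I'm making up for this post, not a standard library):

import random

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    # Plain, unbalanced binary-search-tree insert.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def height(tree):
    # Number of levels in the tree; a search can cost up to this many steps.
    if tree is None:
        return 0
    return 1 + max(height(tree.left), height(tree.right))

keys = list(range(500))

# Worst case: sorted input degenerates the tree into a linked list.
worst = None
for k in keys:
    worst = insert(worst, k)
print(height(worst))    # 500 levels -- every search is O(N)

# Shuffling first makes the pathological shape vanishingly unlikely.
random.shuffle(keys)
typical = None
for k in keys:
    typical = insert(typical, k)
print(height(typical))  # typically in the twenties, i.e. O(log N)

Running it shows the sorted input producing a 500-level chain, while the shuffled input stays around the logarithmic height you'd hope for.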
Was This Post Helpful? 2

#4 RyanMco

  • D.I.C Head

Reputation: -9
  • Posts: 93
  • Joined: 27-December 18

Re: Data structure and algorithm

Posted 29 December 2018 - 02:06 AM

macosxnerd101, on 28 December 2018 - 06:29 PM, said:

From a theoretical standpoint, when classifying the difficulty of a problem, we need to understand the worst cases. Can an algorithm solve the problem in polynomial time for every case? Or just a few cases? This is the motivation.


So are you gauging "every case" versus "a few cases" by how large my worst-case complexity is... the greater the worst case, the more cases it covers... right?

jon.kiparsky, on 28 December 2018 - 11:00 PM, said:

Worst-case analysis gives you a basis for a performance guarantee: the algorithm will always be at least this performant, never worse. That is a much more useful claim than the one offered by best-case analysis, which is "it might sometimes be this performant, but never better".
Also, analyzing the worst case gives you insight into how worst-case scenarios can be constructed, which helps you avoid those cases. Consider some algorithm that depends on feeding a list of elements into a binary tree, perhaps for quick retrieval. It is indeed possible to find an element in a binary tree in O(log N) time, which is very nice - but you won't get this performance if the tree becomes unbalanced. How can the tree become unbalanced in this scenario? Well, someone might present you with a sorted list. That will turn your binary tree into a linked list, where a find takes O(N) time.
Once you become aware of this possibility, it doesn't take a lot of creativity to simply shuffle the list you're given before feeding it into the tree. This makes the worst-case situation vanishingly unlikely and more or less guarantees the O(log N) performance for that part of the algorithm, with very little effort - certainly much less than building a self-balancing tree, if you had to construct your tree by hand.

Fortunately, you don't really need to implement this sort of data structure by hand very often, but this is just an example.


I didn't get your point; can you please simplify it? Thanks!
Do you mean that once I have insight into the worst case, I can then choose which algorithm to use?

This post has been edited by macosxnerd101: 29 December 2018 - 10:48 AM
Reason for edit:: Fixed quote tag

Was This Post Helpful? 0

#5 jon.kiparsky

  • Beginner

Reputation: 11374
  • Posts: 19,406
  • Joined: 19-March 11

Re: Data structure and algorithm

Posted 29 December 2018 - 03:23 PM

Quote

I didn't get your point; can you please simplify it? Thanks!


I do try to make things as clear as possible the first time; if I could think of a better way to say it, I would have said it that way to begin with. I would be happy to enlarge on any point that was not clear, if you have a more specific question. Failing that, all I can say is that I mean understanding the worst-case performance gives you insight into how to avoid that worst-case situation.

If the example didn't make sense, then I strongly suggest - nay, I implore - that you hie yourself hence to a serious course on data structures and algorithms and learn, learn, learn. Be diligent and studious for a semester and all will become abundantly clear.

Note that this does not mean "go watch some videos on YouTube". That is worse than useless: in the case of algorithms it is really true that "a little knowledge is a dangerous thing" and that you should "drink deep, or taste not the Pierian spring". Why? Well, as the poet says, "there shallow draughts intoxicate the brain, and drinking largely sobers us again".
Was This Post Helpful? 3
