4 Replies - 1104 Views - Last Post: 24 September 2010 - 03:48 PM

#1 Guest_Xecure*


Reputation:

worst case speed for accessing/modifying data

Posted 22 September 2010 - 08:47 PM

Other than letting the user know the maximum time it will take for a Java program to go through a data structure to find the data, what other benefit is there to knowing the worst-case speeds for a program?
Is This A Good Question/Topic? 0

Replies To: worst case speed for accessing/modifying data

#2 anonymouscodder   User is offline

  • member icon

Reputation: 126
  • View blog
  • Posts: 710
  • Joined: 01-January 10

Re: worst case speed for accessing/modifying data

Posted 22 September 2010 - 09:10 PM

You have to show what you've already done. And next time, post in the proper forum.
Was This Post Helpful? 0

#3 guahguahmonster   User is offline

  • D.I.C Head
  • member icon

Reputation: 68
  • View blog
  • Posts: 209
  • Joined: 29-August 07

Re: worst case speed for accessing/modifying data

Posted 24 September 2010 - 12:44 AM

Eh, knowing the time complexity of data structures might be more computer sciencey in general than Java-specific.

One good reason to know is that the coder can pick and choose the correct data structure to hold his data based on what operations need to be fast. Consider the AVL tree and the red-black tree:

Quote

The AVL tree is more rigidly balanced than Red-Black trees, leading to slower insertion and removal but faster retrieval.


So if the programmer thinks modifications to the tree are relatively rare compared to accesses, he might decide that an AVL tree is more suitable for his needs.

Programming is about tradeoffs a lot of the time. You have to figure out what's important to you and choose the best tool to suit your needs.
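
To make that concrete, here is a rough Java sketch. The JDK has no AVL tree, so this uses TreeMap (the JDK's red-black tree) against HashMap to show the same kind of tradeoff: which operations you need to be fast decides which structure you reach for.

import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

public class StructureChoice {
    public static void main(String[] args) {
        // TreeMap is the JDK's red-black tree: O(log n) insert, remove and
        // lookup, and it keeps its keys in sorted order.
        Map<String, Integer> tree = new TreeMap<String, Integer>();

        // HashMap trades the ordering away for O(1) expected lookup/insert.
        Map<String, Integer> hash = new HashMap<String, Integer>();

        String[] words = {"delta", "alpha", "charlie", "bravo"};
        for (String word : words) {
            tree.put(word, word.length());
            hash.put(word, word.length());
        }

        // Mostly lookups and no need for ordering? The hash map is the usual pick.
        System.out.println(hash.get("charlie"));  // 7

        // Need the keys back in sorted order? The tree pays for itself.
        System.out.println(tree.keySet());        // [alpha, bravo, charlie, delta]
    }
}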
Was This Post Helpful? 0

#4 auggiecc87   User is offline

  • New D.I.C Head
  • member icon

Reputation: 6
  • View blog
  • Posts: 48
  • Joined: 09-March 09

Re: worst case speed for accessing/modifying data

Posted 24 September 2010 - 06:12 AM

Along with this:

guahguahmonster, on 23 September 2010 - 11:44 PM, said:

Programming is about tradeoffs a lot of the time. You have to figure out what's important to you and choose the best tool to suit your needs.


From a project manager's (or teacher's) point of view, being able to determine the worst-case running time of your code demonstrates that you actually understand the code you have just written.
Big O notation also gives you an easy way to justify using one data structure instead of another. This may seem trivial in schoolwork or sample coding exercises, but think on the scale of a search engine query and imagine if the search engine's data were stored in linked lists or some other workable but very inefficient way.
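
A toy illustration of that scale point, assuming nothing fancier than the collections in java.util (a single timed lookup is a crude benchmark, so treat the numbers as a rough signal, not a measurement):

import java.util.HashSet;
import java.util.LinkedList;
import java.util.List;
import java.util.Set;

public class LookupComparison {
    public static void main(String[] args) {
        int n = 200000;
        List<Integer> list = new LinkedList<Integer>();
        Set<Integer> set = new HashSet<Integer>();
        for (int i = 0; i < n; i++) {
            list.add(i);
            set.add(i);
        }

        // Worst case for the list: the element we want is at the very end,
        // so contains() has to walk all n nodes -- O(n).
        long start = System.nanoTime();
        list.contains(n - 1);
        long listNanos = System.nanoTime() - start;

        // The hash set answers the same query with one expected probe -- O(1).
        start = System.nanoTime();
        set.contains(n - 1);
        long setNanos = System.nanoTime() - start;

        System.out.println("LinkedList.contains: " + listNanos + " ns");
        System.out.println("HashSet.contains:    " + setNanos + " ns");
    }
}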
Was This Post Helpful? 0

#5 mojo666   User is offline

  • D.I.C Addict
  • member icon

Reputation: 409
  • View blog
  • Posts: 883
  • Joined: 27-June 09

Re: worst case speed for accessing/modifying data

Posted 24 September 2010 - 03:48 PM

Xecure, on 22 September 2010 - 07:47 PM, said:

Other than letting the user know the maximum time it will take for a Java program to go through a data structure to find the data, what other benefit is there to knowing the worst-case speeds for a program?

Time complexity doesn't really describe time; it describes relative growth. If you have a small amount of data and calculate the time, even exponential algorithms look acceptable. However, as more data gets added, the efficiency gets crippled. Time complexity is not about "how fast will the algorithm run for k items"; its purpose is to show "how much will the algorithm slow down when I add k items".

For example, say we have an O(2^n) algorithm that takes half an hour to process 30 items. You might be fine with this amount of time. However, a week later you add 3 items. Each extra item roughly doubles the running time, so the same program now takes 4 hours (0.5 × 2^3) to process them. Is that still acceptable?
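
That arithmetic, spelled out as a throwaway Java projection (this only extrapolates the measured half hour using the 2^n growth rate; it doesn't actually run anything):

public class GrowthProjection {
    public static void main(String[] args) {
        double baseHours = 0.5;  // measured: 30 items took half an hour
        int baseItems = 30;

        // For an O(2^n) algorithm, each extra item roughly doubles the time,
        // so going from 30 to n items multiplies the half hour by 2^(n - 30).
        for (int items = 30; items <= 40; items++) {
            double hours = baseHours * Math.pow(2, items - baseItems);
            System.out.printf("%2d items -> about %.1f hours%n", items, hours);
        }
        // 33 items -> about 4.0 hours; by 40 items it is already ~512 hours.
    }
}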
Was This Post Helpful? 0

Page 1 of 1