Revisiting Algorithms and Data Structures


  • Revisiting Algorithms and Data Structures

    ** This thread discusses the article: Revisiting Algorithms and Data Structures **

  • #2
    Wow! That takes me back to my algorithms course in university! I remember one class where the prof came in excitedly announcing a new O(n) sorting algorithm. To demonstrate, he had a bunch of pencils of different lengths. With one hand, he held the pencils vertically with the ends on top of the table. He lowered the other hand down on top of the pencils and was able to pick out the tallest. Repeating for each pencil, he was able to sort the pencils by length in O(n) time!

    Here's another story from my days of working on the ILE RPG IV compiler: We used a hash table to store the dictionary. I spent a few days tuning the hash table, and one of the things I tried was increasing its size to reduce the chance of collisions. But when I doubled the hash table size, the compile took twice as long. Huh?!? That's not right! It took a bit of time to figure out, but the slowdown was in the sort step. Sorting happened by merging consecutive hash chains, which happened to work faster with a smaller number of chains. In retrospect, we could have copied all the pointers into a separate chunk of storage (an O(n) process) and done an O(n log n) sort on the items. Instead, I sped up the merge sort by merging the lists differently: rather than merging each list into one master list, we first merged pairs of lists, then merged pairs of those, and so on.

    I suppose my point is that algorithm analysis isn't always easy or straightforward. An optimization in one place can have unexpected consequences elsewhere, which is why compiler optimization is a bit of black magic.

    If you can, use languages where these data structures are built in. Many modern languages have lists and hash tables as built-in objects, and using what's natural in those languages can make your programming easier and more productive. Which brings up another important point about performance: which is more important, the efficiency of the program or the efficiency of the programmer?

    Cheers! Hans
