sorting invariants

Of the standard fast sorting algorithms, heapsort and quicksort are in-place but unstable, and mergesort is stable (or can be) but takes O(n) extra space. You can decorate the input with the original sequence numbers in order to break ties and make any sort stable, but that also takes O(n) extra space. Insertion sort is stable and in-place, but takes O(n^2) time; if you insert into a tree instead it becomes faster, but takes space again (you're not just copying the data, you're also creating two pointers per item).
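
To make the decorating trick concrete, here's a minimal Python sketch (assuming any comparison sort underneath; the helper name stable_sort is my own):

    def stable_sort(items, key=lambda x: x):
        # Pair each key with the element's original position.  Positions are
        # unique, so no two pairs ever compare equal, and any comparison sort,
        # stable or not, produces the same order -- which is stable by
        # construction.  The list of positions is the O(n) extra space.
        order = sorted(range(len(items)), key=lambda i: (key(items[i]), i))
        return [items[i] for i in order]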

If I understand radix sort right, it has worst-case O(n*log(n)) performance (if there are n distinct entries, they take log(n) bits to represent, and radix sort is O(n*numbits)). The LSD version is stable, but I think takes extra space; MSD can be made stable, but takes extra space for that.
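
For what it's worth, here's a hedged Python sketch of an LSD radix sort on non-negative integers (the function name and digit size are my choices): each pass is a counting sort, which is stable, so the whole sort is stable, and the output buffer in each pass is the O(n) extra space.

    def lsd_radix_sort(nums, base=256):
        # Sort non-negative integers by 8-bit digits, least significant first.
        if not nums:
            return []
        result = list(nums)
        max_val = max(result)
        shift = 0
        while shift == 0 or (max_val >> shift) > 0:
            # Count how many keys fall in each digit bucket.
            counts = [0] * base
            for x in result:
                counts[(x >> shift) & (base - 1)] += 1
            # Exclusive prefix sums turn counts into bucket start positions.
            total = 0
            for d in range(base):
                counts[d], total = total, total + counts[d]
            # Scatter left to right, which preserves the order of ties (stable).
            out = [0] * len(result)
            for x in result:
                d = (x >> shift) & (base - 1)
                out[counts[d]] = x
                counts[d] += 1
            result = out
            shift += 8
        return result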

So, seems like a pattern! Except Wikipedia hints at in-place stable mergesorts in O(n * log n * log n) -- okay, that still takes extra time -- or in just O(n log n) but with high constant factors. And there's blocksort, allegedly an in-place, stable, worst case O(n log n), best case O(n) algorithm. Which sounds like the Ultimate Sorting Algorithm, doing everything you'd want. Why haven't I heard of it more? Apart from seeming pretty complicated.
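
I haven't worked through blocksort or the clever in-place merges, but as an illustration that an in-place stable merge is possible at all, here's a deliberately naive Python sketch (the names are mine): it shifts elements one slot at a time, so the merge costs O(n^2) moves in the worst case, nothing like the O(n log n) or O(n * log n * log n) algorithms above.

    def merge_in_place(a, lo, mid, hi):
        # Stably merge the sorted runs a[lo:mid] and a[mid:hi] with O(1)
        # extra space.  When the head of the right run must come first, it
        # is shifted into place one slot at a time -- the source of the
        # O(n^2) worst case.
        while lo < mid and mid < hi:
            if a[lo] <= a[mid]:
                lo += 1
            else:
                val = a[mid]
                for j in range(mid, lo, -1):
                    a[j] = a[j - 1]
                a[lo] = val
                lo += 1
                mid += 1

    def in_place_merge_sort(a, lo=0, hi=None):
        # Top-down merge sort using the in-place merge above, so no O(n)
        # scratch buffer is needed.
        if hi is None:
            hi = len(a)
        if hi - lo > 1:
            mid = (lo + hi) // 2
            in_place_merge_sort(a, lo, mid)
            in_place_merge_sort(a, mid, hi)
            merge_in_place(a, lo, mid, hi)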

See the DW comments at http://mindstalk.dreamwidth.org/443591.html#comments
