Hi!

Consider a sorted array. Append an element that is less than all the previous elements. Then sort the array again with the sort function from std.algorithm.

An example follows:

-----
import std.algorithm;
import std.datetime;
import std.range;
import std.stdio;

void main ()
{
        int n = 30_000;
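        // a sorted array with one smaller element appended at the end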
        auto a = array (iota (n)) ~ [0];
        auto st = Clock.currTime ();
        sort (a);
        auto et = Clock.currTime ();
        writeln (et - st);
}
-----

With n = 30_000 as in the example, this takes on the order of a second on a modern computer, which points squarely at O(n^2) behavior: sorting 30_001 nearly-sorted integers should otherwise take at most a few milliseconds. I am using DMD 2.062.
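
One timing alone does not prove the asymptotics, of course; a quick way to check is to double n and compare. Under quadratic behavior each doubling should roughly quadruple the time. A minimal sketch (the particular n values are my own choice):

-----
import std.algorithm;
import std.array;
import std.datetime;
import std.range;
import std.stdio;

void main ()
{
        // quadratic behavior: each doubling of n roughly quadruples the time
        foreach (n; [15_000, 30_000, 60_000])
        {
                auto a = array (iota (n)) ~ [0];
                auto st = Clock.currTime ();
                sort (a);
                auto et = Clock.currTime ();
                writeln (n, ": ", et - st);
        }
}
-----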

Well, now I have a point to make and a question.

My point is that I think this is bad, because such a pattern (sorted array ~ small element, then sort) is fairly likely to occur in real programs - well, it did occur for me, hence the post; a sketch follows below. Sure, every fast and simple method of choosing the pivot in quicksort has its O(n^2) corner cases. But there is a difference between a corner case that never occurs in "real" data (though it could still be constructed artificially) and a corner case describable by such a simple pattern.
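
For illustration, here is a minimal sketch of how the pattern shows up (the scenario is invented, but it is essentially what I was doing): an array kept sorted by re-sorting after each append.

-----
import std.algorithm;

int [] data;    // kept sorted between calls

void addValue (int x)
{
        data ~= x;      // x may be smaller than every element present...
        sort (data);    // ...and then this sort call hits the slow case above
}
-----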

The question is rather predictable, really: what is the guaranteed-n-log-n replacement for sort in the standard library? I've found the following way...
        sort !("a < b", SwapStrategy.stable) (a);
...but it forces me to specify the redundant "a < b" and looks a bit clumsy for an everyday sort() call.
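
In the meantime, a tiny wrapper can hide the verbosity; this is just a sketch, and the name stableSort is my own, not something from Phobos:

-----
import std.algorithm;

// a convenience wrapper over the guaranteed O(n log n) stable sort;
// the name stableSort is mine, it is not part of Phobos
auto stableSort (alias less = "a < b", Range) (Range r)
{
        return sort !(less, SwapStrategy.stable) (r);
}
-----

With that in place, stableSort (a); reads just like an everyday sort (a) call.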

Tried to keep it short(er) this time,

Ivan Kazmenko.
