On 6/28/12 8:57 AM, Roman D. Boiko wrote:
On Thursday, 28 June 2012 at 12:29:14 UTC, Andrei Alexandrescu wrote:
On 6/28/12 5:58 AM, Roman D. Boiko wrote:
Pedantically speaking, sheer timings say nothing without the
appropriate baselines.

Andrei

I used the benchmark results for the two such algorithms that I like most,
taken from here:

Vigna, S. (2008). "Broadword implementation of rank/select queries".
Experimental Algorithms: 154–168.

http://en.wikipedia.org/wiki/Succinct_data_structure#cite_ref-vigna2008broadword_6-0
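
(Editorial aside, not part of the thread: a rank(pos) query simply counts the 1 bits before position pos in a bit vector, and select is its inverse. The sketch below is a naive word-at-a-time version in D, using only core.bitop's 32-bit popcnt; it is not Vigna's rank9, which adds a small directory of precomputed counts so the scan over whole words becomes constant time.)

import core.bitop : popcnt;

// Count the 1 bits among the first `pos` bits of a plain bit
// vector stored in 64-bit words. Only the 32-bit popcnt overload
// is used, to stay portable across druntime versions.
size_t rank1(const(ulong)[] bits, size_t pos)
{
    size_t count = 0;
    foreach (w; bits[0 .. pos / 64])        // whole words before pos
        count += popcnt(cast(uint) w) + popcnt(cast(uint) (w >> 32));
    immutable rem = pos % 64;               // partial last word, if any
    if (rem != 0)
    {
        immutable w = bits[pos / 64] & ((1UL << rem) - 1);
        count += popcnt(cast(uint) w) + popcnt(cast(uint) (w >> 32));
    }
    return count;
}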


The numbers should be valid for C/C++ code executed on a machine that
already existed back in 2008. I'm not sure there is a good baseline to
compare against. One option would be to benchmark random access to code
points in a UTF-32 string. I also don't know of any D implementations of
these algorithms, so I cannot predict how they would fare against
dstring random access.
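
(Editorial aside: that UTF-32 baseline is easy to measure. The rough sketch below is mine, not from the thread; a dstring is an array of dchar, so indexing it is exactly the array random access being proposed. The string size, filler characters, and lookup count are arbitrary choices, and it uses std.datetime.stopwatch from current Phobos.)

import std.datetime.stopwatch : StopWatch, AutoStart;
import std.random : uniform;
import std.stdio : writefln;

void main()
{
    // A dstring stores UTF-32, so str[i] is already the i-th code point.
    dstring str;
    foreach (i; 0 .. 1_000_000)
        str ~= cast(dchar)('a' + i % 26);

    // Pre-generate the indices so the RNG is not part of what we time.
    enum lookups = 1_000_000;
    auto indices = new size_t[lookups];
    foreach (ref ix; indices)
        ix = cast(size_t) uniform(0, str.length);

    size_t sink;                 // keeps the compiler from dropping the loop
    auto sw = StopWatch(AutoStart.yes);
    foreach (ix; indices)
        sink += str[ix];
    sw.stop();

    writefln("%s ns per random code-point access (sink=%s)",
             sw.peek.total!"nsecs" / cast(double) lookups, sink);
}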

But your statement that these timings say nothing is not fair: they can
be used to conclude that this speed should be enough for most practical
use cases, especially when those use cases are known.

Well, of course I've exaggerated a bit. My point is that mentioning "200 ns!!!" sounds to the uninformed ear as good as "2000 ns" or "20 ns", i.e. "an amount of time so short on a human scale that it must mean fast". You need to compare it against, e.g., random access in an array.


Andrei
