On Tuesday, 1 May 2018 at 18:46:20 UTC, H. S. Teoh wrote:

Well, yes. Of course the whole idea behind big O is asymptotic behaviour, i.e., behaviour as n becomes arbitrarily large. Unfortunately, as you point out below, this is not an accurate depiction of the real world:

[snip]

The example I like to use is parallel computing. Sure, throwing 8 cores at a problem may be the fastest option for a huge amount of data, but for a small array the fixed cost of spawning and synchronizing threads dwarfs the actual work, so it ends up far slower than a single-threaded algorithm (see the sketch below).
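
To make that concrete, here's a minimal D sketch (not from the original thread; the sizes and the "a + b" reduction are just illustrative) timing a plain loop against std.parallelism's taskPool.reduce on a small and a large array:

import std.array : array;
import std.datetime.stopwatch : AutoStart, StopWatch;
import std.parallelism : taskPool;
import std.range : iota;
import std.stdio : writefln;

void main()
{
    // Tiny vs. large input: the parallel version has to win big on
    // the large one to pay back its fixed coordination cost.
    foreach (n; [1_000, 10_000_000])
    {
        auto data = iota(0L, n).array;

        auto sw = StopWatch(AutoStart.yes);
        long seq = 0;
        foreach (x; data)
            seq += x;                   // single-threaded baseline
        immutable seqTime = sw.peek;

        sw.reset();
        // taskPool.reduce splits the range across worker threads and
        // merges the partial sums; that coordination is pure overhead
        // when n is small.
        immutable par = taskPool.reduce!"a + b"(0L, data);
        immutable parTime = sw.peek;

        writefln("n=%s  sequential=%s  parallel=%s  (sums: %s / %s)",
                 n, seqTime, parTime, seq, par);
    }
}

On most machines the small case comes out in favour of the plain loop, which is exactly why std.parallelism lets you tune the work unit size instead of always splitting across every core.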
