I don't see the contradiction. I'm unsure what the big O figure would be for this, but it is VERY much faster than Quicksort, which I believe is O(n log n) on average. I've used this for things like "sorting" and finding some unique distribution feature of a set, for many years.
It can't be used for all sets because of the limits on the array's range, but on the ranges where it can be used, it's faster by far than anything else. Logically, it MUST be, because it consists of just one increment of an array element per number in the set, plus iterating through the set itself, which ANY solution must do.

You can talk about contradictions all you want, but that's just talk. Try this with a random integer array of 10,000 elements, and you'll KNOW exactly what I'm talking about. Compare it with Quicksort, Heapsort, Flashsort, or any other algorithm that will solve the originally posted problem. Even if there's more than one duplicate number in the set, the same algorithm clearly handles it. Then you'll know that it's the answer, and your "contradictions" can be put to rest.

adak