As others have said, this is not stated precisely. There are lots of
sorting algorithms that have O(n) behavior on nearly sorted data.
Quicksort and mergesort can both be implemented to get this behavior.
However, if you are very confident that elements are not very far from
their correct sorted locations, insertion sort is a natural fit.
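To make the "nearly sorted" claim concrete, here is a minimal sketch (not from the thread) of plain insertion sort instrumented to count element shifts. Its total work is O(n + d), where d is the number of inversions, so on input with only a few local disorderings it is effectively linear:

```python
def insertion_sort(a):
    """Sort a in place and return the number of element shifts.

    Each shift removes exactly one inversion, so the running time is
    O(n + d) where d is the inversion count of the input.
    """
    shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]  # shift larger element one slot right
            j -= 1
            shifts += 1
        a[j + 1] = key
    return shifts

nearly = list(range(1000))
nearly[10], nearly[11] = nearly[11], nearly[10]  # one local swap
print(insertion_sort(nearly))  # 1 shift: effectively linear work
```

On fully random input the same routine would do on the order of n^2/4 shifts, which is where the usual O(n^2) bound comes from.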
Still, it will depend on the number of distortions. If the number of
distortions is large enough (on the order of n/2), it is better to
re-sort with a general-purpose algorithm; even insertion sort would
cost O(n^2) in that case.
On Tue, Oct 18, 2011 at 3:57 AM, sravanreddy001 wrote:
> Yes.. And the reason is best case of insertion sort is in the order of n.
Yes, and the reason is that the best case of insertion sort is on the order of n.
--
You received this message because you are subscribed to the Google Groups
"Algorithm Geeks" group.
To view this discussion on the web visit
https://groups.google.com/d/msg/algogeeks/-/_OJfijjiFVoJ.
Insertion sort is pretty good for this. I've seen a case where we were
maintaining a list of items sorted by rank. The ranks changed
occasionally, but usually only by +/- 5 at most, in a list of several
million. Insertion sort was much faster at putting them back in order
than quicksort.
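A sketch of that use case (hypothetical names and sizes, assuming each element sits within a few positions of its sorted spot): re-running insertion sort over the perturbed list repairs it in roughly O(n * k) time for maximum displacement k, instead of the full O(n log n) of a general sort.

```python
import random

def repair(a):
    """Re-sort a nearly sorted list in place with insertion sort.

    If every element is within k positions of its sorted location,
    each inner loop runs at most k times, giving O(n * k) total.
    """
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

ranks = list(range(100_000))
for _ in range(1_000):                    # small local rank changes
    i = random.randrange(len(ranks))
    ranks[i] += random.randint(-5, 5)

expected = sorted(ranks)
repair(ranks)
print(ranks == expected)  # True
```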
OK, what is expected?
It's again a sorting problem. If the amount of distortion is constant,
binary search plus insertion (insertion sort) can be employed to do it
in O(n) time. I didn't give it a programmatic thought, but if the
amount of distortion is of order n, then sort in O(n log n).
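One way to read the "binary search" suggestion is binary insertion sort: binary search locates each insertion point in O(log n) comparisons, though the element shifts still depend on how far items are from their sorted positions. A sketch (my own, assuming Python's `bisect` module):

```python
import bisect

def binary_insertion_sort(a):
    """Insertion sort that finds each insertion point by binary search.

    Comparisons drop to O(n log n); element moves remain proportional
    to the total displacement, so nearly sorted input stays cheap.
    """
    for i in range(1, len(a)):
        key = a[i]
        pos = bisect.bisect_right(a, key, 0, i)  # slot for key in a[:i]
        a[pos + 1:i + 1] = a[pos:i]              # shift the tail right
        a[pos] = key
    return a

print(binary_insertion_sort([1, 2, 4, 3, 5, 7, 6]))  # [1, 2, 3, 4, 5, 6, 7]
```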
We can just keep swapping adjacent out-of-order elements until the list is in order.
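One way to read "keep swapping" is a bubble-sort-style pass repeated until nothing moves (a sketch, not from the thread). Each pass is O(n), and an element at most k positions from home is fixed within about k passes, so this is also O(n * k) on nearly sorted input:

```python
def swap_until_sorted(a):
    """Repeatedly swap adjacent out-of-order pairs until a full pass
    makes no swaps. Roughly one pass per unit of maximum displacement."""
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(a) - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
    return a

print(swap_until_sorted([2, 1, 3, 5, 4]))  # [1, 2, 3, 4, 5]
```

On heavily disordered input this degrades to O(n^2), matching the earlier point about large distortions.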