Yes, it's quite an interesting problem, and it has been solved many
times with parallel programming. I think the idea, as Ercan suggested,
is to use a binary search.

Consider two sorted arrays A and B.
Divide both arrays into blocks of size log n.
For each block, let its first element be the head element. Now
cross-find the location of each head in the other array: for the head
at index i in array A, binary-search its position j in array B; its
position in the merged output is then i + j. So in an output array of
size |A| + |B| we can place all of these heads. Done in parallel, this
takes log n * log n time. What remains is merging the values in
between the heads; there are about log n such elements between each
pair of consecutive heads.
It can be proved that at most 2 log n elements fall between two
consecutive heads, and if we merge these short runs in parallel, the
whole step takes 2 log n time in total. In fact we can find the exact
location of an element i of A in B as some j, and this j acts as an
additional marker; see the sketch below. I hope you get the idea; the
details can be worked out a little more carefully.
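To make the last point concrete, here is a minimal sequential sketch in
Python of that "exact location as an additional marker" idea (the name
cross_merge is mine, and for brevity every element does its own binary
search rather than only the block heads). Each element takes part in
only O(log n) comparisons, and the searches are all independent of one
another, which is what the parallel version exploits:

    from bisect import bisect_left, bisect_right

    def cross_merge(A, B):
        # merged position of an element = its own index plus the number
        # of elements of the other array that must precede it
        out = [None] * (len(A) + len(B))
        for i, x in enumerate(A):
            # elements of B strictly smaller than x come before it
            out[i + bisect_left(B, x)] = x
        for j, y in enumerate(B):
            # ties broken toward A: elements of A <= y come before it
            out[j + bisect_right(A, y)] = y
        return out

    print(cross_merge([1, 3, 5, 7], [2, 3, 4, 8]))
    # [1, 2, 3, 3, 4, 5, 7, 8]

The bisect_left/bisect_right asymmetry keeps the computed positions
distinct when the two arrays share equal values.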



On Nov 9, 12:34 am, Gönenç Ercan <gon...@gmail.com> wrote:
> Ok, I see your point; I thought it asked for an algorithm with
> overall complexity O(logn), which is not possible.
>
> Why not use a binary-search-like procedure to fix the worst case?
> Choose the list with the bigger numbers, let's say list L; then,
> instead of checking each element of the smaller list S one by one,
> run a binary search in S for the current element L_1. This finds the
> index of the items smaller than L_1: insert them into the merged
> array, then L_1 itself, and repeat until all elements are processed.
> While this involves any single item in only O(logn) comparisons, the
> overall complexity of the merge is still O(n), because the items have
> to be copied in any case.
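A minimal Python sketch of the procedure Ercan describes (the name
gallop_merge is mine, and I use bisect_left for the "items smaller
than L_1" search):

    from bisect import bisect_left

    def gallop_merge(L, S):
        out, j = [], 0
        for x in L:
            # one O(log n) binary search per element of L, restricted
            # to the part of S that has not been copied yet
            k = bisect_left(S, x, lo=j)
            out.extend(S[j:k])  # items of S smaller than x, as one block
            out.append(x)
            j = k
        out.extend(S[j:])       # whatever remains of S
        return out

    print(gallop_merge([2, 3, 4, 8], [1, 3, 5, 7]))
    # [1, 2, 3, 3, 4, 5, 7, 8]

As Ercan notes, the copying keeps the whole merge at O(n) even though
each element of L is searched for only once.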
>
> On Nov 8, 12:01 pm, Terence <technic....@gmail.com> wrote:
>
> > No, it is possible.
>
> > This problem is about the number of comparisons each individual
> > element takes part in. The overall time complexity of the merge can
> > be as high as O(nlogn), while the normal merge operation runs in
> > O(n).
>
> > But the normal merge does not meet the requirement: in the worst
> > case (when the largest number in one sequence is smaller than the
> > smallest of the other), a single element can be involved in as many
> > as n comparisons.
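To see that worst case concretely, here is a small Python illustration
(my code, not from the thread) that counts how many comparisons each
element takes part in during the standard two-pointer merge:

    from collections import Counter

    def merge_count(A, B):
        # standard two-pointer merge, tallying comparisons per element
        hits, out, i, j = Counter(), [], 0, 0
        while i < len(A) and j < len(B):
            hits[('A', i)] += 1
            hits[('B', j)] += 1
            if A[i] <= B[j]:
                out.append(A[i]); i += 1
            else:
                out.append(B[j]); j += 1
        out += A[i:] + B[j:]
        return out, hits

    # All of A is below all of B, so B[0] stays put and is compared
    # once against every element of A:
    _, hits = merge_count([1, 2, 3, 4], [5, 6, 7, 8])
    print(hits[('B', 0)])  # 4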
>
> > On 2010-11-8 17:00, Gönenç Ercan wrote:
>
> > > It does not seem possible: there is a proof showing that any
> > > comparison-based sorting algorithm has a worst-case running time
> > > of at least Ω(nlogn). You can check this proof in the CLRS
> > > algorithms book.
>
> > > Using this proof: if the merge operation could be implemented in
> > > O(lgn) total time, then by the master theorem merge sort built on
> > > this merge would satisfy T(n) = 2T(n/2) + O(lgn), which solves to
> > > O(n), contradicting the Ω(nlogn) lower bound.
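Spelled out, the master-theorem step behind that claim (my
formulation, in LaTeX):

    T(n) = 2\,T(n/2) + O(\log n), \qquad n^{\log_b a} = n^{\log_2 2} = n,
    \quad \log n = O(n^{1-\varepsilon}) \ \Longrightarrow\ T(n) = \Theta(n)

This is case 1 of the master theorem: merge sort as a whole would run
in linear time, which the sorting lower bound forbids.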
>
> > > So I think it's not possible to design such a merge algorithm.
>
> > > On Nov 8, 4:58 am, lichenga2404 <lichenga2...@gmail.com> wrote:
> > >> Suppose we'd like to implement a sorting variant where every
> > >> element is compared only a small number of times.
> > >>   a) Devise a divide-and-conquer algorithm to merge 2 sorted
> > >> arrays of length n which guarantees that every element is
> > >> included in at most O(log n) comparisons.
> > >>   b) Using this modified merge, prove that Merge-Sort will
> > >> include every element in at most O((log n)^2) comparisons.
