Hey, a related question: how would you find the median of billions of
numbers when the data doesn't all fit in memory at the same time?
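One standard answer (a sketch I'm adding, not something from the thread): if the values are integers in a known range, you can binary-search on the median's *value*, making one counting pass over the file per step. Memory stays O(1) regardless of file size, at the cost of O(log(range)) passes. The function and parameter names here are illustrative; `read_numbers` stands in for whatever re-reads the file.

```python
def external_median(read_numbers, lo, hi):
    """Find the lower median of a large multiset of integers in [lo, hi]
    without holding the data in memory.

    read_numbers is a callable that re-reads the file and yields every
    number; we binary-search on value, with one counting pass per step.
    """
    n = sum(1 for _ in read_numbers())          # first pass: total count
    target = (n + 1) // 2                       # rank of the lower median
    while lo < hi:
        mid = (lo + hi) // 2
        # Count how many values fall at or below the candidate median.
        count = sum(1 for x in read_numbers() if x <= mid)
        if count >= target:
            hi = mid                            # median is <= mid
        else:
            lo = mid + 1                        # median is > mid
    return lo
```

For 64-bit integers this is at most ~64 passes; if re-reading is too expensive, a single bucketing pass (histogram over value ranges, then recurse into the bucket containing the median) trades memory for passes.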

On Mon, Dec 12, 2011 at 6:41 AM, bharath sriram <bharath.sri...@gmail.com>wrote:

> Hey group,
>
> This is a somewhat clichéd question, but given a file with a billion
> numbers and the task of computing the 'k' largest numbers from it, which
> approach is preferred?
> 1) Using a heap
> 2) Using the median-of-medians algorithm
>
> I have read a few links that prefer heaps, but the median-of-medians
> algorithm clearly has linear time complexity, so I don't see how it is
> any worse, if not better, than using a heap.
> Any thoughts?
>
>  --
> You received this message because you are subscribed to the Google Groups
> "Algorithm Geeks" group.
> To post to this group, send email to algogeeks@googlegroups.com.
> To unsubscribe from this group, send email to
> algogeeks+unsubscr...@googlegroups.com.
> For more options, visit this group at
> http://groups.google.com/group/algogeeks?hl=en.
>
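For what it's worth, the heap approach the question mentions is usually preferred here because it is single-pass and needs only O(k) memory, whereas median-of-medians (selection) is O(n) time but assumes the data is random-access in memory. A minimal sketch of the min-heap version (my own illustration, not from the thread):

```python
import heapq

def k_largest(stream, k):
    """Return the k largest values from an arbitrarily long stream.

    A min-heap holds the k largest values seen so far; its root is the
    smallest of them, so any incoming value larger than the root evicts
    it. Memory is O(k), time is O(n log k).
    """
    heap = []
    for x in stream:
        if len(heap) < k:
            heapq.heappush(heap, x)
        elif x > heap[0]:
            heapq.heapreplace(heap, x)  # pop current smallest, push x
    return sorted(heap, reverse=True)
```

Since it reads the input as a stream, it works even when the billion numbers live in a file that can't be loaded at once.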



-- 
Thanks and Regards,
Sumit Mahamuni.

-- Slow code that scales better can be faster than fast code that doesn't
scale!
-- Tough times never last, but tough people do.
-- I love deadlines. I like the whooshing sound they make as they fly by. -
D. Adams
