If the interviewer actually said 10^80 data items, he/she was
expecting you to say it's a silly question.

Do you realize how much 10^80 is?  A terabyte is 10^12 bytes, so we
are talking about 10^68 one-terabyte disk drives just to hold
one-byte records.
The number of grains of sand that would make up the volume of the
Earth is only about 10^32.  It's fair to say that at least for a few
decades there won't be enough storage capacity in the world to hold
10^80 data items.
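
A quick sanity check on the powers of ten (plain Python, nothing
assumed beyond the figures above):

    # 10^80 one-byte records / 10^12 bytes per 1 TB drive
    drives = 10**80 // 10**12
    print(drives == 10**68)   # True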

If the interviewer actually said 10^8, we have 100 million records.
This is more reasonable.  The interviewer probably wanted you to ask
how big the records are.  If small, you can do a regular sort in
memory.  If large, you could sort a list of keys or disk offsets.
External sorting starts to be the right technique at around 10^9 or
10^10 records, depending on how much main memory you have available.
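
Here is a rough sketch of the key/offset idea in Python. The fixed
record size and the 8-byte key at the front are assumptions for
illustration, not anything from the question:

    import struct

    RECORD_SIZE = 128   # assumed fixed-size records
    KEY_FMT = ">q"      # assumed 8-byte big-endian key at offset 0

    def sort_offsets(path):
        # Read only the keys, remembering each record's byte offset.
        index = []
        with open(path, "rb") as f:
            offset = 0
            while True:
                rec = f.read(RECORD_SIZE)
                if len(rec) < RECORD_SIZE:
                    break
                (key,) = struct.unpack_from(KEY_FMT, rec)
                index.append((key, offset))
                offset += RECORD_SIZE
        index.sort()                      # small (key, offset) pairs
        return [off for _, off in index]  # read records in this order

The sort itself only ever touches about 16 bytes per record, so the
records can be far larger than the index that actually gets sorted.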

As for tools, GNU sort does an external merge sort when the data are
too big to fit in memory.  At least it did about 10 years ago when I
looked at the source code.
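
The idea is simple enough to sketch: cut the input into runs that fit
in memory, sort each run to a temporary file, then do a k-way merge
of the runs. A minimal version for newline-terminated text, with
max_lines as my stand-in for a byte-based memory limit:

    import heapq
    import tempfile

    def external_sort(src, dst, max_lines=100_000):
        # Pass 1: split the input into sorted runs that fit in memory.
        runs = []
        with open(src) as f:
            while True:
                chunk = [line for _, line in zip(range(max_lines), f)]
                if not chunk:
                    break
                chunk.sort()
                run = tempfile.TemporaryFile("w+")
                run.writelines(chunk)
                run.seek(0)          # rewind so the merge can read it
                runs.append(run)
        # Pass 2: stream a k-way merge of the runs (O(#runs) memory).
        with open(dst, "w") as out:
            out.writelines(heapq.merge(*runs))
        for run in runs:
            run.close()

GNU sort exposes the same knobs directly: -S/--buffer-size sets the
in-memory run size and -T sets where the temporary runs go.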

On Jan 14, 1:09 pm, Abhishek Goswami <zeal.gosw...@gmail.com> wrote:
> Hi,
> Could anyone point me to an algorithm (and a program) for sorting
> very large data, like 10^80 items, under a memory constraint?
> Suppose you have very little memory, say 4 MB.
>
> I am not sure whether this algorithm has been discussed here
> before, but I was not able to find it in this group.
>
> Thanks
> Abhishek
