Hi, my friends:

         I have been researching Hadoop and MapReduce, but before I can go on
I have one question, and I can't find it in the FAQ. Please consider this situation:

1.       I created 100 files, each of course bigger than the default
64 MB block size (such as 1 GB), so each will definitely be split into many blocks in HDFS.

2.       When handling each file, I want to treat the file as
non-splittable, because when I handle one part of a file I need all of the other
parts.

So my question is this:

If I return "false" from the "isSplitable" function to tell the framework that
the file is non-splittable, how many map tasks will I have when running
MapReduce? Will I get 100 maps, with each one handling a file? Or do you have any
suggestions on how to handle this kind of situation?
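
To make the question concrete, this is roughly what I have in mind (a minimal
sketch against the new org.apache.hadoop.mapreduce API; the class name
NonSplittableTextInputFormat is just my own example):

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

// Whole-file input format: isSplitable always returns false, so the
// framework should build one split per input file.
public class NonSplittableTextInputFormat extends TextInputFormat {
    @Override
    protected boolean isSplitable(JobContext context, Path file) {
        return false; // never split, no matter how many HDFS blocks the file has
    }
}

I would then register it on the job with
job.setInputFormatClass(NonSplittableTextInputFormat.class);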

Thanks!

Best wishes,

Yours 

Wang LiChuan

2011-2-14
