Hey Brock, do you have working code? What I have is giving a lot of errors!
On Thu, Oct 13, 2011 at 4:29 PM, Brock Noland wrote:
> Hi,
>
> The code is very similar: just create a SequenceFile reader.
>
> Brock
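To make Brock's suggestion concrete, here is a minimal sketch of reading small files back out of a SequenceFile with the old (0.20/1.x era) API. The path and the Text/BytesWritable key-value types are assumptions; use whatever types the file was written with.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SeqFileRead {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path path = new Path("/user/demo/files.seq"); // hypothetical path
        SequenceFile.Reader reader =
                new SequenceFile.Reader(FileSystem.get(conf), path, conf);
        try {
            Text key = new Text();                     // original file name
            BytesWritable value = new BytesWritable(); // original file contents
            while (reader.next(key, value)) {
                // each next() call retrieves one packed small file
                System.out.println(key + " -> " + value.getLength() + " bytes");
            }
        } finally {
            reader.close();
        }
    }
}
```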
>
> On Thu, Oct 13, 2011 at 4:53 AM, visioner sadak
>
I think Yahoo is working on next-gen MapReduce; have you seen the links
below?
http://developer.yahoo.com/blogs/hadoop/posts/2011/02/mapreduce-nextgen/
http://developer.yahoo.com/blogs/hadoop/posts/2011/03/mapreduce-nextgen-scheduler/
http://developer.yahoo.com/blogs/hadoop/posts/2011/02/capac
and
> those files can be split up; they can be split over multiple key-value
> pairs.
>
> Brock
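The point above (many small files, each becoming its own key-value pair) can be sketched like this; the local directory, output path, and Text/BytesWritable types are assumptions, not anything from the original thread.

```java
import java.io.File;
import java.nio.file.Files;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SeqFilePack {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path out = new Path("/user/demo/files.seq"); // hypothetical output
        SequenceFile.Writer writer = SequenceFile.createWriter(
                fs, conf, out, Text.class, BytesWritable.class);
        try {
            // hypothetical local directory of small files
            for (File f : new File("/tmp/smallfiles").listFiles()) {
                byte[] data = Files.readAllBytes(f.toPath());
                // key = original file name, value = raw contents;
                // one key-value pair per small file
                writer.append(new Text(f.getName()), new BytesWritable(data));
            }
        } finally {
            writer.close();
        }
    }
}
```

Since the SequenceFile itself is splittable, MapReduce can later process those pairs in parallel without the small-files overhead.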
>
> On Wed, Oct 12, 2011 at 4:50 PM, visioner sadak
> wrote:
>
>> Hello guys,
>>
>> Thanks a lot again for your previous guidance, guys. I tried out
>
combining the files into one big one, but I don't know how to split and
retrieve the small files again while reading.
Thanks and Gratitude
On Wed, Oct 5, 2011 at 1:27 AM, visioner sadak wrote:
> Thanks a lot, Wellington and Bejoy, for your inputs. I will try out this API
> and SequenceFile.
>
> when writing to an OutputStream. It will then be replicated in your
> cluster's HDFS.
>
> Cheers.
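As a sketch of the OutputStream approach mentioned above: `FileSystem.create()` hands back a stream, and HDFS replicates the blocks (per `dfs.replication`) as you write. The target path is a made-up example.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWrite {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // create() returns an FSDataOutputStream; HDFS replicates the
        // blocks across datanodes as data is written
        FSDataOutputStream out = fs.create(new Path("/user/demo/hello.txt")); // hypothetical path
        try {
            out.writeBytes("hello hdfs\n");
        } finally {
            out.close();
        }
    }
}
```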
>
> 2011/10/4 visioner sadak :
> > Hey, thanks Wellington. Just a thought: will my data be replicated as
> > well? Because I thought that the mapper does the job of breaking
che/hadoop/fs/FileSystem.html
> )
> and Path (
> http://hadoop.apache.org/common/docs/current/api/index.html?org/apache/hadoop/fs/Path.html
> )
> classes for that.
>
> Cheers,
> Wellington.
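For the simplest case of Wellington's FileSystem/Path suggestion, a plain upload needs no MapReduce at all. Both paths here are hypothetical.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUpload {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // copies a local file into HDFS; no MapReduce job required
        fs.copyFromLocalFile(new Path("/tmp/local.txt"),        // hypothetical source
                             new Path("/user/demo/local.txt")); // hypothetical destination
    }
}
```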
>
> 2011/10/4 visioner sadak :
> > Hello guys,
> >
Hello guys,
I would like to know how to do file uploads to HDFS using
Java. Does it have to be done using MapReduce? And if I have a large number
of small files, should I use a SequenceFile along with MapReduce? It would
be great if you could provide some sort of information...