Modify the write policy of HDFS

2013-11-02 Thread Karim Awara
Hi, I understand how file upload works on HDFS: the client asks the namenode to allocate a block (64 MB by default) and writes chunks of the file to HDFS through a pipeline of datanodes. I want to change the HDFS source code so that a datanode can have multiple pipelines open in parallel, where I push the data to a pipeline based …
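For context, the normal write path is driven by the client: it asks the namenode for a block and streams packets through the datanode pipeline the namenode returns. Below is a minimal sketch of that path via the Java FileSystem API; the namenode address and output path are hypothetical, and the dfs.blocksize key is the Hadoop 2.x name (older releases used dfs.block.size).

import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical namenode address; replace with your cluster's fs.defaultFS.
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");
        // Block size is a client-side, per-file setting (64 MB was the old default).
        conf.setLong("dfs.blocksize", 64L * 1024 * 1024);

        try (FileSystem fs = FileSystem.get(conf);
             FSDataOutputStream out = fs.create(new Path("/user/karim/sample.txt"))) {
            // The client-side output stream splits these bytes into packets and
            // streams them through the datanode pipeline chosen for each block.
            out.write("hello hdfs\n".getBytes(StandardCharsets.UTF_8));
        }
    }
}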

Re: Implementing a custom hadoop key and value - need Help

2013-11-02 Thread Amr Shahin
Can you share the code? Sent from mobile. On Nov 1, 2013 7:06 AM, "unmesha sreeveni" wrote: > Thanks Steve Loughran and Amr Shahin. > Amr Shahin, I referred to http://my.safaribooksonline.com/book/databases/hadoop/9780596521974/serialization/id3548156 > the same thing only, but my toString is …
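The poster's code is not shown in the archive. As a reference point, here is a minimal sketch of a custom key type implementing WritableComparable with the usual write/readFields pair and a toString() override; the class name and fields are hypothetical, not the poster's actual code.

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.WritableComparable;

// Minimal composite key: two fields, serialized and compared in a fixed order.
public class PointWritable implements WritableComparable<PointWritable> {
    private long x;
    private long y;

    public PointWritable() {}                          // required no-arg constructor
    public PointWritable(long x, long y) { this.x = x; this.y = y; }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeLong(x);
        out.writeLong(y);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        x = in.readLong();
        y = in.readLong();
    }

    @Override
    public int compareTo(PointWritable other) {
        int cmp = Long.compare(x, other.x);
        return cmp != 0 ? cmp : Long.compare(y, other.y);
    }

    @Override
    public int hashCode() { return (int) (31 * x + y); }  // keeps partitioning consistent with equals

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof PointWritable)) return false;
        PointWritable p = (PointWritable) o;
        return x == p.x && y == p.y;
    }

    // TextOutputFormat calls toString() when writing the key to output files.
    @Override
    public String toString() { return x + "\t" + y; }
}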

Re: Path exception when running from inside IDE.

2013-11-02 Thread Vinayakumar B
Is core-site.xml on your Eclipse classpath? The directory that contains the site XMLs should be on the classpath, not the XML files directly. Make sure fs.defaultFS points to the correct HDFS path. Regards, Vinayakumar B On Nov 2, 2013 5:21 PM, "Harsh J" wrote: > Your job configuration isn't picking up …
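In code, the advice amounts to one of the options below when running from an IDE; the conf directory and namenode address are hypothetical placeholders.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class IdeConfigExample {
    public static Configuration buildConf() {
        Configuration conf = new Configuration();
        // Option 1: rely on the classpath. If the directory holding core-site.xml
        // (e.g. /etc/hadoop/conf) is on the IDE's classpath, new Configuration()
        // picks up fs.defaultFS automatically.

        // Option 2: load the site file explicitly (path is hypothetical).
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));

        // Option 3: set the default filesystem by hand (host/port are hypothetical).
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");
        return conf;
    }
}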

Re: Path exception when running from inside IDE.

2013-11-02 Thread Harsh J
Your job configuration isn't picking up or passing the right default filesystem (fs.default.name or fs.defaultFS) before submitting the job. As a result, the unconfigured default, the local filesystem, is picked up for paths you intended to resolve on HDFS. On Friday, November 1, 2013, Oma …
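A quick sanity check of what the configuration actually resolves before the job is submitted, as a sketch; the namenode URI and input path below are hypothetical.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckDefaultFs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // If this prints "file:///", the site XMLs were not found and paths
        // like /user/foo/input will resolve against the local filesystem.
        System.out.println("fs.defaultFS = " + conf.get("fs.defaultFS"));
        System.out.println("resolved to  = " + FileSystem.get(conf).getUri());

        // Fully qualifying a path sidesteps the default filesystem entirely
        // (the namenode URI is hypothetical).
        Path input = new Path("hdfs://namenode.example.com:8020/user/foo/input");
        System.out.println("input FS     = " + input.getFileSystem(conf).getUri());
    }
}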