Hi All,
I have just started learning the fundamentals of HDFS and its internal
mechanisms. The concepts are very impressive and look simple, but I still
find them confusing. My question is: *who is responsible for handling a
DFS write failure in the pipeline* (assume the replication factor is 3 and
the 2nd DN fails)?
>
> This two-part blog post series from Yongjun should help you understand
> the HDFS file write recovery process better:
> http://blog.cloudera.com/blog/2015/02/understanding-hdfs-recovery-processes-part-1/
> and
> http://blog.cloudera.com/blog/2015/03/understanding-hdfs-rec
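
To answer the "who" directly: the writing *client* detects the failure and rebuilds the pipeline around the failed datanode, and the NameNode later schedules re-replication to restore the target factor. A rough illustrative simulation of that flow (this is a sketch, not actual HDFS code; all names here are hypothetical):

```python
# Sketch of client-driven pipeline recovery: the client drops the failed
# DN, keeps writing through the survivors, and the NameNode tops the
# pipeline back up to the replication factor afterwards.

def write_block(pipeline, namenode_live_datanodes, replication=3):
    """Write one block through a replication pipeline, dropping failed DNs."""
    acked = []
    for dn in list(pipeline):
        if dn.get("failed"):
            # The client removes the failed DN and continues with the
            # remaining DNs (under a new generation stamp in real HDFS).
            pipeline.remove(dn)
        else:
            acked.append(dn["name"])
    # The write succeeds as long as at least one DN acked; the NameNode
    # later re-replicates to restore the target replication factor.
    missing = replication - len(pipeline)
    for spare in namenode_live_datanodes:
        if missing <= 0:
            break
        if spare not in pipeline:
            pipeline.append(spare)
            missing -= 1
    return acked, [dn["name"] for dn in pipeline]

pipeline = [{"name": "dn1"}, {"name": "dn2", "failed": True}, {"name": "dn3"}]
spares = [{"name": "dn4"}]
acked, final = write_block(pipeline, spares)
print(acked)  # ['dn1', 'dn3']
print(final)  # ['dn1', 'dn3', 'dn4']
```

The blog posts above cover the real mechanics (generation stamps, lease and block recovery) in detail.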
I believe any third-party jars need to be placed in the lib folder under
the Hadoop installation path.
On Tue, Sep 22, 2015 at 5:55 PM, xeonmailinglist wrote:
> Hi,
>
> I want to execute `hadoop jar myexample.jar`, but I also need the Spring
> Framework jars. How can I add the Spring Framework jars to the Hadoop
> classpath?
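
For reference, two common ways to make extra jars visible to a Hadoop job (a sketch only: the paths, jar names, and main class below are hypothetical, so adjust them to your installation):

```shell
# Hypothetical location of the Spring jars.
SPRING_JARS=/opt/spring/lib

# Option 1: add the jars to the client-side classpath.
export HADOOP_CLASSPATH="${SPRING_JARS}/*:${HADOOP_CLASSPATH}"
# hadoop jar myexample.jar com.example.Main

# Option 2: ship the jars to the cluster with -libjars; note the driver
# class must parse generic options (e.g. via ToolRunner) for this flag
# to take effect.
# hadoop jar myexample.jar com.example.Main -libjars "${SPRING_JARS}/spring-core.jar"
```

Option 1 only affects the client JVM; -libjars is what distributes the jars to the task nodes.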