Here is an answer to your question in an old mail archive: http://lucene.472066.n3.nabble.com/pipe-application-error-td650185.html
On 3/31/11 10:15 AM, "Adarsh Sharma" <adarsh.sha...@orkash.com> wrote:

Any update on the error below? Please guide.

Thanks & best regards,
Adarsh Sharma

Adarsh Sharma wrote:
> Dear all,
>
> Today I faced a problem while running a map-reduce job in C++. I am
> not able to find the reason for the error below:
>
> 11/03/30 12:09:02 INFO mapred.JobClient: Task Id : attempt_201103301130_0011_m_000000_0, Status : FAILED
> java.io.IOException: pipe child exception
>         at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
>         at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>         at org.apache.hadoop.mapred.Child.main(Child.java:170)
> Caused by: java.io.EOFException
>         at java.io.DataInputStream.readByte(DataInputStream.java:250)
>         at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
>         at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
>         at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)
>
> attempt_201103301130_0011_m_000000_0: Hadoop Pipes Exception: failed
> to open at wordcount-nopipe.cc:82 in
> WordCountReader::WordCountReader(HadoopPipes::MapContext&)
>
> [The same "pipe child exception" stack trace and "failed to open"
> message repeat for attempts m_000001 and m_000002.]
>
> I tried to run the *wordcount-nopipe.cc* program in the
> */home/hadoop/project/hadoop-0.20.2/src/examples/pipes/impl* directory.
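The "failed to open" message points at the real problem: wordcount-nopipe implements its own C++ record reader, and its constructor opens the input split's filename with plain C file I/O, which only understands local filesystem paths. A minimal sketch of that failure mode (the paths here are illustrative, not from the original post):

```cpp
#include <cstdio>

// wordcount-nopipe's WordCountReader opens its input split's filename
// with local C stdio. A path the local filesystem knows can be opened;
// an hdfs:// URI cannot, so fopen fails -- which is what Pipes reports
// as "Hadoop Pipes Exception: failed to open".
bool can_open_locally(const char* path) {
    std::FILE* f = std::fopen(path, "r");
    if (f == nullptr) {
        return false;  // open failed, as it does for non-local URIs
    }
    std::fclose(f);
    return true;
}
```

This is why the job dies in the reader's constructor rather than in the map logic itself: the task never gets past opening its input.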
> make wordcount-nopipe
> bin/hadoop fs -put wordcount-nopipe bin/wordcount-nopipe
> bin/hadoop pipes -D hadoop.pipes.java.recordreader=true \
>     -D hadoop.pipes.java.recordwriter=true \
>     -input gutenberg -output gutenberg-out11 -program bin/wordcount-nopipe
>
> or
>
> bin/hadoop pipes -D hadoop.pipes.java.recordreader=false \
>     -D hadoop.pipes.java.recordwriter=false \
>     -input gutenberg -output gutenberg-out11 -program bin/wordcount-nopipe
>
> but the error remains the same. I have attached my Makefile as well;
> please share your comments on it.
>
> I am able to run a simple wordcount.cpp program on the Hadoop cluster,
> but I don't know why this program fails with a broken-pipe error.
>
> Thanks & best regards,
> Adarsh Sharma
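For reference, the fix usually suggested for this example: because wordcount-nopipe supplies its own C++ record reader and writer, both Java-side flags must be false, and the input and output must be `file:` URIs that the C++ code can open directly with local I/O. A sketch of such an invocation (the local paths are illustrative assumptions, not from the original post):

```shell
# wordcount-nopipe reads and writes files itself, so disable the Java
# record reader/writer and point it at file: URIs the C++ reader can
# open locally. Paths below are illustrative.
bin/hadoop pipes \
  -D hadoop.pipes.java.recordreader=false \
  -D hadoop.pipes.java.recordwriter=false \
  -input file:///home/hadoop/gutenberg \
  -output file:///home/hadoop/gutenberg-out11 \
  -program bin/wordcount-nopipe
```

Running with `recordreader=true` (as in the first attempt above) hands the C++ reader nothing to construct from, and running against an HDFS path gives it a filename it cannot open, so both variants fail the same way.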