Dear all,

Today I faced a problem while running a map-reduce job in C++. I am not able to understand the reason for the error below:


11/03/30 12:09:02 INFO mapred.JobClient: Task Id : attempt_201103301130_0011_m_000000_0, Status : FAILED
java.io.IOException: pipe child exception
       at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
       at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
       at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.io.EOFException
       at java.io.DataInputStream.readByte(DataInputStream.java:250)
       at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
       at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
       at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)

attempt_201103301130_0011_m_000000_0: Hadoop Pipes Exception: failed to open at wordcount-nopipe.cc:82 in WordCountReader::WordCountReader(HadoopPipes::MapContext&)

11/03/30 12:09:02 INFO mapred.JobClient: Task Id : attempt_201103301130_0011_m_000001_0, Status : FAILED
java.io.IOException: pipe child exception
       at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
       at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
       at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.io.EOFException
       at java.io.DataInputStream.readByte(DataInputStream.java:250)
       at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
       at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
       at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)

attempt_201103301130_0011_m_000001_0: Hadoop Pipes Exception: failed to open at wordcount-nopipe.cc:82 in WordCountReader::WordCountReader(HadoopPipes::MapContext&)

11/03/30 12:09:02 INFO mapred.JobClient: Task Id : attempt_201103301130_0011_m_000002_0, Status : FAILED
java.io.IOException: pipe child exception
       at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
       at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
       at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.io.EOFException
       at java.io.DataInputStream.readByte(DataInputStream.java:250)
       at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
       at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
       at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)

attempt_201103301130_0011_m_000002_1: Hadoop Pipes Exception: failed to open at wordcount-nopipe.cc:82 in WordCountReader::WordCountReader(HadoopPipes::MapContext&)

11/03/30 12:09:15 INFO mapred.JobClient: Task Id : attempt_201103301130_0011_m_000000_2, Status : FAILED
java.io.IOException: pipe child exception
at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151) at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)

I am trying to run the *wordcount-nopipe.cc* program from the */home/hadoop/project/hadoop-0.20.2/src/examples/pipes/impl* directory.
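Looking at the message, the failure seems to come from the reader's constructor. If I read the shipped wordcount-nopipe.cc correctly, line 82 is roughly the assertion after the fopen() call (paraphrased from my copy of the example, so details may differ slightly):

    WordCountReader::WordCountReader(HadoopPipes::MapContext& context) {
      std::string filename;
      // the filename is deserialized from the raw input split bytes
      HadoopUtils::StringInStream stream(context.getInputSplit());
      HadoopUtils::deserializeString(filename, stream);
      // ...
      file = fopen(filename.c_str(), "rt");
      // this assertion is what prints "failed to open ..."
      HADOOP_ASSERT(file != NULL, "failed to open " + filename);
    }

So it looks like the C++ reader takes the raw input split and tries to fopen() it directly on the task node.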


make  wordcount-nopipe
bin/hadoop fs -put wordcount-nopipe   bin/wordcount-nopipe
bin/hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input gutenberg -output gutenberg-out11 -program bin/wordcount-nopipe

or

bin/hadoop pipes -D hadoop.pipes.java.recordreader=false -D hadoop.pipes.java.recordwriter=false -input gutenberg -output gutenberg-out11 -program bin/wordcount-nopipe
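I tried both variants because, as far as I can tell from the example's main() (quoting from memory, so the exact template arguments may be off), wordcount-nopipe registers its own C++ record reader and writer:

    int main(int argc, char *argv[]) {
      // the nopipe example supplies its own reader/writer classes, which is
      // (as I understand it) what the hadoop.pipes.java.recordreader and
      // hadoop.pipes.java.recordwriter properties are meant to switch on or off
      return HadoopPipes::runTask(
          HadoopPipes::TemplateFactory<WordCountMap, WordCountReduce,
                                       void, void,
                                       WordCountReader, WordCountWriter>());
    }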

With either command the error remains the same. I have also attached my Makefile (below); please have a look and let me know if you see anything wrong with it.

I am able to run a simple wordcount.cpp program on the Hadoop cluster, but I don't understand why this program fails with a broken-pipe error.
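For comparison, the simple version that does run is essentially the stock Pipes word-count mapper, which relies on the Java-side record reader to hand it lines (sketched from memory, so treat it as approximate):

    #include "hadoop/Pipes.hh"
    #include "hadoop/StringUtils.hh"

    class WordCountMapper : public HadoopPipes::Mapper {
    public:
      WordCountMapper(HadoopPipes::TaskContext& context) {}
      void map(HadoopPipes::MapContext& context) {
        // the Java record reader supplies each line as the input value;
        // emit (word, "1") for every whitespace-separated token
        std::vector<std::string> words =
            HadoopUtils::splitString(context.getInputValue(), " ");
        for (size_t i = 0; i < words.size(); ++i) {
          context.emit(words[i], "1");
        }
      }
    };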



Thanks & best regards
Adarsh Sharma

------------------------------------------------------------------------

CC = g++
HADOOP_INSTALL =/home/hadoop/project/hadoop-0.20.2
PLATFORM = Linux-amd64-64
CPPFLAGS = -m64 -I/home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/include -I/usr/local/cuda/include

wordcount-nopipe : wordcount-nopipe.cc
        $(CC) $(CPPFLAGS) $< -Wall \
        -L/home/hadoop/project/hadoop-0.20.2/c++/Linux-amd64-64/lib \
        -L/usr/local/cuda/lib64 -lhadooppipes \
        -lhadooputils -lpthread -g -O2 -o $@

