Razen Al Harbi wrote:
Hi all,
I am writing an application in which I fork a process to execute a
specific Map/Reduce job. The problem is that when I try to read the output
stream of the forked process I get nothing, yet when I execute the same job
manually from the command line it prints the output I expect. For clarification,
here is the code snippet:
Process p = Runtime.getRuntime().exec("hadoop jar GraphClean args");
BufferedReader reader = new BufferedReader(new
InputStreamReader(p.getInputStream()));
String line = null;
boolean check = true;
while (check) {
    line = reader.readLine();
    if (line != null) { // I know this will not finish; it's only for testing.
        System.out.println(line);
    }
}
If I run this code, nothing shows up. But if I execute the command (hadoop jar
GraphClean args) from the command line, it works fine. I am using Hadoop 0.19.0.
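One likely cause (an assumption on my part, not something the thread states): the
hadoop launcher prints its job progress to stderr, while p.getInputStream() only
reads the child's stdout, so the loop above sees nothing. A sketch that merges the
two streams with ProcessBuilder.redirectErrorStream — the class name is made up,
and main() uses a stand-in shell command in place of the real hadoop invocation:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ForkedJobRunner {
    // Runs a command with stderr merged into stdout and returns everything
    // the child printed. If the tool logs to stderr (as hadoop's progress
    // output does), reading only getInputStream() would show nothing.
    public static String runAndCapture(String... command)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true); // merge stderr into stdout
        Process p = pb.start();
        StringBuilder sb = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        p.waitFor(); // reap the child instead of looping forever
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in command that writes to both streams; in the real
        // application this would be "hadoop", "jar", "GraphClean", "args".
        System.out.print(runAndCapture("sh", "-c", "echo out; echo err 1>&2"));
    }
}
```

Note that readLine() returns null only when the stream closes, so this loop also
terminates once the child exits, unlike the busy loop in the original snippet.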
Why not just invoke the Hadoop job-submission calls yourself? There's no need
to exec anything.
Look at org.apache.hadoop.util.RunJar to see what you need to do.
Avoid calling RunJar.main() directly, because:
- it calls System.exit() when it wants to exit with an error
- it adds shutdown hooks
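An in-process submission along the lines Steve suggests might look roughly like
the sketch below, using the 0.19-era org.apache.hadoop.mapred API. This is an
assumption-laden illustration: the job name, mapper/reducer configuration, and
input/output paths of GraphClean are not shown in the thread, so they are
placeholders here. (It is not runnable without the Hadoop jars on the classpath.)

```java
// Hypothetical in-process submission with the old "mapred" API (Hadoop 0.19).
// The job configuration below is a placeholder, not GraphClean's actual setup.
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;

public class InProcessSubmit {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(InProcessSubmit.class);
        conf.setJobName("GraphClean");
        // conf.setMapperClass(...); conf.setReducerClass(...); etc.
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        // Blocks until the job completes; progress goes to this JVM's own
        // stdout/stderr -- no child process and no stream plumbing needed.
        RunningJob job = JobClient.runJob(conf);
        System.out.println("Job succeeded: " + job.isSuccessful());
    }
}
```

Since this runs in your own JVM, you also avoid the RunJar.main() pitfalls above:
nothing calls System.exit() behind your back, and no extra shutdown hooks are added.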
-steve