Hi,

Is that command available on all nodes? Did you try something like the below? ;)

Process proc = rt.exec("/bin/hostname");
..
output.collect(hostname, diskUsage);
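
Here's a rough, untested sketch of what I mean, using the old mapred API since
you are on 0.19. The class name NodeInfoMapper and the "df -h /" command are
just examples I made up; adjust them to whatever per-node info you actually
want to collect:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class NodeInfoMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, Text> {

  // Run a shell command on the node this task runs on and join its output
  // into a single line.
  private String run(String[] cmd) throws IOException {
    Process proc = Runtime.getRuntime().exec(cmd);
    BufferedReader reader =
        new BufferedReader(new InputStreamReader(proc.getInputStream()));
    StringBuilder out = new StringBuilder();
    try {
      String line;
      while ((line = reader.readLine()) != null) {
        if (out.length() > 0) out.append(" | ");
        out.append(line.trim());
      }
    } finally {
      reader.close();
    }
    return out.toString();
  }

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, Text> output, Reporter reporter)
      throws IOException {
    // Emit hostname -> disk usage for the node that processed this record.
    String hostname = run(new String[] { "/bin/hostname" });
    String diskUsage = run(new String[] { "df", "-h", "/" });
    output.collect(new Text(hostname), new Text(diskUsage));
  }
}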

On Tue, Apr 28, 2009 at 6:13 PM, Razen Al Harbi <razen.alha...@yahoo.com> wrote:
> Hi all,
>
> I am writing an application in which I create a forked process to execute a
> specific Map/Reduce job. The problem is that when I try to read the output
> stream of the forked process I get nothing, but when I execute the same job
> manually it prints the output I am expecting. For clarification, here is the
> simple code snippet:
>
>
> Process p = rt.exec("hadoop jar GraphClean args");
> BufferedReader reader = new BufferedReader(new 
> InputStreamReader(p.getInputStream()));
> String line = null;
> boolean check = true;
> while (check) {
>     line = reader.readLine();
>     if (line != null) { // I know this will not finish; it's only for testing.
>         System.out.println(line);
>     }
> }
>
> If I run this code, nothing shows up. But if I execute the command (hadoop jar
> GraphClean args) from the command line, it works fine. I am using hadoop
> 0.19.0.
>
> Thanks,
>
> Razen
>
>
>
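
One more thought on the snippet in your mail: readLine() on getInputStream()
only sees the child's stdout, and (if I remember right) the hadoop script
sends much of its console output, including the job client's progress lines,
to stderr via log4j. So it may be worth reading stderr too, or merging the two
streams. A rough, untested sketch of that pattern, reusing the command from
your mail:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ForkHadoopJob {
  public static void main(String[] args) throws IOException, InterruptedException {
    // Build the same command as in the mail, but merge stderr into stdout so
    // nothing the hadoop script prints is missed.
    ProcessBuilder pb = new ProcessBuilder("hadoop", "jar", "GraphClean", "args");
    pb.redirectErrorStream(true);
    Process p = pb.start();

    BufferedReader reader =
        new BufferedReader(new InputStreamReader(p.getInputStream()));
    String line;
    // readLine() returns null once the child closes its output, i.e. exits.
    while ((line = reader.readLine()) != null) {
      System.out.println(line);
    }
    reader.close();

    int exitCode = p.waitFor();
    System.out.println("hadoop jar exited with " + exitCode);
  }
}

The loop ends when readLine() returns null, i.e. when the child exits, and
waitFor() then gives you the exit code.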



-- 
Best Regards, Edward J. Yoon @ NHN, corp.
edwardy...@apache.org
http://blog.udanax.org
