What happens if you try this:

$ hadoop fs -rmr HDFSPATH/output ; hadoop pipes \
    -D hadoop.pipes.executable=EXECUTABLE \
    -D hadoop.pipes.java.recordreader=true \
    -D hadoop.pipes.java.recordwriter=true \
    -input HDFSPATH/input -output HDFSPATH/output
Deleted hdfs://mainclusternn.hipods.ihost.com/HDFSPATH/output
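For what it's worth, as far as I can tell -program and -D hadoop.pipes.executable=... end up in the same configuration property, and the Pipes Submitter then distributes that file to the task nodes through the DistributedCache, so the path is resolved against HDFS either way.  It's probably also worth double-checking that the copy in HDFS carries the execute bit before resubmitting; a minimal check, with HDFSPATH/EXECUTABLE as a placeholder for your actual path:

$ hadoop fs -chmod 755 HDFSPATH/EXECUTABLE
$ hadoop fs -ls HDFSPATH/EXECUTABLE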

On Tue, 2010-03-30 at 15:05 -0700, Keith Wiley wrote:
> $ hadoop fs -rmr HDFSPATH/output ; hadoop pipes -D 
> hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true 
> -input HDFSPATH/input -output HDFSPATH/output -program HDFSPATH/EXECUTABLE
> Deleted hdfs://mainclusternn.hipods.ihost.com/HDFSPATH/output
> 10/03/30 14:56:55 WARN mapred.JobClient: No job jar file set.  User classes 
> may not be found. See JobConf(Class) or JobConf#setJar(String).
> 10/03/30 14:56:55 INFO mapred.FileInputFormat: Total input paths to process : 
> 1
> 10/03/30 14:57:05 INFO mapred.JobClient: Running job: job_201003241650_1076
> 10/03/30 14:57:06 INFO mapred.JobClient:  map 0% reduce 0%
> ^C
> $
> 
> At that point the terminal hung, so I eventually Ctrl-C'd to break it.  Now 
> if I investigate the Hadoop task logs for the mapper, I see this:
> 
> stderr logs
> bash: 
> /data/disk2/hadoop/mapred/local/taskTracker/archive/mainclusternn.hipods.ihost.com/uwphysics/kwiley/mosaic/c++_bin/Mosaic/Mosaic:
>  cannot execute binary file
> 
> ...which makes perfect sense in light of the following:
> 
> $ hd fs -ls /uwphysics/kwiley/mosaic/c++_bin
> Found 1 items
> -rw-r--r--   1 kwiley uwphysics     211808 2010-03-30 10:26 
> /uwphysics/kwiley/mosaic/c++_bin/Mosaic
> $ hd fs -chmod 755 /uwphysics/kwiley/mosaic/c++_bin/Mosaic
> $ hd fs -ls /uwphysics/kwiley/mosaic/c++_bin
> Found 1 items
> -rw-r--r--   1 kwiley uwphysics     211808 2010-03-30 10:26 
> /uwphysics/kwiley/mosaic/c++_bin/Mosaic
> $
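Note that the listing above still shows -rw-r--r-- even after the chmod, so the mode change doesn't appear to have taken effect.  Assuming "hd" is simply an alias for the hadoop client, it may be worth retrying with the symbolic form of -chmod (which I believe FsShell also accepts) and confirming the execute bits actually flip before resubmitting:

$ hadoop fs -chmod a+x /uwphysics/kwiley/mosaic/c++_bin/Mosaic
$ hadoop fs -ls /uwphysics/kwiley/mosaic/c++_bin

The -ls output should then show -rwxr-xr-x rather than -rw-r--r--.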
> 
> Note that this is all in an attempt to run an executable that was uploaded to 
> HDFS in advance.  In this example I am not attempting to run an executable 
> stored on my local machine.  Any attempt to do that results in a 
> file-not-found error:
> 
> $ hadoop fs -rmr HDFSPATH/output ; hadoop pipes -D 
> hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true 
> -input HDFSPATH/input -output HDFSPATH/output -program LOCALPATH/EXECUTABLE
> Deleted hdfs://mainclusternn.hipods.ihost.com/uwphysics/kwiley/mosaic/output
> Exception in thread "main" java.io.FileNotFoundException: File does not 
> exist: /Users/kwiley/hadoop-0.20.1+152/Mosaic/clue/Mosaic/src/cpp/Mosaic
>       at 
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:457)
>       at 
> org.apache.hadoop.filecache.DistributedCache.getTimestamp(DistributedCache.java:509)
>       at 
> org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:681)
>       at 
> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:802)
>       at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:771)
>       at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1290)
>       at org.apache.hadoop.mapred.pipes.Submitter.runJob(Submitter.java:248)
>       at org.apache.hadoop.mapred.pipes.Submitter.run(Submitter.java:479)
>       at org.apache.hadoop.mapred.pipes.Submitter.main(Submitter.java:494)
> $
> 
> It's clearly looking for the executable in HDFS, not on the local system, thus 
> the file-not-found error.
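The stack trace above is consistent with that: DistributedCache.getTimestamp() asks the DistributedFileSystem for the file's status, so whatever path is handed to -program gets resolved against the default (HDFS) filesystem rather than the local one.  A minimal workaround sketch, using the same placeholders as above (LOCALPATH, HDFSPATH, EXECUTABLE), is to push the binary into HDFS first and point -program at the HDFS path:

$ hadoop fs -put LOCALPATH/EXECUTABLE HDFSPATH/EXECUTABLE
$ hadoop fs -chmod 755 HDFSPATH/EXECUTABLE
$ hadoop fs -rmr HDFSPATH/output
$ hadoop pipes \
    -D hadoop.pipes.java.recordreader=true \
    -D hadoop.pipes.java.recordwriter=true \
    -input HDFSPATH/input -output HDFSPATH/output \
    -program HDFSPATH/EXECUTABLE

That is essentially what the quoted command at the top of this message already does, so the execute-permission question above seems like the more likely culprit.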
> 
> ________________________________________________________________________________
> Keith Wiley               kwi...@keithwiley.com               
> www.keithwiley.com
> 
> "What I primarily learned in grad school is how much I *don't* know.
> Consequently, I left grad school with a higher ignorance to knowledge ratio 
> than
> when I entered."
>   -- Keith Wiley
> ________________________________________________________________________________
