I tried HadoopDfsReadWriteExample and am getting the following error. I
would appreciate any help; more details are at the end.


Error while copying file
Exception in thread "main" java.io.IOException: Cannot run program
"df": CreateProcess error=2, The system cannot find the file specified
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:459)
        at java.lang.Runtime.exec(Runtime.java:593)
        at java.lang.Runtime.exec(Runtime.java:466)
        at org.apache.hadoop.fs.ShellCommand.runCommand(ShellCommand.java:48)
        at org.apache.hadoop.fs.ShellCommand.run(ShellCommand.java:42)
        at org.apache.hadoop.fs.DF.getAvailable(DF.java:72)
        at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:296)
        at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.createTmpFileForWrite(LocalDirAllocator.java:326)
        at org.apache.hadoop.fs.LocalDirAllocator.createTmpFileForWrite(LocalDirAllocator.java:155)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.newBackupFile(DFSClient.java:1483)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.openBackupStream(DFSClient.java:1450)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.writeChunk(DFSClient.java:1592)
        at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunk(FSOutputSummer.java:140)
        at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:122)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.close(DFSClient.java:1728)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:49)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:64)
        at HadoopDFSFileReadWrite.main(HadoopDFSFileReadWrite.java:106)
Caused by: java.io.IOException: CreateProcess error=2, The system
cannot find the file specified
        at java.lang.ProcessImpl.create(Native Method)
        at java.lang.ProcessImpl.<init>(ProcessImpl.java:81)
        at java.lang.ProcessImpl.start(ProcessImpl.java:30)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
        ... 17 more



Note: I am on a Windows machine, and the namenode is running on the same
machine. This is how I initialize the configuration:

    Configuration conf = new Configuration();
    conf.addResource(new Path("C:\\cygwin\\hadoop-management\\hadoop-conf\\hadoop-site.xml"));
    FileSystem fs = FileSystem.get(conf);
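
For context, here is a minimal sketch of the kind of write that reaches
line 106 of HadoopDFSFileReadWrite.java. The class name, output path, and
payload below are placeholders, not the exact values from the example; the
point is that the exception only surfaces at close(), when (per the stack
trace above) the DFS client stages data through a local backup file and
shells out to "df" to pick a local directory.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class DfsWriteSketch {
        public static void main(String[] args) throws IOException {
            // Same configuration setup as above.
            Configuration conf = new Configuration();
            conf.addResource(new Path("C:\\cygwin\\hadoop-management\\hadoop-conf\\hadoop-site.xml"));
            FileSystem fs = FileSystem.get(conf);

            // Placeholder destination on HDFS.
            Path out = new Path("/tmp/dfs-write-test.txt");
            FSDataOutputStream os = fs.create(out);
            os.writeBytes("hello hdfs\n");

            // close() flushes the checksum buffer; the DFS client then creates a
            // local backup file via LocalDirAllocator, which runs the Unix "df"
            // command -- the call that fails with "CreateProcess error=2" because
            // Windows cannot find df on the PATH.
            os.close();
            fs.close();
        }
    }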


Any suggestions?

Cagdas
