I tried running 0.20.0 on XP too a few weeks ago and got stuck at the
same spot. No problems with standalone mode. Any insight would be
appreciated, thanks.
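
For what it's worth, standalone (local) mode worked for me with roughly
the stock quickstart sequence from the Hadoop docs; this is just a
sketch of what I ran, nothing cluster-specific:

$ mkdir input
$ cp conf/*.xml input
$ bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'
$ cat output/*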

Ed

On Wed, Jan 27, 2010 at 11:41 AM, Yura Taras <yura.ta...@gmail.com> wrote:
> Hi all
> I'm trying to deploy a pseudo-distributed cluster on my devbox, which
> runs under WinXP. I did the following steps:
> 1. Installed cygwin with ssh and configured ssh (a rough sketch of the
> commands I used for steps 1-3 follows step 5)
> 2. Downloaded hadoop and extracted it, set the JAVA_HOME and HADOOP_HOME
> env vars (I made a symlink to the Java home so it doesn't contain spaces)
> 3. Adjusted conf/hadoop-env.sh to point to the correct JAVA_HOME
> 4. Adjusted the conf files to the following values:
>   * core-site.xml:
> <configuration>
>    <property>
>      <name>hadoop.tmp.dir</name>
>      <value>/hdfs/hadoop</value>
>      <description>A base for other temporary directories.</description>
>    </property>
>    <property>
>        <name>fs.default.name</name>
>        <value>hdfs://localhost:8888</value>
>    </property>
> </configuration>
>
>  * hdfs-site.xml:
> <configuration>
>  <property>
>    <name>dfs.replication</name>
>    <value>1</value>
>  </property>
> </configuration>
>
>  * mapred-site.xml:
> <configuration>
>  <property>
>    <name>mapred.job.tracker</name>
>    <value>localhost:9999</value>
>  </property>
> </configuration>
>
> 5. Next I execute the following line: $ bin/hadoop namenode -format &&
> bin/start-all.sh && bin/hadoop fs -put conf input && bin/hadoop jar
> hadoop-0.20.1-examples.jar grep input output 'dfs[a-z.]'
>
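> As mentioned in step 1, here is roughly what the ssh and JAVA_HOME
> setup from steps 1-3 looked like. The jdk path and the /usr/local/jdk
> symlink name are just examples from my box, adjust them to your install:
>
> # passwordless ssh to localhost, so start-all.sh can launch the daemons
> $ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
> $ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
> $ ssh localhost
>
> # space-free symlink to the JDK (example path)
> $ ln -s "/cygdrive/c/Program Files/Java/jdk1.6.0_17" /usr/local/jdk
>
> # and in conf/hadoop-env.sh:
> export JAVA_HOME=/usr/local/jdk
>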
> When I run step 5, I receive the following exception:
> localhost: starting tasktracker, logging to
> /home/ytaras/hadoop/bin/../logs/hadoop-ytaras-tasktracker-bueno.out
> 10/01/27 18:23:55 INFO mapred.FileInputFormat: Total input paths to process : 13
> 10/01/27 18:23:56 INFO mapred.JobClient: Running job: job_201001271823_0001
> 10/01/27 18:23:57 INFO mapred.JobClient:  map 0% reduce 0%
> 10/01/27 18:24:09 INFO mapred.JobClient: Task Id : attempt_201001271823_0001_m_000014_0, Status : FAILED
> java.io.FileNotFoundException: File
> D:/hdfs/hadoop/mapred/local/taskTracker/jobcache/job_201001271823_0001/attempt_201001271823_0001_m_000014_0/work/tmp
> does not exist.
>        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
>        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
>        at org.apache.hadoop.mapred.TaskRunner.setupWorkDir(TaskRunner.java:519)
>        at org.apache.hadoop.mapred.Child.main(Child.java:155)
>
> !! SKIP - the above exception repeats a few more times !!
>
> 10/01/27 18:24:51 INFO mapred.JobClient: Job complete: job_201001271823_0001
> 10/01/27 18:24:51 INFO mapred.JobClient: Counters: 0
> java.io.IOException: Job failed!
>        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1252)
>        at org.apache.hadoop.examples.Grep.run(Grep.java:69)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>        at org.apache.hadoop.examples.Grep.main(Grep.java:93)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>
>
>
> Am I doing something wrong (don't say just 'use Linux' :-) )?
> Thanks
>
