Hey,
Nah, just Cygwin and compiling from the command line is asking for trouble.
Eclipse is better at handling classpaths : )
I seem to have managed to get things working with this guide:
http://ebiquity.umbc.edu/Tutorials/Hadoop/00%20-%20Intro.html
With this I managed to set up a cluster, with one difference: in the
final part, when creating the map/reduce driver, I replaced the code with the
current example map/reduce code, replacing the arguments with strings and
creating appropriate dfs locations on localhost. So on Windows I'd recommend
that.
I do have some new questions though:
1) There doesn't seem to be an Eclipse plugin in Hadoop 0.20.1? Is it no longer
supported, or is something else going on? Has anyone tested with the old plugin?
2) I'm currently using 0.19.2 since it has the Eclipse plugin. Can code written
against 0.19.2 run on 0.20.1 nodes, or does it need refactoring?
- Mikko
----- Original Message -----
From: "Todd Lipcon" <t...@cloudera.com>
To: "Mikko Lahti" <mikko.la...@pp1.inet.fi>
Cc: <common-user@hadoop.apache.org>
Sent: Thursday, December 03, 2009 5:33 PM
Subject: Re: Trouble with tutorial
It looks like org.myorg.WordCount isn't compiled into that jar.
I'm surmising that you might be pretty new to Java. If so, you may want to
start with Hadoop Streaming, which lets you write jobs in various scripting
languages.
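For context on how Streaming works: a Streaming mapper just reads input lines on stdin and writes tab-separated key/value pairs on stdout, so any executable works, not only scripts. A minimal word-count mapper sketch (the class name is illustrative, not from the thread):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class StreamingWordCountMapper {

    // Split one input line into "word<TAB>1" records, the format
    // Hadoop Streaming expects a mapper to emit on stdout.
    static List<String> mapLine(String line) {
        List<String> out = new ArrayList<String>();
        for (String word : line.trim().split("\\s+")) {
            if (!word.isEmpty()) {
                out.add(word + "\t1");
            }
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        String line;
        while ((line = in.readLine()) != null) {
            for (String record : mapLine(line)) {
                System.out.println(record);
            }
        }
    }
}
```

The job is then launched through the streaming jar (under contrib/streaming in the 0.19/0.20 releases) with -input/-output/-mapper/-reducer options.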
Thanks
-Todd
On Thu, Dec 3, 2009 at 4:31 AM, Mikko Lahti <mikko.la...@pp1.inet.fi>
wrote:
Hey,
This was indeed the case (javac was the wrong version), but now I've run into
another problem.
When running the program with the line:
bin/hadoop jar usr/joe/wordcount.jar org.myorg.WordCount
usr/joe/wordcount/input usr/joe/wordcount/output
the result is:
Exception in thread "main" java.lang.ClassNotFoundException:
org.myorg.WordCount
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:303)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:316)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
I'm guessing this is a classpath issue, so I tried adding the directory
containing the .jar file to the system CLASSPATH environment variable, but this
didn't help. I also heard that there should be a temporary classes directory
under the Hadoop home; I created it and added it to the classpath, but that
didn't work either.
Anyone seen something similar / possible fixes?
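One thing worth checking first: `hadoop jar` loads the main class from inside the jar itself, not from the CLASSPATH environment variable, so the question is whether org/myorg/WordCount.class actually ended up in wordcount.jar. A small self-contained sketch of that check (the demo builds a throwaway jar so it can run anywhere; in practice you would point jarContainsClass at usr/joe/wordcount.jar):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarContentsCheck {

    // True if the jar contains the given class, e.g. "org.myorg.WordCount"
    // maps to the entry "org/myorg/WordCount.class".
    static boolean jarContainsClass(File jar, String className) throws Exception {
        String entryName = className.replace('.', '/') + ".class";
        try (JarFile jf = new JarFile(jar)) {
            return jf.getJarEntry(entryName) != null;
        }
    }

    public static void main(String[] args) throws Exception {
        // Build a tiny jar with a single (empty) entry just for the demo.
        File jar = File.createTempFile("wordcount", ".jar");
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar))) {
            out.putNextEntry(new JarEntry("org/myorg/WordCount.class"));
            out.closeEntry();
        }
        System.out.println(jarContainsClass(jar, "org.myorg.WordCount")); // true
        System.out.println(jarContainsClass(jar, "org.myorg.Missing"));   // false
    }
}
```

The standalone equivalent is listing the jar's entries with `jar tf wordcount.jar` and looking for the class file.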
Thanks again
- Mikko
----- Original Message -----
*From:* Todd Lipcon <t...@cloudera.com>
*To:* common-user@hadoop.apache.org ; Mikko
Lahti <mikko.la...@pp1.inet.fi>
*Sent:* Wednesday, December 02, 2009 9:03 PM
*Subject:* Re: Trouble with tutorial
Make sure you're using Java 6. Hadoop is compiled with Java 6, and it smells
like you're using an earlier JDK.
-Todd
On Wed, Dec 2, 2009 at 11:01 AM, Mikko Lahti
<mikko.la...@pp1.inet.fi> wrote:
Hi,
I'm trying to run the map/reduce tutorials on Windows XP with Cygwin.
In the 2nd step: javac -classpath
${HADOOP_HOME}/hadoop-${HADOOP_VERSION}-core.jar -d wordcount_classes
WordCount.java , I get the following error:
WordCount.java:6: cannot access org.apache.hadoop.fs.Path
bad class file: hadoop-0.19.2-core.jar(org/apache/hadoop/fs/Path.class)
class file has wrong version 50.0, should be 49.0
Please remove or make sure it appears in the correct subdirectory of the
classpath.
import org.apache.hadoop.fs.Path;
^
1 error
I'm using the command:
javac -classpath hadoop-0.19.2-core.jar -d wordcount_classes
WordCount.java
This is run from the root directory of Hadoop.
I also ran the command with -verbose, which gives:
[parsing started WordCount.java]
[parsing completed 47ms]
[search path for source files: [hadoop-0.19.2-core.jar]]
[search path for class files: [c:\Program
Files\Java\jdk1.5.0_09\jre\lib\rt.jar, c:\Program
Files\Java\jdk1.5.0_09\jre\lib\jsse.jar, c:\Program
Files\Java\jdk1.5.0_09\jre\lib\jce.jar, c:\Program
Files\Java\jdk1.5.0_09\jre\lib\charsets.jar, c:\Program
Files\Java\jdk1.5.0_09\jre\lib\ext\dnsns.jar, c:\Program
Files\Java\jdk1.5.0_09\jre\lib\ext\jmf.jar, c:\Program
Files\Java\jdk1.5.0_09\jre\lib\ext\localedata.jar, c:\Program
Files\Java\jdk1.5.0_09\jre\lib\ext\sound.jar, c:\Program
Files\Java\jdk1.5.0_09\jre\lib\ext\sunjce_provider.jar, c:\Program
Files\Java\jdk1.5.0_09\jre\lib\ext\sunpkcs11.jar,
hadoop-0.19.2-core.jar]]
[loading c:\Program
Files\Java\jdk1.5.0_09\jre\lib\rt.jar(java/io/IOException.class)]
[loading hadoop-0.19.2-core.jar(org/apache/hadoop/fs/Path.class)]
WordCount.java:6: cannot access org.apache.hadoop.fs.Path
bad class file: hadoop-0.19.2-core.jar(org/apache/hadoop/fs/Path.class)
class file has wrong version 50.0, should be 49.0
Please remove or make sure it appears in the correct subdirectory of the
classpath.
import org.apache.hadoop.fs.Path;
^
[total 453ms]
1 error
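For reference, the "class file has wrong version 50.0, should be 49.0" message refers to the major version stamped in the class file header: 49 is Java 5, 50 is Java 6, so a Java 5 javac is being asked to read Java 6 bytecode. The field is easy to inspect directly; a minimal self-contained sketch (it reads its own class file, but you could point the stream at org/apache/hadoop/fs/Path.class extracted from the jar):

```java
import java.io.DataInputStream;
import java.io.InputStream;

public class ClassVersionCheck {

    // A class file starts with 4 magic bytes (0xCAFEBABE), then
    // 2 bytes minor version and 2 bytes major version.
    // Major 49 = Java 5, 50 = Java 6.
    static int majorVersion(InputStream classFile) throws Exception {
        DataInputStream data = new DataInputStream(classFile);
        if (data.readInt() != 0xCAFEBABE) {
            throw new IllegalArgumentException("not a class file");
        }
        data.readUnsignedShort();        // minor version, unused here
        return data.readUnsignedShort(); // major version
    }

    public static void main(String[] args) throws Exception {
        // Inspect this class's own bytecode from the classpath.
        try (InputStream in = ClassVersionCheck.class
                .getResourceAsStream("ClassVersionCheck.class")) {
            System.out.println("major version: " + majorVersion(in));
        }
    }
}
```

Running `javap -verbose` on a class prints the same "major version" field without any code.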
Running the latest stable 0.20.1 makes no difference.
What could be wrong?
- Mikko