1) When running under Windows, include the Cygwin bin directory in your
Windows PATH environment variable.
2) Eclipse is not very good at submitting supporting jar files; in your
application launch configuration's program arguments, add
-libjars path/hadoop-<rel>-examples.jar (see the sketch below).
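
For (1), that is typically C:\cygwin\bin, but adjust for your install. Hadoop
works out the current user by shelling out to "whoami", which is why the
LoginException in your trace shows CreateProcess error=2 when that program
cannot be found on the PATH.

For (2), -libjars is one of the generic options, so it is only honored when
the job's arguments are fed through GenericOptionsParser; that is what the
"Use GenericOptionsParser for parsing the arguments. Applications should
implement Tool" warning in your trace is about. Here is a minimal sketch, not
your exact code, of the quoted WordCount reworked around Tool/ToolRunner with
the old mapred API; the Map and Reduce inner classes are just the standard
word-count stand-ins for your own classes:

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.util.*;

public class WordCount extends Configured implements Tool {

    // Stand-in mapper: emits (word, 1) for every token in the line.
    public static class Map extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        public void map(LongWritable key, Text value,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                output.collect(word, one);
            }
        }
    }

    // Stand-in reducer/combiner: sums the counts for each word.
    public static class Reduce extends MapReduceBase
            implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    public int run(String[] args) throws Exception {
        // getConf() already reflects anything GenericOptionsParser consumed,
        // including jars passed in with -libjars.
        JobConf conf = new JobConf(getConf(), WordCount.class);
        conf.setJobName("wordcount");

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        conf.setMapperClass(Map.class);
        conf.setCombinerClass(Reduce.class);
        conf.setReducerClass(Reduce.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);

        // Same hard-coded paths as the quoted code below.
        FileInputFormat.setInputPaths(conf, new Path("input"));
        FileOutputFormat.setOutputPath(conf, new Path("output"));

        JobClient.runJob(conf);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner runs GenericOptionsParser over args before calling run(),
        // so generic options such as -libjars are handled here.
        System.exit(ToolRunner.run(new WordCount(), args));
    }
}

With that in place the Eclipse program arguments reduce to just
-libjars path/hadoop-<rel>-examples.jar, since the input and output paths are
hard-coded in the job.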

On Fri, May 8, 2009 at 10:13 AM, georgep <p09...@gmail.com> wrote:

>
> When run as a java application, the trace:
>
> 09/05/08 10:08:49 WARN fs.FileSystem: uri=file:///
> javax.security.auth.login.LoginException: Login failed: Cannot run program
> "whoami": CreateProcess error=2, [Big5 for "The system cannot find the file specified"]
>        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
>        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
>        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:257)
>        at org.apache.hadoop.security.UserGroupInformation.login(UserGroupInformation.java:67)
>        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1410)
>        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1348)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:213)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:118)
>        at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:354)
>        at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:377)
>        at mapreduce.WordCount.main(WordCount.java:44)
> 09/05/08 10:08:49 WARN fs.FileSystem: uri=file:///
> javax.security.auth.login.LoginException: Login failed: Cannot run program
> "whoami": CreateProcess error=2, [Big5 for "The system cannot find the file specified"]
>        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
>        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
>        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:257)
>        at org.apache.hadoop.security.UserGroupInformation.login(UserGroupInformation.java:67)
>        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1410)
>        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1348)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:213)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:118)
>        at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:311)
>        at org.apache.hadoop.mapred.JobClient.init(JobClient.java:390)
>        at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:361)
>        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1011)
>        at mapreduce.WordCount.main(WordCount.java:46)
> 09/05/08 10:08:49 INFO jvm.JvmMetrics: Initializing JVM Metrics with
> processName=JobTracker, sessionId=
> Problem
> 09/05/08 10:08:50 WARN mapred.JobClient: Use GenericOptionsParser for
> parsing the arguments. Applications should implement Tool for the same.
> java.io.IOException: Failed to get the current user's information.
>        at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:559)
>        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:729)
>        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1026)
>        at mapreduce.WordCount.main(WordCount.java:46)
> Caused by: javax.security.auth.login.LoginException: Login failed: Cannot run program
> "whoami": CreateProcess error=2, [Big5 for "The system cannot find the file specified"]
>        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
>        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
>        at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:557)
>        ... 3 more
>
> When run on hadoop server:
>
> Trace:
>
> Exception in thread "main" java.lang.ClassNotFoundException:
> mapreduce.test.WordCount
>        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
>        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
>        at java.lang.Class.forName0(Native Method)
>        at java.lang.Class.forName(Class.java:247)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
>
> Thanks!:-):-)
>
>
>
> georgep wrote:
> >
> > Trace:
> >
> > Exception in thread "main" java.lang.ClassNotFoundException:
> > mapreduce.test.WordCount
> >       at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> >       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >       at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> >       at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> >       at java.lang.Class.forName0(Native Method)
> >       at java.lang.Class.forName(Class.java:247)
> >       at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
> >
> > Code:
> > public class WordCount {
> >
> >     public static void main(String[] args) throws Exception {
> >         try {
> >             JobConf conf = new JobConf(WordCount.class);
> >             conf.setJobName("wordcount");
> >
> >             conf.setOutputKeyClass(Text.class);
> >             conf.setOutputValueClass(IntWritable.class);
> >
> >             conf.setMapperClass(Map.class);
> >             conf.setCombinerClass(Reduce.class);
> >             conf.setReducerClass(Reduce.class);
> >
> >             conf.setInputFormat(TextInputFormat.class);
> >             conf.setOutputFormat(TextOutputFormat.class);
> >
> >             FileInputFormat.setInputPaths(conf, new Path("input"));
> >             FileOutputFormat.setOutputPath(conf, new Path("output"));
> >
> >             JobClient.runJob(conf);
> >         }
> >         catch (Exception t) {
> >             // TODO Auto-generated catch block
> >             t.printStackTrace();
> >             System.out.println("Problem");
> >         }
> >     }
> > }
> >
> >
> > Thank you!
> >
> >
> >
> > TimRobertson100 wrote:
> >>
> >> Can you post the entire error trace please?
> >>
> >> On Fri, May 8, 2009 at 9:40 AM, George Pang <p09...@gmail.com> wrote:
> >>> Dear  users,
> >>> I got a "ClassNotFoundException" when running the WordCount example on
> >>> hadoop using Eclipse.  Does anyone know where the problem is?
> >>>
> >>> Thank you!
> >>>
> >>> George
> >>>
> >>
> >>
> >
> >
>
> --
> View this message in context:
> http://www.nabble.com/ClassNotFoundException-tp23441528p23449910.html
> Sent from the Hadoop core-user mailing list archive at Nabble.com.
>
>
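
One more note on the server-side run: RunJar is reporting that it cannot find
mapreduce.test.WordCount in the jar you gave it, yet the trace from the
Eclipse run shows the class as mapreduce.WordCount. Double-check that the
fully qualified class name you pass to "bin/hadoop jar" matches the class
actually packaged in the jar, e.g. something like
bin/hadoop jar yourjob.jar mapreduce.WordCount
(the jar name here is only a placeholder).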


-- 
Alpha Chapters of my book on Hadoop are available
http://www.apress.com/book/view/9781430219422
www.prohadoopbook.com a community for Hadoop Professionals
