I have an IntelliJ project file that I could share, if you like.
For my own applications, I use IntelliJ IDEA to read the code (because it understands it well) and to run test cases, but I just use the ant targets to build hadoop.jar. That isn't nearly as responsive as building in IDEA, but it definitely suffices.

On 7/23/07 6:50 AM, "Jeroen Verhagen" <[EMAIL PROTECTED]> wrote:

> Hi all,
>
> I'm trying to get Hadoop 0.13 running from IntelliJ so I can test my
> own Map and Reduce classes on the local job runner. To do that I'm
> first trying to get the WordCount sample to work, and I'm already
> running into trouble:
>
> I copied the WordCount class into a package of my own project and
> created a directory 'in' containing the file whose words I want to
> count. I copied the 'conf' dir from the hadoop dir that I know
> contains a configuration that allows me to run the WordCount sample.
>
> This is the output from my console:
>
> ******************************
> C:\java\jdk1.5.0\bin\java
> -Xmx1000m
> -Dhadoop.log.dir=logs
> -Dhadoop.log.file=hadoop.log
> -Dhadoop.home.dir=.
> -Dhadoop.id.str=
> -Dhadoop.root.logger=DEBUG,console
> -Didea.launcher.port=7534
> "-Didea.launcher.bin.path=C:\java\IntelliJ IDEA 5.1\bin"
> -Dfile.encoding=windows-1252
> -classpath "
> C:\java\jdk1.5.0\jre\lib\charsets.jar;
> C:\java\jdk1.5.0\jre\lib\deploy.jar;
> C:\java\jdk1.5.0\jre\lib\javaws.jar;
> C:\java\jdk1.5.0\jre\lib\jce.jar;
> C:\java\jdk1.5.0\jre\lib\jsse.jar;
> C:\java\jdk1.5.0\jre\lib\plugin.jar;
> C:\java\jdk1.5.0\jre\lib\rt.jar;
> C:\java\jdk1.5.0\jre\lib\ext\dnsns.jar;
> C:\java\jdk1.5.0\jre\lib\ext\localedata.jar;
> C:\java\jdk1.5.0\jre\lib\ext\sunjce_provider.jar;
> C:\java\jdk1.5.0\jre\lib\ext\sunpkcs11.jar;
> C:\mycomp\myproject\core\target\test-classes;
> C:\mycomp\myproject\core\target\classes;
> C:\mycomp\myproject\core\lib\hadoop-0.13.0-core.jar;
> C:\java\IntelliJ IDEA 5.1\lib\idea_rt.jar"
> com.intellij.rt.execution.application.AppMain nl.mycomp.TestHadoop in target/out
>
> Exception in thread "main" java.io.IOException: Job failed!
> at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604)
> at nl.mycomp.TestHadoop.main(TestHadoop.java:106)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:585)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:90)
>
> Process finished with exit code 1
> ******************************
>
> And this is the content of the log4j log file:
>
> ******************************
> DEBUG 15:36:11,813 main org.apache.hadoop.conf.Configuration - java.io.IOException: config()
> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:93)
> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:66)
> at nl.mycomp.TestHadoop.main(TestHadoop.java:61)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:585)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:90)
>
> DEBUG 15:36:12,313 main org.apache.hadoop.mapred.JobClient - default FileSystem: file:///
> INFO 15:36:12,344 main org.apache.hadoop.mapred.FileInputFormat - Total input paths to process : 1
> DEBUG 15:36:12,344 main org.apache.hadoop.mapred.JobClient - Creating splits at file:/tmp/hadoop-jeroen/mapred/system/submit_em16mi/job.split
> DEBUG 15:36:12,360 main org.apache.hadoop.mapred.FileInputFormat - Total # of splits: 2
> DEBUG 15:36:12,579 main org.apache.hadoop.conf.Configuration - java.io.IOException: config(config)
> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:103)
> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:77)
> at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:83)
> at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:260)
> at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:376)
> at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:543)
> at nl.mycomp.TestHadoop.main(TestHadoop.java:106)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:585)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:90)
>
> DEBUG 15:36:12,657 main org.apache.hadoop.conf.Configuration - java.io.IOException: config()
> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:93)
> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:107)
> at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:87)
> at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:260)
> at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:376)
> at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:543)
> at nl.mycomp.TestHadoop.main(TestHadoop.java:106)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:585)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:90)
>
> INFO 15:36:12,719 main org.apache.hadoop.mapred.JobClient - Running job: job_8wasq7
> DEBUG 15:36:12,719 Thread-0 org.apache.hadoop.mapred.FileInputFormat - Total # of splits: 1
> DEBUG 15:36:12,735 Thread-0 org.apache.hadoop.conf.Configuration - java.io.IOException: config(config)
> at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:103)
> at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:77)
> at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:126)
>
> DEBUG 15:36:12,782 Thread-0 org.apache.hadoop.mapred.MapTask - Writing local split to /tmp/hadoop-jeroen/mapred/system/submit_em16mi/split.dta
> INFO 15:36:12,782 Thread-0 org.apache.hadoop.mapred.MapTask - numReduceTasks: 1
> INFO 15:36:13,719 main org.apache.hadoop.mapred.JobClient - map 0% reduce 0%
> INFO 15:36:13,751 Thread-0 org.apache.hadoop.mapred.LocalJobRunner - file:/C:/mycomp/myproject/core/in/kjv12.txt:0+4834757
> INFO 15:36:14,735 main org.apache.hadoop.mapred.JobClient - map 9% reduce 0%
> INFO 15:36:14,766 Thread-0 org.apache.hadoop.mapred.LocalJobRunner - file:/C:/mycomp/myproject/core/in/kjv12.txt:0+4834757
> INFO 15:36:15,735 main org.apache.hadoop.mapred.JobClient - map 23% reduce 0%
> INFO 15:36:15,969 Thread-0 org.apache.hadoop.mapred.LocalJobRunner - file:/C:/mycomp/myproject/core/in/kjv12.txt:0+4834757
> INFO 15:36:16,735 main org.apache.hadoop.mapred.JobClient - map 36% reduce 0%
> INFO 15:36:17,063 Thread-0 org.apache.hadoop.mapred.LocalJobRunner - file:/C:/mycomp/myproject/core/in/kjv12.txt:0+4834757
> INFO 15:36:17,735 main org.apache.hadoop.mapred.JobClient - map 54% reduce 0%
> INFO 15:36:18,079 Thread-0 org.apache.hadoop.mapred.LocalJobRunner - file:/C:/mycomp/myproject/core/in/kjv12.txt:0+4834757
> INFO 15:36:18,798 main org.apache.hadoop.mapred.JobClient - map 73% reduce 0%
> INFO 15:36:19,094 Thread-0 org.apache.hadoop.mapred.LocalJobRunner - file:/C:/mycomp/myproject/core/in/kjv12.txt:0+4834757
> DEBUG 15:36:19,751 Sort progress reporter for task map_0000 org.apache.hadoop.mapred.MapTask - Started thread: Sort progress reporter for task map_0000
> WARN 15:36:19,766 Thread-0 org.apache.hadoop.mapred.LocalJobRunner - job_8wasq7
> java.io.IOException: CreateProcess: df -k C:\tmp\hadoop-jeroen\mapred\local error=2
> at java.lang.ProcessImpl.create(Native Method)
> at java.lang.ProcessImpl.<init>(ProcessImpl.java:81)
> at java.lang.ProcessImpl.start(ProcessImpl.java:30)
> at java.lang.ProcessBuilder.start(ProcessBuilder.java:451)
> at java.lang.Runtime.exec(Runtime.java:591)
> at java.lang.Runtime.exec(Runtime.java:464)
> at org.apache.hadoop.fs.DF.doDF(DF.java:60)
> at org.apache.hadoop.fs.DF.<init>(DF.java:53)
> at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:181)
> at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:218)
> at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:124)
> at org.apache.hadoop.mapred.MapOutputFile.getSpillFileForWrite(MapOutputFile.java:88)
> at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpillToDisk(MapTask.java:405)
> at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:620)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:187)
> at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:131)
> ******************************
>
> Could anyone please help me and tell me what's causing this problem?
>
> On a sidenote: Why is hadoop creating tmp files in C:\tmp? I did not
> configure C:\tmp to be my tempdir, that's
> C:\DOCUME~1\jeroen\LOCALS~1\Temp which is set through an environment
> variable.
>
> Thanks and regards,
>
> Jeroen
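For what it's worth, the `CreateProcess: df -k ... error=2` in the log above comes from Hadoop's `org.apache.hadoop.fs.DF` class (visible in the stack trace), which shells out to the Unix `df -k` utility to measure free space on the local dirs; on a plain Windows box there is no `df` binary, so `Runtime.exec` fails with error=2 (file not found) before the map output can spill. Putting the Cygwin `bin` directory on the PATH is the usual way to make `df` available. Here is a minimal sketch, not Hadoop's actual code, that reproduces the same failure mode (the class and method names are mine):

```java
import java.io.IOException;

public class DfProbe {

    /**
     * Mimics what Hadoop's DF helper does: launch `df -k <path>` as a
     * child process. Returns the process exit code, or -1 if the binary
     * could not be launched at all -- the Windows-without-Cygwin case,
     * where CreateProcess fails with error=2.
     */
    static int runDf(String dfBinary, String path) {
        try {
            Process p = Runtime.getRuntime().exec(new String[] {dfBinary, "-k", path});
            return p.waitFor();
        } catch (IOException e) {
            // Same IOException that surfaces in the job log above.
            System.err.println("exec failed: " + e.getMessage());
            return -1;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return -1;
        }
    }

    public static void main(String[] args) {
        // On a Unix box (or Windows with Cygwin on the PATH) this succeeds;
        // without a `df` on the PATH it prints "exec failed: ..." instead.
        System.out.println("df exit status: " + runDf("df", "."));
    }
}
```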
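On the C:\tmp question: Hadoop's default for `hadoop.tmp.dir` (in hadoop-default.xml) is `/tmp/hadoop-${user.name}`, and Windows resolves the drive-less path `/tmp` against the current drive, so files land in C:\tmp regardless of the %TEMP% environment variable. A sketch of an override in hadoop-site.xml, where the `C:/hadoop-tmp` value is just an example path:

```xml
<!-- hadoop-site.xml: override the default /tmp/hadoop-${user.name},
     which Windows resolves against the current drive as C:\tmp -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>C:/hadoop-tmp</value>
  </property>
</configuration>
```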