Re: Map/Reduce Job done locally?
Philipp, I have no problem running jobs locally with Eclipse (via the Hadoop plugin) and observing them from the browser. (Please note that the JobTracker page doesn't refresh automatically; you need to refresh it manually.) Cheers, Rasit

2009/2/19 Philipp Dobrigkeit pdobrigk...@gmx.de: When I start my job from Eclipse it gets processed and the output is generated, but it never shows up in my JobTracker, which is open in my browser. Why is this happening?

-- M. Raşit ÖZDAŞ
Re: Map/Reduce Job done locally?
Hey Philipp! MR jobs are run locally if you just run the Java file; to get them running in distributed mode you need to create a job jar and run it like ./bin/hadoop jar ... Regards, Erik
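[Background note, not part of the original thread: which mode a job runs in is decided by the client's mapred.job.tracker setting. When it is "local" (the default on an unconfigured classpath, as when launching the main class straight from Eclipse), Hadoop uses the in-process local job runner and the job never appears on the JobTracker web page. A hedged sketch of the relevant hadoop-site.xml entry; host and port are placeholders, not values from this thread:]

```xml
<!-- hadoop-site.xml on the submitting machine.
     "local" (the default) runs jobs in-process;
     pointing at a real JobTracker makes them show up in its UI. -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```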
Re: Map/Reduce Job done locally?
Hi Erik, thank you, this list is really quick and helpful. I will try that. Is it enough to just create a .jar file containing my class? And how would I start such a job from another program when I want to track, for example, execution time? Do I run some kind of System.exec(path/hadoop jar MyClass Param)? Best, Philipp

Original message: Thu, 19 Feb 2009 13:42:41 -0800, from Erik Holstad erikhols...@gmail.com, to core-user@hadoop.apache.org, subject: Re: Map/Reduce Job done locally?
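[A minimal sketch of the launch-and-time idea Philipp asks about, using java.lang.ProcessBuilder; the class and method names are mine, and the hadoop command shown in the comment is a placeholder, not a path from this thread:]

```java
import java.io.IOException;

// Hypothetical helper: launches an external command the way the thread's
// "System.exec(path/hadoop jar ...)" idea would, waits for it to finish,
// and reports the wall-clock time it took.
public class TimedJobLauncher {

    // Runs the command, waits for completion, returns elapsed milliseconds.
    public static long timeCommand(String... cmd)
            throws IOException, InterruptedException {
        long start = System.nanoTime();
        Process p = new ProcessBuilder(cmd)
                .inheritIO() // stream the job's stdout/stderr to ours
                .start();
        int exit = p.waitFor();
        long elapsedMs = (System.nanoTime() - start) / 1_000_000L;
        if (exit != 0) {
            throw new IOException("command exited with status " + exit);
        }
        return elapsedMs;
    }

    public static void main(String[] args) throws Exception {
        // In practice the command would look something like:
        //   {"./bin/hadoop", "jar", "myjob.jar", "MyClass", "in", "out"}
        // (all names above are placeholders). "echo" stands in here so the
        // sketch runs without a Hadoop installation.
        long ms = timeCommand("echo", "job finished");
        System.out.println("elapsed: " + ms + " ms");
    }
}
```

Measuring around the process this way times the whole submission, not just the map/reduce phases; for per-phase numbers the JobTracker page is the better source.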
Re: Map/Reduce Job done locally?
Hey Philipp! Not sure about your time-tracking approach; it probably works. I've just used a bash script to start the jar, and then you can do the timing in the script. As for how to compile the jars: you need to include the dependencies too, but you will see what you are missing when you run the job. Regards, Erik