Thank you, Jason. I found the example. So, is there a way to share the same JVM
between different jobs?
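(For reference: as far as I know, classic MapReduce does not share one JVM across *different* jobs, but within a single job you can reuse task JVMs via the `mapred.job.reuse.jvm.num.tasks` property. A minimal sketch of the configuration entry, assuming a classic mapred-site.xml or per-job configuration; `-1` means unlimited reuse:)

```xml
<!-- Reuse task JVMs within a single job; -1 = no limit on tasks per JVM -->
<property>
  <name>mapred.job.reuse.jvm.num.tasks</name>
  <value>-1</value>
</property>
```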
From: jason hadoop jason.had...@gmail.com
To: core-user@hadoop.apache.org
Sent: Tuesday, June 16, 2009 7:22:16 PM
Subject: Re: Can I share data for several
Zhang,
You will need Cygwin. There is also a Hadoop virtual machine that you
can use.
Check this tutorial for more details:
http://public.yahoo.com/gogate/hadoop-tutorial/html/module3.html
zjffdu wrote:
I found that it only works on Linux, not Windows.
So is there any way I can run it
on my machine instead of
using the Hadoop virtual machine?
Iman.
John Livingstone wrote:
Iman-4,
I have encountered the same problem that you have: not being
able to access HDFS on my Hadoop VMware Linux server (using the Hadoop Yahoo
tutorial), and not seeing hadoop.job.ugi
in my Eclipse Map/Reduce location's
configuration.
Iman.
P.S. I sent this reply to the wrong thread before.
Erik Holstad wrote:
Thanks guys!
Running Linux and the remote cluster is also Linux.
I have the properties set up like that already on my remote cluster, but
I am not sure where to enter this information
in the list of parameters.
Thanks
Iman
Thank you so much, Norbert. It worked.
Iman
Norbert Burger wrote:
Are you running Eclipse on Windows? If so, be aware that you need to launch
Eclipse from within Cygwin in order to access HDFS. It seems that the
plugin uses whoami to get information about the active user. This thread has
some more info.
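(For anyone hitting the same whoami issue: one workaround is to set hadoop.job.ugi explicitly in the Eclipse plugin's Map/Reduce location settings, under the Advanced parameters tab. A hedged sketch; the user and group names below are placeholders, not values from this thread:)

```
hadoop.job.ugi=hadoopuser,supergroup
```

The value is a comma-separated list: the first entry is the user name, the remaining entries are group names.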