I solved how to set different data directories when the same NFS $HOME
directory is shared by all nodes in the cluster.
The HADOOP_OPTS value set in conf/hadoop-env.sh is used to set custom
properties. All the values assigned to this variable are passed to the JVM,
just like the way we set java.net.preferIPv4Stack.
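A minimal sketch of how this could look in conf/hadoop-env.sh, assuming a 0.20-style install; the -Dhostname option is only an illustration of passing a per-node value, not something quoted from the thread:

# conf/hadoop-env.sh
# Every -D option added here becomes a JVM system property for the Hadoop
# daemons, so ${...} substitution in the *-site.xml files can resolve it.
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dhostname=`hostname`"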
Dear Jeff,
OK, I solved this!
You are right that these three computers share the same $HOME over NFS.
My original idea was to put Hadoop and all of its data inside $HOME, but that
failed.
So now I set hadoop.tmp.dir and the log directories to each node's own /tmp
directory.
Then it's done!
Thanks!
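Concretely, this kind of per-node layout could look like the sketch below; the value shown is only an illustration (it happens to match the 0.20-era default), and on that version hadoop.tmp.dir lives in conf/core-site.xml:

<!-- core-site.xml: keep working data on each node's local /tmp
     instead of the NFS-shared $HOME -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/tmp/hadoop-${user.name}</value>
</property>

The log location can be moved off the shared $HOME in a similar way through HADOOP_LOG_DIR in conf/hadoop-env.sh.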
Dear Jeff,
Yes, I think so.
I didn't ask them for details, but it seems they are sharing an NFS mount.
Regards
welman Lu
Does your cluster share the same NFS?
Dear Jeff,
First, thank you very much for your selfless help!
These three computers just seem to be sharing the same disk area. I don't know
whether they keep redundant copies of the data or do it some other way.
Anyway, let me use an example to explain. When I create any file, e.g. foo,
on one of them, I can see the same file on the other two.
Sorry, but I still don't quite understand why you said "When HDFS starts, only
one datanode can lock the directory, and the other two fail." As I understand
it, the three computers are independent, so the failure of two datanodes
should not have anything to do with a lock; you need to look
Hi Jeff,
From my viewpoint, I can't see the disks of these three computers. All I can
see is a single $HOME directory.
Whichever computer I log into, I see the same contents inside this $HOME
directory.
I borrowed these three computers from a big cluster, and I can only use SSH to
control them remotely.
Hi Lu,
All the variables are in System.getProperties().
And what do you mean by "all three computers will set their data directory to
the same one"? The three computers are independent, so why would they share
the same directory?
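As a side note, a small stand-alone sketch (not from the thread) of why ${user.name} resolves in *-site.xml while ${HOSTNAME} does not: Hadoop's ${...} substitution looks at configuration properties and JVM system properties, and HOSTNAME is only a shell environment variable:

import java.net.InetAddress;

public class PropsVsEnv {
    public static void main(String[] args) throws Exception {
        // ${user.name} works because it is a JVM system property:
        System.out.println(System.getProperty("user.name"));
        // HOSTNAME is an environment variable, not a system property,
        // so Hadoop's substitution cannot see it:
        System.out.println(System.getProperty("HOSTNAME")); // usually null
        System.out.println(System.getenv("HOSTNAME"));
        // The Java API can still resolve the host name directly:
        System.out.println(InetAddress.getLocalHost().getHostName());
    }
}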
Hi, Jeff,
I think I misunderstood what you said.
What you mean is that I can set ${hostname} in *-site.xml, and then use the
code you mentioned to get the hostname in my own program, right?
Sorry that I didn't make my question clear.
My problem now is that I am deploying Hadoop on three computers that share the
same NFS $HOME directory.
Hi Lu,
I assume you are implementing the Tool interface to run your MapReduce job.
Then put the code in the run method:
@Override
public int run(String[] args) throws Exception {
    JobConf conf = new JobConf();
    conf.set("hostname", InetAddress.getLocalHost().getHostName());
    // ... configure and submit the job as usual ...
    return 0;
}
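For context, here is a minimal self-contained sketch of that Tool pattern; the class name HostnameJob is made up for illustration, and only the "hostname" property comes from the thread:

import java.net.InetAddress;

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class HostnameJob extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        JobConf conf = new JobConf(getConf(), HostnameJob.class);
        // Make the local host name visible to the job as a configuration property.
        conf.set("hostname", InetAddress.getLocalHost().getHostName());
        // ... set input/output paths and mapper/reducer classes,
        //     then submit with JobClient.runJob(conf) ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new HostnameJob(), args));
    }
}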
Hi, Jeff.
Thank you very much for the reply.
Unfortunately, I don't know where I should put the code you mentioned.
Can you tell me more about that?
Thanks!
Regards
welman Lu
There's no such environment variable internally, but there's a workaround.
Get the host name using the Java API and put the value into the configuration,
like this:
Configuration conf=new Configuration();
conf.set("hostname",InetAddress.getLocalHost().getHostName());
then you can use ${hostname} in your configuration files.
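To illustrate that last step (the my.local.dir property and its value are hypothetical, not from the thread): once "hostname" is set on a Configuration, any ${hostname} reference in a value read through that same Configuration is expanded:

import java.net.InetAddress;
import org.apache.hadoop.conf.Configuration;

public class HostnameExpansion {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hostname", InetAddress.getLocalHost().getHostName());

        // Hypothetical property; think of a *-site.xml entry whose value is
        // "/tmp/${hostname}/work". It is set here to keep the sketch self-contained.
        conf.set("my.local.dir", "/tmp/${hostname}/work");

        // Configuration.get() expands ${...} from other properties and from
        // JVM system properties, so this prints something like /tmp/node01/work
        System.out.println(conf.get("my.local.dir"));
    }
}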
Hi, all
I saw that in *-site.xml we can use ${user.name} to get the name of the
current user.
If I want to get the environment variable $HOSTNAME, what should I do?
I tried ${HOSTNAME} and ${env.hostname}; neither of them works.
They just return the literal strings "${HOSTNAME}" and "${env.hostname}".