Hi Menno and Aseem. Thank you for your help! I can now connect to each
node over ssh without providing the username.

However, another problem has occurred. The directory structures differ
among the servers, so when I use the start script "start-all.sh" to
start Hadoop, it seems to run bash on every slave using exactly the
namenode's directory structure.

For example, the namenode (on server0) uses "/home/user0/hadoop/...",
slave1 (on server1) uses "/home/s/user1/hadoop/...", and slave2 (on
server2) uses "/home/u/user2/proj/hadoop/...". When I run start-all.sh,
this message is shown:

  starting namenode, logging to /home/user0/hadoop/bin/...
  server1: bash: line 0: cd: /home/user0/hadoop/bin/..: No such file or directory
  server1: bash: /home/user0/hadoop/bin/hadoop-daemon.sh: No such file or directory
  server2: bash: line 0: cd: /home/user0/hadoop/bin/..: No such file or directory
  server2: bash: /home/user0/hadoop/bin/hadoop-daemon.sh: No such file or directory
  ......
It seems all the nodes must share the same directory structure to run
Hadoop. But I have no admin privilege on these servers, so I cannot
create the same directories as on the namenode. Is there any way for
nodes with different structures to run it? How can I configure it?
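The only workaround I can think of so far is to skip start-all.sh and
start each slave daemon by hand from that node's own install directory.
This is only a sketch with guessed node paths, and I have not verified
it:

```shell
#!/bin/sh
# Sketch: start each slave daemon from that node's own hadoop directory,
# instead of letting start-all.sh assume the namenode's layout.
# The node list below is hypothetical; adjust user, host, and path per node.
NODES="user1@server1:/home/s/user1/hadoop user2@server2:/home/u/user2/proj/hadoop"

for node in $NODES; do
  userhost=${node%%:*}   # e.g. user1@server1
  dir=${node#*:}         # that node's own hadoop directory
  # 'echo' makes this a dry run; remove it to actually start the datanodes.
  echo ssh "$userhost" "cd $dir && bin/hadoop-daemon.sh start datanode"
done
```

But doing this on every restart would be tedious, so I would still
prefer a proper configuration if one exists.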

Again, thank you all for your kind help!!!

Starry

/* Tomorrow is another day. So is today. */



On Mon, May 4, 2009 at 22:09, Puri, Aseem <aseem.p...@honeywell.com> wrote:
> Starry,
>
>        In the ".ssh" directory you have to create a file named "config"
> (without an extension) on every node.
>
> Suppose server1 is your master and server2 and server3 are your slaves.
>
> On the master (server1), open the "config" file and add the following
> lines:
>
> Host server2
> User user2
> Host server3
> User user3
>
> On both slave nodes (server2 and server3), open the "config" file and
> add the following lines:
>
> Host server1
> User user1
>
> I hope this works for you.
>
> Regards
> Aseem Puri
>
>
> -----Original Message-----
> From: Menno Luiten [mailto:mlui...@artifix.net]
> Sent: Monday, May 04, 2009 7:27 PM
> To: core-user@hadoop.apache.org
> Subject: RE: How to configure nodes with different user account?
>
> Hi Starry,
>
> What is the content of your 'slaves' file in the hadoop/conf directory
> of your master node?
> It should say something like:
>
> localhost
> us...@server2
> us...@server3
> us...@server4
>
> This should let the start-up scripts try and login using the proper
> users.
>
> Hope that helps,
> Menno
>
> -----Original Message-----
> From: Starry SHI [mailto:starr...@gmail.com]
> Sent: Monday, May 4, 2009 10:53
> To: core-user@hadoop.apache.org
> Subject: How to configure nodes with different user account?
>
> Hi, all. I am new to Hadoop and I have a question to ask.
>
> I have several accounts located on different Linux servers (normal
> user privilege, no admin authority), and I want to use them to form a
> small cluster to run Hadoop applications. However, the usernames for
> these accounts are different. I want to use a shared key to connect all
> the nodes, but I failed after several attempts. Is it possible to
> connect all of them via different accounts?
>
> For example, I have 3 accounts: us...@server1, us...@server2, and
> us...@server3. After assigning authorized keys, I can run "ssh
> us...@server2" without entering the password. But when I start Hadoop,
> I am asked to enter the password for us...@server2 (even though I have
> already logged in as user1).
>
> Can my problem be solved easily? I hope to hear from you soon.
>
> Thank you for all your attention and help!
>
> Best regards,
> Starry
>
>
