hey
I am working with the EC2 environment.
I registered and am being billed for EC2 and S3.
Right now I have two Cygwin windows open:
1. One running as administrator (the "server", with sshd running). It has a
separate folder for the Hadoop files, and from it I am able to run bin/hadoop.
2. One running as a normal user (the "client"). There is no Hadoop folder
here, so I can't run bin/hadoop from it (see the sketch just below for what
I think the workaround is).
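As far as I understand it, the client window only needs to be able to reach
the sshd side over ssh and run Hadoop there. This is just a sketch of what I
am trying; the user name and install path are guesses about my own machine,
not anything taken from the article:

    # From the client Cygwin window: log in to the sshd "server" side and
    # run Hadoop there. "Administrator" and /usr/local/hadoop are assumptions.
    ssh Administrator@localhost 'cd /usr/local/hadoop && bin/hadoop version'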
From here I do not know how to proceed.
I basically want to implement
http://developer.amazonwebservices.com/connect/entry.jspa?externalID=873,
which is why I created a host using DynDNS.
If you can help me, that would be great.
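For reference, the only line I have changed in hadoop-ec2-env.sh so far is
MASTER_HOST, shown below. From the reply quoted underneath I gather this is
probably the wrong value and should instead point at the EC2 jobtracker node
that the scripts start; the placeholder is there because I pasted the numeric
IP rather than the name:

    # hadoop-ec2-env.sh -- the one setting I changed, everything else is at
    # its default. Apparently this should be the jobtracker node started by
    # the EC2 scripts, not my DynDNS host.
    MASTER_HOST=<the IP address that prerna.dyndns.org resolves to>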

On Tue, Apr 15, 2008 at 2:15 PM, Norbert Burger
<[EMAIL PROTECTED]> wrote:
> Are you trying to run Hadoop on a local cluster, or in the EC2 environment?
>  If EC2, then your MASTER_HOST setting is wrong, because it points to a
>  residential ISP (*.rr.com).  It should instead point to your jobtracker node
>  (the first node started by the EC2 scripts).
>
>  The startup scripts for a standard Hadoop cluster and an EC2-based Hadoop
>  cluster are different.  If you're working with a local cluster, try
>  referencing Michael Noll's articles instead:
>
>  http://wiki.apache.org/hadoop/HadoopArticles
>
>
>
>  On Tue, Apr 15, 2008 at 2:00 PM, Prerna Manaktala <
>  [EMAIL PROTECTED]> wrote:
>
>  > I tried to set up hadoop with cygwin according to the
>  >  paper:
>  > http://developer.amazonwebservices.com/connect/entry.jspa?externalID=873
>  >  But I had problems working with dyndns. I created a new host
>  >  there: prerna.dyndns.org
>  >  and gave its IP address in hadoop-ec2-env.sh as the value of
>  > MASTER_HOST.
>  >  But when I run bin/hadoop-ec2 start-hadoop I get this error:
>  >  ssh: connect to host prerna.dyndns.org port 22: Connection refused
>  >  ssh failed for [EMAIL PROTECTED]
>  >  There is also a warning that id_rsa_gsg-keypair is not accessible (No
>  >  such file or directory), even though the file does exist.
>  >
>  >  Thanks
>  >  Prerna
>  >
>
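Regarding the two errors quoted above, these are the checks I think I should
run from the Cygwin client window first. The ~/.ssh path and the root login
are assumptions on my part, since I am not sure where the EC2 scripts expect
id_rsa_gsg-keypair to live:

    nslookup prerna.dyndns.org                    # which IP does the name point at right now?
    ls -l ~/.ssh/id_rsa_gsg-keypair               # does the key exist at the path being used?
    chmod 600 ~/.ssh/id_rsa_gsg-keypair           # ssh ignores keys with loose permissions
    ssh -v -i ~/.ssh/id_rsa_gsg-keypair root@prerna.dyndns.org   # -v shows where it fails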
