You only need the contrib/ec2 scripts from 0.17; you don't need Hadoop 0.17.0 itself.

Just check out the scripts and use them with whatever version of Hadoop you are most comfortable with (the version that works with HBase, I expect).
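
If it helps, a checkout along these lines should pull down just those scripts; the exact branch URL is my assumption about the Apache SVN layout, so adjust it if yours differs:

  # grab only the EC2 helper scripts from the 0.17 branch
  svn co http://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.17/src/contrib/ec2 hadoop-ec2-scripts

  # the driver and supporting bash scripts land under bin/
  ls hadoop-ec2-scripts/bin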

On May 7, 2008, at 12:45 PM, Jim R. Wilson wrote:

Cool cool - thanks again Chris.

I'm thinking I should use hadoop-0.17 instead of 0.16.3 at this time, because it appears 0.17 has better support for EC2 (less configuration, no DynDNS necessary, etc.).

Is there a public directory somewhere which houses nightly branch builds? Or do I need to build 0.17 myself, then post it somewhere (like S3) and have the script access that?

-- Jim

On Wed, May 7, 2008 at 2:27 PM, Chris K Wensel <[EMAIL PROTECTED]> wrote:
You do need the whole ec2 tree for the scripts to work...



On May 7, 2008, at 12:25 PM, Jim R. Wilson wrote:


Never mind, looks like I needed these:
./src/contrib/ec2/bin/image/create-hadoop-image-remote
./src/contrib/ec2/bin/create-hadoop-image
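
(For anyone else hunting for them, something like

  find . -path '*contrib/ec2*' -name 'create-hadoop-*'

run from the top of a full checkout should turn them up.)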

-- Jim

On Wed, May 7, 2008 at 2:23 PM, Jim R. Wilson <[EMAIL PROTECTED]> wrote:

Thanks Chris,

Where do I get this supposed "image/create-hadoop-remote" script? I couldn't `find` it anywhere in the Hadoop SVN tree, and the link in the Hadoop wiki is broken :/

-- Jim



On Wed, May 7, 2008 at 2:04 PM, Chris K Wensel <[EMAIL PROTECTED]> wrote:

You don't need 0.17 to use the scripts mentioned in the EC2 wiki page. Just grab contrib/ec2 from the 0.17.0 branch.

As for images, you will need to update the image/create-hadoop-remote bash script to download and install HBase.

And update hadoop-init to start it with the proper properties.
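
A very rough sketch of the kind of additions meant here, not the actual script contents; the HBase version, download URL, install path, and the master check are all placeholders:

  # --- in image/create-hadoop-remote: fetch and unpack HBase onto the image ---
  HBASE_VERSION=0.1.2    # placeholder version
  wget -q http://archive.apache.org/dist/hadoop/hbase/hbase-$HBASE_VERSION/hbase-$HBASE_VERSION.tar.gz -O /tmp/hbase.tar.gz    # placeholder mirror
  tar xzf /tmp/hbase.tar.gz -C /usr/local
  ln -s /usr/local/hbase-$HBASE_VERSION /usr/local/hbase

  # --- in hadoop-init: once HDFS is up, bring HBase up on the master only ---
  if [ "$IS_MASTER" = "true" ]; then    # placeholder check; reuse whatever master flag hadoop-init already has
    /usr/local/hbase/bin/start-hbase.sh
  fi

You would also want an hbase-site.xml on the image pointing HBase at the cluster's HDFS (presumably the hbase.rootdir and hbase.master settings), which I take to be the "proper properties" part.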

Once you look at these scripts, it should be fairly obvious what you need to do.

Then just run the 'create-image' command to stuff this new image into one of your buckets.
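
Presumably that means the create-hadoop-image script listed earlier in the thread (or the create-image subcommand of the bin/hadoop-ec2 wrapper, if you use that), with hadoop-ec2-env.sh already holding your AWS credentials and target bucket. Roughly:

  cd hadoop-ec2-scripts/bin    # hypothetical checkout location
  ./create-hadoop-image        # bundles the rebuilt AMI and registers it into your S3 bucket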

enjoy
ckw



On May 7, 2008, at 11:12 AM, Jim R. Wilson wrote:



Hi all,

I'm about to embark on a mystical journey through hosted web services with my trusted friend HBase. Here are some questions for my fellow travelers:

1) Has anyone done this before? If so, what lifesaving tips can you offer?
2) Should I attempt to build an HDFS out of EC2 persistent storage, or just use S3?
3) How many images will I need? Just one, or master/slave?
4) What version of Hadoop/HBase should I use? (The Hadoop/EC2 instructions[1] seem to favor the unreleased 0.17, but there doesn't seem to be a public image with 0.17 at the ready.)

Thanks in advance for any advice, I'm gearing up for quite a trip :)

[1] http://wiki.apache.org/hadoop/AmazonEC2

-- Jim R. Wilson (jimbojw)



Chris K Wensel
[EMAIL PROTECTED]
http://chris.wensel.net/
http://www.cascading.org/