Adding whirr.env.repo=cdh3u4 worked out, thanks.
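
For the archives, the CDH-related part of my recipes/hadoop.properties now
looks roughly like this (the rest of the recipe is unchanged from the sample,
so I'm only showing the lines I touched):

whirr.env.repo=cdh3u4
whirr.hadoop.install-function=install_cdh_hadoop
whirr.hadoop.configure-function=configure_cdh_hadoop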

Andrii

On Wed, May 16, 2012 at 1:29 PM, Andrei Savu <[email protected]> wrote:

> Add whirr.env.repo=cdh3u4 to your recipe. It should make things work.
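>
> As far as I know the property just changes which Cloudera suite the
> install function points apt at, so the source line it writes should end
> up roughly like this (exact form may differ) instead of the missing
> lucid-cdh4 one:
>
> deb http://archive.cloudera.com/debian lucid-cdh3u4 contrib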
>
> On Wed, May 16, 2012 at 1:23 PM, Andrii Vozniuk <[email protected]>
> wrote:
>
> > Andrei,
> >
> > I've successfully built the trunk and am now trying to launch a cluster
> > with the scripts.
> >
> > I'm using hadoop-0.20.2-cdh3u4, so I've uncommented the following lines
> > in recipes/hadoop.properties:
> >
> > whirr.hadoop.install-function=install_cdh_hadoop
> > whirr.hadoop.configure-function=configure_cdh_hadoop
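> >
> > (The launch command itself is the stock one, something like:
> > bin/whirr launch-cluster --config recipes/hadoop.properties)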
> >
> > When I launch the cluster with this config, the machines start, but
> > Hadoop doesn't. Examining the logs on the EC2 machines, I find the
> > following lines in
> > /tmp/bootstrap-hadoop-namenode_hadoop-jobtracker/stderr.log:
> >
> > + which dpkg
> > + apt-get update
> > W: Failed to fetch http://archive.cloudera.com/debian/dists/lucid-cdh4/contrib/binary-i386/Packages.gz  404 Not Found
> > W: Failed to fetch http://archive.cloudera.com/debian/dists/lucid-cdh4/contrib/source/Sources.gz  404 Not Found
> > E: Some index files failed to download, they have been ignored, or old ones used instead.
> > + apt-get -y install hadoop-0.20-mapreduce
> > E: Couldn't find package hadoop-0.20-mapreduce
> >
> > Obviously, the links don't exist. What should I do to make the script
> > work?
> >
> > Cheers,
> > Andrii Vozniuk
> >
>



-- 
Best regards
Andrii Vozniuk
