Bill,

Thank you.  Unfortunately, that will not work for us, as our upgrade to
hadoop-19 has long been finalized.

dbr

On Wed, Mar 18, 2009 at 2:35 PM, Bill Au <bill.w...@gmail.com> wrote:

> As long as the version upgrade has not been finalized, here's the procedure
> I use to downgrade:
>
>
>   1. make sure that the previous version upgrade has not been finalized. A
>   version upgrade cannot be rolled back once it has been finalized.
>
>   bin/hadoop dfsadmin -upgradeProgress status
>
>   2. stop map-reduce cluster.
>
>   bin/stop-mapred.sh
>
>   3. stop all applications and make sure that there are no running tasks.
>   4. stop HDFS cluster.
>
>   bin/stop-dfs.sh
>
>   5. roll back the version of Hadoop. Install the previous version of
>   Hadoop if it has been removed from the system.
>   6. start HDFS with the rollback option.
>
>   bin/start-dfs.sh -rollback
>
>   7. monitor the rollback until it is complete.
>
>   bin/hadoop dfsadmin -upgradeProgress status
>
>   8. start map-reduce cluster.
>
>   bin/start-mapred.sh
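
For what it's worth, the quoted steps can be collected into a single script.
This is only a sketch under the assumptions stated in the comments, not a
tested procedure; the manual steps (3 and 5) are left as comments:

```shell
# Sketch only: a hypothetical wrapper around the rollback steps above,
# assuming the previous Hadoop release has been reinstalled (its bin/
# directory is the working directory) and the upgrade was never finalized.
# With DRY_RUN=1 (the default) it only prints each command.
DRY_RUN=${DRY_RUN:-1}

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "WOULD RUN: $*"
  else
    "$@"
  fi
}

run bin/hadoop dfsadmin -upgradeProgress status  # 1. confirm not finalized
run bin/stop-mapred.sh                           # 2. stop map-reduce cluster
                                                 # 3. stop applications (manual)
run bin/stop-dfs.sh                              # 4. stop HDFS cluster
                                                 # 5. reinstall old Hadoop (manual)
run bin/start-dfs.sh -rollback                   # 6. start HDFS with rollback
run bin/hadoop dfsadmin -upgradeProgress status  # 7. monitor until complete
run bin/start-mapred.sh                          # 8. restart map-reduce cluster
```

Run with DRY_RUN=0 to actually execute the commands instead of printing them.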
>
>
>
> Bill
>
> On Wed, Mar 18, 2009 at 1:08 PM, David Ritch <david.ri...@gmail.com>
> wrote:
>
> > There is an established procedure for upgrading from one release of
> Hadoop
> > to a newer release.  Is there something similar to move back to a
> > lower-numbered release?
> >
> > Specifically, we have data in a cloud running Hadoop-19.0.  Because of
> > stability issues, we are wondering whether we should move back to 18, but
> > we
> > don't want to lose our data.  Is there a downward migration path?
> >
> > Thanks,
> >
> > David
> >
>
