I don't know the size of the data, as I don't know the command to check.

But can we follow this blog to export and then import the data?
https://blog.clairvoyantsoft.com/hbase-incremental-table-backup-and-disaster-recovery-using-aws-s3-storage-aa2bc1b40744
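For reference, here is a rough sketch of the snapshot route Davide described (the table name, snapshot name, and NameNode hosts are placeholders; this assumes an HBase 0.98.x client with `hbase` and `hadoop` on the PATH and network access between the clusters):

```shell
# On the old cluster: take a snapshot of each table from the HBase shell
# ('mytable' and the snapshot name are placeholders).
echo "snapshot 'mytable', 'mytable-snap-20230504'" | hbase shell

# Export the snapshot to the new cluster's HDFS with the ExportSnapshot
# MapReduce job; -copy-to points at the destination HBase root directory.
hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
  -snapshot mytable-snap-20230504 \
  -copy-to hdfs://new-cluster-nn:8020/hbase \
  -mappers 8

# On the new cluster: restore the table from the exported snapshot.
echo "restore_snapshot 'mytable-snap-20230504'" | hbase shell

# Plain HDFS data that is not in HBase can be copied with distcp.
hadoop distcp hdfs://old-cluster-nn:8020/data hdfs://new-cluster-nn:8020/data
```

These commands need a live cluster on both sides, so treat them as a starting point rather than a tested procedure.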

On Thu, May 4, 2023 at 11:57 AM Davide Vergari <vergari.dav...@gmail.com>
wrote:

> If they are HBase tables, you can create a snapshot for each table and then
> export it with the ExportSnapshot MapReduce job (it should already be
> available in 0.98.x). For data that is not in HBase you can use distcp.
>
> > On Thu, May 4, 2023 at 5:13 PM <s...@comcast.net> wrote:
>
> > > Jignesh, how much data? Is the data currently in HBase format?
> > > Very kindly, Sean
> >
> >
> > > On 05/04/2023 11:03 AM Jignesh Patel <jigneshmpa...@gmail.com> wrote:
> > >
> > >
> > > We are in the process of moving to a new OS; however, we are using a very
> > > old version of Hadoop: Hadoop 2.6 and HBase 0.98.7.
> > >
> > > So how do we export and import the data from the cluster with the old OS
> > > to the new OS? We are trying to use the same Hadoop/HBase version.
> > >
> > > -Jignesh
> >
>
