I'm not able to actually look inside the folder because Java runs out of
memory when trying to do a directory listing... I haven't had time to look
into the problem further.
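In case it helps while investigating: the out-of-memory happens in the client-side JVM, so raising its heap usually lets the listing complete, and `hadoop fs -count` is much cheaper than a full `-ls`. Also, when replication was enabled and later disabled, a leftover replication peer is a common reason the master's log cleaner keeps pinning files in oldWALs. A rough sketch (the peer id '1' is only illustrative; use whatever `list_peers` actually prints):

```shell
# Raise the client-side JVM heap so listing millions of entries fits in memory.
export HADOOP_CLIENT_OPTS="-Xmx4g"

# Count entries instead of listing them all -- far cheaper than -ls.
hadoop fs -count /hbase/oldWALs

# Check for leftover replication peers; a stale peer can keep the
# replication log cleaner from ever deleting files in oldWALs.
echo "list_peers" | hbase shell

# If a peer is stale, removing it lets the cleaner chore reclaim the space.
# (peer id '1' is illustrative -- use the id printed by list_peers)
echo "remove_peer '1'" | hbase shell
```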

On Thu, Feb 26, 2015 at 12:56 AM, Madeleine Piffaretti <
mpiffare...@powerspace.com> wrote:

> Hi,
>
> Replication is not turned on in HBase...
> Should this folder be cleaned regularly? I ask because I still have data
> from December 2014...
>
>
> 2015-02-26 1:40 GMT+01:00 Liam Slusser <lslus...@gmail.com>:
>
> > I'm having this same problem.  I had replication enabled, but it has
> > since been disabled.  However, oldWALs still grows.  There are so many
> > files in there that running "hadoop fs -ls /hbase/oldWALs" runs out of
> > memory.
> >
> > On Wed, Feb 25, 2015 at 9:27 AM, Nishanth S <nishanth.2...@gmail.com>
> > wrote:
> >
> > > Do you have replication turned on in HBase, and if so, is your slave
> > > consuming the replicated data?
> > >
> > > -Nishanth
> > >
> > > On Wed, Feb 25, 2015 at 10:19 AM, Madeleine Piffaretti <
> > > mpiffare...@powerspace.com> wrote:
> > >
> > > > Hi all,
> > > >
> > > > We are running out of space in our small Hadoop cluster, so I was
> > > > checking disk usage on HDFS and saw that most of the space was
> > > > occupied by the */hbase/oldWALs* folder.
> > > >
> > > > I have checked the "HBase Definitive Book" and other books and
> > > > websites, and I have also searched for this issue on Google, but I
> > > > didn't find a proper answer...
> > > >
> > > > So I would like to know what this folder is, what it is used for, and
> > > > how I can free space from it without breaking everything...
> > > >
> > > >
> > > > In case it's related to a specific version... our cluster is running
> > > > 5.3.0-1.cdh5.3.0.p0.30 from Cloudera (HBase 0.98.6).
> > > >
> > > > Thx for your help!
> > > >
> > >
> >
>
