Thanks, Robert.

I'm using Fedora, so it probably works the same way as you suggest.  Setting
the ulimit and xcievers as described on the Troubleshooting page didn't seem
to help, but I'm going to try again with your suggestion.

Marc

On Thu, Jan 7, 2010 at 12:56 PM, Andrew Purtell <[email protected]> wrote:

> Robert,
>
> Thanks for that. I updated the relevant section of the Troubleshooting page
> up on the HBase wiki with this advice.
>
> Best regards,
>
>   - Andy
>
>
>
> ----- Original Message ----
> > From: "Gibbon, Robert, VF-Group" <[email protected]>
> > To: [email protected]
> > Sent: Thu, January 7, 2010 5:04:58 AM
> > Subject: RE: Seeing errors after loading a fair amount of data.
> KeeperException$NoNodeException, IOException
> >
> > Maybe you are running Red Hat? Just changing limits.conf I think won't
> > work, because RH has a maximum total of open files across the whole
> > system, which is 4096 by default, unless you also do something like this:
> >
> > echo "32768" > /proc/sys/fs/file-max
> > service network restart
> >
> > To make it permanent edit /etc/sysctl.conf to include the line:
> >       fs.file-max = 32768
> > Kind regards,
> > Robert
> [...]
>
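
For anyone following along, the limits Robert mentions can be inspected without root before changing anything. A minimal sketch (read-only checks; the 32768 value is just the figure suggested above, not a tuned recommendation):

```shell
# Per-process open-file limit for the current shell (what ulimit/limits.conf controls)
ulimit -n

# System-wide cap on open file handles (what fs.file-max controls)
cat /proc/sys/fs/file-max

# To raise the system-wide cap immediately (requires root):
#   echo 32768 > /proc/sys/fs/file-max
# To make it survive a reboot, add this line to /etc/sysctl.conf:
#   fs.file-max = 32768
```

Note that the per-process limit and the system-wide cap are independent: raising one without the other can still leave the DataNode short of file handles.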
