I am running into a 'too many open files' issue. Before I posted this I
searched the web to see if anyone already had a solution to my
particular problem, but I did not see anything that helped.
I know I can adjust the max open files allowed by the OS, but I'd rather fix
the underlying issue.
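For reference, the OS-enforced per-process limit mentioned above can be inspected and raised with the standard `ulimit` shell builtin — a sketch of the workaround only, not the underlying fix:

```shell
# Inspect the current per-process limits on open file descriptors.
ulimit -Sn   # soft limit (the one a process actually hits)
ulimit -Hn   # hard limit (the ceiling the soft limit may be raised to)

# Raise the soft limit up to the hard limit, for this shell session only.
# A persistent change belongs in /etc/security/limits.conf (or the
# service manager's unit configuration), not here.
ulimit -Sn "$(ulimit -Hn)"
```

Note that this only moves the ceiling; if the job keeps accumulating open shuffle files, it will eventually hit any limit.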
On … 2015 at 4:33 AM, Steve Loughran wrote:
>
> On 31 Aug 2015, at 19:49, Sigurd Knippenberg
> wrote:
>
> I know I can adjust the max open files allowed by the OS, but I'd rather
> fix the underlying issue.
>
>
>
> bumping up the OS handle limits is step #1 of installing a […]
>
> If several map stages are running and pending, this makes your node run out
> of file handles.
>
> You could check the Spark web UI to see whether several map stages are
> running simultaneously, or whether some are running while others are
> pending.
>
> Thanks
> Jerry
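A mitigation commonly suggested on this list in the Spark 1.x era (a sketch only; both configuration keys are documented 1.x settings, not taken from this thread):

```shell
# Option A: use the sort-based shuffle (the default since Spark 1.2),
# which keeps far fewer files open per executor than the hash shuffle.
spark-submit --conf spark.shuffle.manager=sort ...

# Option B: if stuck on the hash shuffle, consolidate its intermediate
# files so each core reuses one file per reduce partition.
spark-submit --conf spark.shuffle.consolidateFiles=true ...
```

Either way, reducing the number of simultaneously open shuffle files addresses the underlying issue rather than just raising the OS limit.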