> ...map stages are running and pending, and this makes your node run out
> of file handles.
>
> You could check the Spark web portal to see if there are several map stages
> running simultaneously, or if some of them are running while others are
> pending.
>
> Thanks
> Jerry
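To confirm that file handles are in fact being exhausted, one quick check (a sketch; using the current shell's own PID as a stand-in for a Spark executor's, and assuming a Linux host with /proc) is to compare a process's open descriptors against its limit:

```shell
# Per-process soft limit on open files for the current shell
ulimit -n

# Count file descriptors currently open by a given process.
# PID here is this shell's own PID, standing in for an executor PID.
PID=$$
ls /proc/"$PID"/fd | wc -l   # Linux: one entry per open descriptor
```

If the descriptor count approaches the limit while shuffle-heavy map stages run, that matches the symptom described above.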
>
>
> On Wed, Sep 2, 2015, Steve Loughran <ste...@hortonworks.com> wrote:
>
> On 31 Aug 2015, at 19:49, Sigurd Knippenberg <sig...@knippenberg.com> wrote:
>
> I know I can adjust the max open files allowed by the OS, but I'd rather
> fix the underlying issue.
>
> bumping up the
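Bumping up the limit for a session, as mentioned above, can be sketched as follows (the approach is illustrative; a permanently higher hard limit normally requires editing /etc/security/limits.conf as root and logging back in, with a site-specific value):

```shell
# Print the current soft and hard limits on open file descriptors
ulimit -Sn
ulimit -Hn

# Raise the soft limit up to the hard limit for this session only.
# The soft limit cannot exceed the hard limit without root.
hard=$(ulimit -Hn)
if [ "$hard" != "unlimited" ]; then
  ulimit -Sn "$hard"
fi
ulimit -Sn   # confirm the new soft limit
```

This is a workaround rather than a fix: if the job's shuffle behavior scales with data size, the descriptor count can eventually outgrow any static limit.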
I am running into a 'too many open files' issue. Before I posted this, I
searched the web to see if anyone already had a solution to my particular
problem, but I did not find anything that helped.
I know I can adjust the max open files allowed by the OS, but I'd rather fix
the underlying issue.
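On the Spark side, one mitigation commonly suggested in the Spark 1.x era (an assumption about the poster's version, which the thread does not state) was to consolidate shuffle output files, so map tasks reuse files instead of opening one per reducer. A sketch for spark-defaults.conf, applicable when the hash shuffle manager is in use:

```
# spark-defaults.conf (Spark 1.x, hash shuffle manager)
# Reuse shuffle output files across map tasks on the same core,
# reducing the number of simultaneously open files per node.
spark.shuffle.consolidateFiles  true
```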