Changed TasksMax to infinity and the error seems to be gone. Will keep
monitoring for a few days...
Thanks guys!
Rao
On Tue, Feb 21, 2017 at 3:42 AM, Lennart Poettering wrote:
On Mon, 20.02.17 16:44, Rao Vz (raoa...@gmail.com) wrote:
> Hi, Guys
>
> We have an Apache Spark cluster of 3 nodes; one is both master and slave,
> the other two are slaves. When we start the Spark worker with "systemctl
> start spark-worker" and then run our apps, it sometimes, but not always,
> raises "java.lang.OutOfMemoryError: unable to create new native thread".
On 20-02-2017 at 18:44, Rao Vz wrote:
> Hi, Guys
> Any help is appreciated.
>
Most likely you went over TasksMax.
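[Editor's note: for reference, a unit's effective limit can be inspected with "systemctl show -p TasksMax spark-worker" and raised with a drop-in override. A sketch, assuming the unit is named spark-worker.service as in the messages above:]

```ini
# /etc/systemd/system/spark-worker.service.d/override.conf
# (systemctl edit spark-worker creates this file for you)
[Service]
# TasksMax counts processes *and* threads, so a thread-heavy JVM
# can hit it long before any memory limit. Set a higher number or
# "infinity" to disable the per-service cap entirely.
TasksMax=infinity
```

[After "systemctl daemon-reload" and "systemctl restart spark-worker", the new limit takes effect.]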
_______________________________________________
systemd-devel mailing list
systemd-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/systemd-devel
Hi, Guys

We have an Apache Spark cluster of 3 nodes; one is both master and slave,
the other two are slaves. When we start the Spark worker with "systemctl
start spark-worker" and then run our apps, it sometimes, but not always,
raises "java.lang.OutOfMemoryError: unable to create new native thread".

Any help is appreciated.