I have set up a small standalone cluster: 5 nodes, every node has 5GB of
memory and 8 cores. As you can see, the nodes don't have much RAM.
I have 2 streaming apps; the first one is configured to use 3GB of memory per
node and the second one uses 2GB per node.
My problem is that the smaller app could easily run on fewer nodes, but I
can't find a way to limit the number of nodes it uses.
Use *spark.cores.max* to limit the CPU per job, then you can easily
accommodate your third job also.
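For example, something like this at submit time (master URL, memory
settings and jar name here are made up for illustration):

```shell
# Cap the app's total core usage across the whole cluster so the
# remaining cores stay free for the other jobs (numbers illustrative):
spark-submit \
  --master spark://master:7077 \
  --conf spark.cores.max=10 \
  --conf spark.executor.memory=2g \
  my-streaming-app.jar
```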
Thanks
Best Regards
On Tue, Jun 23, 2015 at 5:07 PM, Wojciech Pituła wrote:
> I have set up a small standalone cluster: 5 nodes, every node has 5GB of
> memory and 8 cores. As you can see, the nodes don't have much RAM. [...]
I cannot. I've already limited the number of cores to 10, so it gets 5
executors with 2 cores each...
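If I understand the standalone scheduler correctly, that 5x2 split comes
from the master's default spread-out scheduling (spark.deploy.spreadOut,
true by default), which round-robins an app's cores across all workers.
Setting it to false on the master should consolidate the app onto as few
nodes as possible instead. A sketch (master-side setting, so it affects
scheduling of all apps on the cluster):

```properties
# conf/spark-defaults.conf on the standalone *master*:
# with 10 cores and spreadOut=false the app would fill whole 8-core
# workers first (8 + 2 on two nodes) instead of taking 2 cores on
# each of the 5 nodes.
spark.deploy.spreadOut   false
```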
On Tue, 23 Jun 2015 at 13:45, Akhil Das wrote:
> Use *spark.cores.max* to limit the CPU per job, then you can easily
> accommodate your third job also.
>
> Thanks
> Best Regards
>
> On Tue, Jun 23, 2015 at 5:07 PM, Wojciech Pituła wrote: [...]
In standalone mode the number of executors is equal to the number of
spark worker processes (daemons) running on each node
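So, assuming one worker per node, getting more (smaller) executors per
node would mean running extra worker daemons. A sketch of how that could
look in conf/spark-env.sh on each node (values illustrative, splitting
the node's 8 cores and RAM between two workers):

```shell
# conf/spark-env.sh: run 2 worker daemons per machine, each offering
# half the node's cores and a share of its memory.
export SPARK_WORKER_INSTANCES=2
export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=2g
```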
From: Wojciech Pituła [mailto:w.pit...@gmail.com]
Sent: Tuesday, June 23, 2015 12:38 PM
To: user@spark.apache.org
Subject: Spark Streaming: limit number of nodes
I have set up a small standalone cluster: 5 nodes, every node has 5GB of
memory and 8 cores. [...]