Streaming: limit number of nodes
Ok, thanks. I have 1 worker process on each machine but I would like to run
my app on only 3 of them. Is it possible?
On Wed, Jun 24, 2015 at 11:44, Evo Eftimov wrote:
> There is no direct one to one mapping between Executor and Node
>
>
>
> Executor is simply the spark framework term for
There is no direct one-to-one mapping between Executor and Node.
An Executor is simply the Spark framework term for a JVM instance with some Spark
framework system code running in it.
A node is a physical server machine.
You can have more than one JVM per node.
And vice versa you can hav
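
To make the Executor/Node distinction concrete, here is a minimal sketch (assuming a running Spark shell with a SparkContext named sc) that counts how many executor JVMs landed on each host; several entries for the same host mean more than one JVM on that node:

    // Keys of getExecutorMemoryStatus are one "host:port" entry per JVM
    // (the driver's JVM appears in the list as well).
    val jvmsPerHost = sc.getExecutorMemoryStatus.keys
      .map(_.split(":")(0))        // keep only the host name
      .groupBy(identity)
      .mapValues(_.size)
    jvmsPerHost.foreach { case (host, n) => println(s"$host -> $n JVM(s)") }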
I cannot. I've already limited the number of cores to 10, so it gets 5
executors with 2 cores each...
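
One possible workaround, sketched below on the assumption that your standalone version honours spark.executor.cores (Spark 1.4+): request executors as large as a whole worker, so the core budget can only be satisfied by three distinct nodes (the 8-core workers are the ones described later in this thread):

    import org.apache.spark.{SparkConf, SparkContext}

    // Hedged sketch: with 8-core workers, 24 cores requested in 8-core executors
    // can only be granted as 3 executors on 3 different nodes.
    val conf = new SparkConf()
      .setAppName("three-node-app")        // hypothetical app name
      .set("spark.cores.max", "24")        // 3 workers x 8 cores
      .set("spark.executor.cores", "8")    // one executor fills a whole worker
    val sc = new SparkContext(conf)

Alternatively, the standalone master's spark.deploy.spreadOut option (false packs an application onto as few workers as possible) changes the placement, but it is a cluster-wide master setting and affects every application.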
On Tue, Jun 23, 2015 at 13:45, Akhil Das wrote:
> Use *spark.cores.max* to limit the CPU per job, then you can easily
> accommodate your third job also.
>
> Thanks
> Best Regards
>
> On T
Use *spark.cores.max* to limit the CPU per job, then you can easily
accommodate your third job also.
Thanks
Best Regards
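
As a rough illustration of the suggestion above (a sketch with made-up numbers for a 5 x 8 = 40-core cluster): if every job is capped, a third job still finds free cores.

    import org.apache.spark.{SparkConf, SparkContext}

    // Hedged sketch: three jobs capped at 13 cores each fit side by side
    // into the cluster's 40 cores.
    val conf = new SparkConf()
      .setAppName("capped-job")            // hypothetical app name
      .set("spark.cores.max", "13")        // this job never holds more than 13 cores
    val sc = new SparkContext(conf)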
On Tue, Jun 23, 2015 at 5:07 PM, Wojciech Pituła wrote:
> I have set up a small standalone cluster: 5 nodes, every node has 5GB of
> memory and 8 cores. As you can see, node doe