Thanks, but how do I increase the number of tasks per core?
For example, if the application claims 10 cores, is it possible to launch
100 tasks concurrently?
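For context, a minimal sketch of how the relevant knobs relate; this is not
from the thread, and it assumes a Spark 1.x deployment where
spark.executor.cores applies (e.g. YARN), with the 10 and 100 figures taken
from the question above:

    import org.apache.spark.{SparkConf, SparkContext}

    // An application claiming 10 cores per executor; master and deploy
    // mode are supplied via spark-submit.
    val conf = new SparkConf()
      .setAppName("overcommit-sketch")
      .set("spark.executor.cores", "10")  // cores claimed per executor
      .set("spark.task.cpus", "1")        // cores reserved per task (default)
    val sc = new SparkContext(conf)

    // The scheduler runs at most executor-cores / task-cpus tasks per
    // executor at once, here 10 / 1 = 10. Getting 100 concurrent tasks out
    // of 10 physical cores would mean advertising cores that do not exist,
    // e.g. SPARK_WORKER_CORES=100 in conf/spark-env.sh on a standalone
    // worker, which is exactly the overcommitting this thread is debating.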
On Fri, Jan 9, 2015 at 2:57 PM, Jörn Franke jornfra...@gmail.com wrote:
Hello,
Based on experiences with other software in virtualized environments, I
cannot really recommend this.
Hi,
As you said, --executor-cores defines the maximum number of tasks an
executor can run simultaneously. So if you claim 10 cores, it is not
possible to launch more than 10 tasks in one executor at the same time.
In my experience, setting more cores than there are physical CPU cores will
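To make the cap concrete, a small arithmetic sketch; the figures are
hypothetical, matching the 10-core example above:

    val executorCores = 10  // what --executor-cores 10 claims
    val cpusPerTask   = 1   // spark.task.cpus, default 1
    // Maximum tasks the scheduler will run in this executor at once:
    val concurrentTasks = executorCores / cpusPerTask  // = 10
    // spark.task.cpus can only lower this (10 / 2 = 5 tasks); there is no
    // setting that schedules more than one task per claimed core.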
Hello,
Based on experiences with other software in virtualized environments, I
cannot really recommend this. However, I am not sure how Spark reacts. You
may face unpredictable task failures depending on utilization, and tasks
connecting to external systems (databases etc.) may fail unexpectedly.
Hi,
I'm wondering whether it is a good idea to overcommit CPU cores on
the Spark cluster.
For example, in our testing cluster, each worker machine has 24
physical CPU cores. However, we are allowed to set the CPU core number to
48 or more in the Spark configuration file. As a result, each worker would
advertise more cores to the cluster than it physically has.
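A sketch of the configuration being described, assuming the standalone mode
where a worker's advertised core count is set by hand; the 24 and 48
figures are the poster's:

    // conf/spark-env.sh on a worker would carry a line such as
    //   SPARK_WORKER_CORES=48
    // and the worker then offers 48 task slots to the master even though
    // the machine has only 24 physical cores.
    val physicalCores   = 24
    val advertisedCores = 48
    val overcommitRatio = advertisedCores.toDouble / physicalCores  // = 2.0
    // With spark.task.cpus = 1, up to 48 tasks may land on the machine at
    // once, i.e. roughly two runnable tasks per physical core.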