I'm running a spark-ec2 cluster.

I have a map task that calls out to a specialized external C++ app. The app
doesn't fully utilize a core because it spends part of each task downloading
and uploading data. Looking at the worker nodes, there appears to be one task
(one instance of my app) running per core.
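
Roughly, the job looks like this (a simplified sketch; the input/output paths
and the binary name below are placeholders, not my real app):

import org.apache.spark.{SparkConf, SparkContext}
import scala.sys.process._

object ExternalAppJob {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("external-app-job"))

    // Each input record points at the data the C++ app should fetch (placeholder schema).
    val inputs = sc.textFile("s3n://my-bucket/inputs.txt")

    // One task per core: each task shells out to the app and blocks while the
    // app downloads its data, computes, and uploads the result.
    val results = inputs.map { record =>
      Seq("/opt/myapp/run_app", record).!!
    }

    results.saveAsTextFile("s3n://my-bucket/results")
    sc.stop()
  }
}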

I'd like to make better use of the CPU resources, with the hope of increasing
throughput by running multiple such tasks (each running my app) in parallel
per core.

I see there is a spark.task.cpus config setting, which defaults to 1, but it
appears to work in the opposite direction from what I'm looking for.
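
As far as I can tell, setting it above 1 just reserves more cores for each
task slot, e.g.:

import org.apache.spark.SparkConf

// With spark.task.cpus = 2, each task occupies 2 cores of its executor, so
// *fewer* tasks run concurrently -- the opposite of what I'm after.
val conf = new SparkConf()
  .setAppName("external-app-job")
  .set("spark.task.cpus", "2")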

Is there a way to specify multiple tasks per core, rather than multiple cores
per task?
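
The only workaround I've come up with so far is to process a few records
concurrently inside each task (so the core stays busy while individual
invocations wait on the network), but I'd much rather use a scheduler-level
setting if one exists. A rough, untested sketch of that idea, again with
placeholder names:

import scala.sys.process._

// inputs is the RDD[String] from the sketch above. Run a handful of app
// invocations in parallel within each task via a parallel collection, so
// that their network waits overlap.
val results = inputs.mapPartitions { iter =>
  iter.grouped(4).flatMap { batch =>
    batch.par.map(record => Seq("/opt/myapp/run_app", record).!!).seq
  }
}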

Thanks for any help.



