Re: Can I assign affinity for spark executor processes?

2016-09-19 Thread Xiaoye Sun
Hi Jakob, yes, you are right. I should use taskset when I start the *.sh scripts. In more detail, I changed the last line in ./sbin/start-slaves.sh on the master to:

    "${SPARK_HOME}/sbin/slaves.sh" cd "${SPARK_HOME}" \; taskset 0xffe "${SPARK_HOME}/sbin/start-slave.sh"
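The mask 0xffe is binary 1111 1111 1110, i.e. CPUs 1-11 allowed and CPU 0 excluded, so the worker (and every executor JVM it forks, since Linux affinity is inherited across fork/exec) stays off the pinned CPU. A quick way to verify this is sketched below; the pgrep pattern assumes the standard standalone worker main class:

    # Look up the worker JVM's PID (assumes the standalone worker main class).
    WORKER_PID=$(pgrep -f org.apache.spark.deploy.worker.Worker)
    # Print its current affinity mask; with the change above this should show ffe.
    taskset -p "$WORKER_PID"
    # Executors forked after this point inherit the same mask.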

Re: Can I assign affinity for spark executor processes?

2016-09-13 Thread Jakob Odersky
Hi Xiaoye, could it be that the executors were spawned before the affinity was set on the worker? Would it help to start the Spark worker with taskset from the beginning, i.e. "taskset [mask] start-slave.sh"? Workers in Spark (standalone mode) simply create processes with the standard Java process ...
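For concreteness, a minimal sketch of that suggestion; the mask and master URL are placeholders, not values from the thread:

    # Launch the worker under taskset so every executor it spawns inherits the mask.
    # 0xffe excludes CPU 0; spark://master-host:7077 is a placeholder master URL.
    taskset 0xffe "${SPARK_HOME}/sbin/start-slave.sh" spark://master-host:7077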

Can I assign affinity for spark executor processes?

2016-09-13 Thread Xiaoye Sun
Hi, in my experiment I pin one very important process to a fixed CPU, so the performance of Spark task execution is affected if the executors or the worker use that CPU. I am wondering if it is possible to keep the Spark executors off a particular CPU. I tried to 'taskset -p ...
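The truncated attempt presumably re-pinned an already-running process, along these lines (the PID is hypothetical); the catch, as Jakob points out above, is that taskset -p only affects the named process, not executors that were forked before the change:

    # Re-pin a running worker JVM (PID 12345 is hypothetical).
    taskset -p 0xffe 12345
    # Already-running executor JVMs keep their old mask; only new forks inherit ffe.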
