Question about SPARK_WORKER_CORES and spark.task.cpus

2015-06-22 Thread Rui Li
Hi, I was running a WordCount application on Spark, and the machine I used has 4 physical cores. However, in the spark-env.sh file, I set SPARK_WORKER_CORES=32. The web UI says it launched one executor with 32 cores, and that the executor could execute 32 tasks simultaneously. Does Spark create 32
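
[Editor's note: a minimal sketch of the relationship being asked about, assuming Spark standalone mode and the numbers from the question. SPARK_WORKER_CORES is a count of logical scheduling slots the worker advertises, not a binding to physical cores, so it can exceed the hardware core count.]

    // Sketch: how task concurrency follows from the advertised worker cores.
    // Values mirror the question above; spark.task.cpus defaults to 1.
    val workerCores = 32                           // SPARK_WORKER_CORES in spark-env.sh (logical slots)
    val taskCpus    = 1                            // spark.task.cpus (default)
    val concurrentTasks = workerCores / taskCpus   // up to 32 tasks scheduled at once
    // The OS then time-slices those 32 task threads over the 4 physical cores.
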

RE: Question about SPARK_WORKER_CORES and spark.task.cpus

2015-06-22 Thread Cheng, Hao
. (For example, run a multithreaded external app within each task.) Hope it helps.
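
[Editor's note: a minimal sketch of the use case Cheng, Hao alludes to, not code from the thread. Raising spark.task.cpus makes each task reserve several of the executor's scheduling slots, which is useful when a task itself spawns a multithreaded external program.]

    import org.apache.spark.{SparkConf, SparkContext}

    // Each task claims 4 of the executor's cores, leaving headroom for a
    // 4-thread external process launched from inside the task.
    val conf = new SparkConf()
      .setAppName("WordCount")
      .set("spark.task.cpus", "4")
    val sc = new SparkContext(conf)
    // With 32 executor cores and spark.task.cpus=4, at most 32 / 4 = 8 tasks
    // run concurrently on that executor.
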