You’ll want to set up the FAIR scheduler as described here:
https://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application
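In case a concrete example helps, here is a minimal sketch (not your code; the request handler and pool names are made up for illustration) of turning on FAIR scheduling and submitting jobs from two threads against a local[*] master:

import org.apache.spark.{SparkConf, SparkContext}

object FairSchedulingSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[*]")                    // let the local executor use all available cores
      .setAppName("fair-scheduling-sketch")
      .set("spark.scheduler.mode", "FAIR")      // default is FIFO, which serializes whole jobs
    val sc = new SparkContext(conf)

    // Each HTTP request handler would run something like this in its own thread.
    def handleRequest(id: Int): Unit = {
      // Optionally give each request its own pool (pools can be configured in fairscheduler.xml).
      sc.setLocalProperty("spark.scheduler.pool", s"pool-$id")
      val result = sc.parallelize(1 to 1000000).map(_ * 2).sum()
      println(s"request $id -> $result")
      sc.setLocalProperty("spark.scheduler.pool", null)
    }

    // Two concurrent requests; with FAIR scheduling their tasks interleave
    // instead of the second job waiting for the first to finish.
    val threads = (1 to 2).map(i => new Thread(new Runnable { def run(): Unit = handleRequest(i) }))
    threads.foreach(_.start())
    threads.foreach(_.join())

    sc.stop()
  }
}

Also double-check the master URL: plain "local" gives the executor a single thread, so two jobs can never overlap regardless of scheduler mode; use local[N] with N > 1 (or local[*]).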

From: yael aharon <yael.aharo...@gmail.com>
Date: Friday, February 12, 2016 at 2:00 PM
To: "user@spark.apache.org<mailto:user@spark.apache.org>" 
<user@spark.apache.org<mailto:user@spark.apache.org>>
Subject: Allowing parallelism in spark local mode

Hello,
I have an application that receives requests over HTTP and uses Spark in local
mode to process the requests. Each request runs in its own thread.
It seems that Spark is queueing the jobs and processing them one at a time. When two
requests arrive simultaneously, the processing time for each of them almost
doubles.
I tried setting spark.default.parallelism, spark.executor.cores, and
spark.driver.cores, but that did not change the time in a meaningful way.

Am I missing something obvious?
Thanks, Yael
