How many Mesos slaves do you have, and how many cores do you have in
total?

      sparkConf.set("spark.mesos.coarse", "true")
      sparkConf.set("spark.cores.max", "128")

These two configurations are sufficient. Now, regarding the active tasks:
how many partitions do you see for that job? You can try
dstream.repartition to see if it increases from 11 to a higher number.
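For example, a minimal sketch (the stream name "dstream" and the count of
100 partitions are assumptions; a common rule of thumb is 2-3x your total
cores):

      // Assumed DStream named "dstream"; repartition each micro-batch
      // so its RDD is split into more tasks that can run in parallel.
      val repartitioned = dstream.repartition(100)
      repartitioned.foreachRDD { rdd =>
        // process each batch here with the higher parallelism
      }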


Thanks
Best Regards

On Thu, Aug 20, 2015 at 2:28 AM, swetha <swethakasire...@gmail.com> wrote:

> Hi,
>
> How do I set the number of executors and tasks in a Spark Streaming job
> on Mesos? I have the following settings, but my job still shows only 11
> active tasks and 11 executors. Any idea why this is happening?
>
>       sparkConf.set("spark.mesos.coarse", "true")
>       sparkConf.set("spark.cores.max", "128")
>       sparkConf.set("spark.default.parallelism", "100")
>       //sparkConf.set("spark.locality.wait", "0")
>       sparkConf.set("spark.executor.memory", "32g")
>       sparkConf.set("spark.streaming.unpersist", "true")
>       sparkConf.set("spark.shuffle.io.numConnectionsPerPeer", "1")
>       sparkConf.set("spark.rdd.compress", "true")
>       sparkConf.set("spark.shuffle.memoryFraction", ".6")
>       sparkConf.set("spark.storage.memoryFraction", ".2")
>       sparkConf.set("spark.shuffle.spill", "true")
>       sparkConf.set("spark.shuffle.spill.compress", "true")
>       sparkConf.set("spark.streaming.receiver.writeAheadLog.enable", "true")
>       sparkConf.set("spark.streaming.blockInterval", "400")
>
>
>
