You have one worker with one executor with 32 execution slots.

On Mon, May 11, 2015 at 9:52 PM, dgoldenberg <dgoldenberg...@gmail.com> wrote:
> Hi,
>
> Is there anything special one must do, running locally and submitting a job
> like so:
>
> spark-submit \
>   --class "com.myco.Driver" \
>   --master local[*] \
>   ./lib/myco.jar
>
> In my logs, I'm only seeing log messages with the thread identifier of
> "Executor task launch worker-0".
>
> There are 4 cores on the machine, so I expected 4 threads to be at play.
> Running with local[32] did not yield 32 worker threads.
>
> Any recommendations? Thanks.
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-local-mode-seems-to-ignore-local-N-tp22851.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
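To illustrate the point above: in local[N] mode there is no separate worker process — tasks run on an N-sized thread pool inside the single driver JVM, and threads beyond worker-0 only show up in the logs when enough tasks (i.e., enough partitions) run concurrently. Here is a rough plain-Python analogy of that slot model (not Spark code; the pool size, task count, and thread-name prefix are illustrative):

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor


def run_task(partition):
    # Simulate a short-lived Spark task and record which pool
    # thread (execution slot) picked up this "partition".
    time.sleep(0.01)
    return (partition, threading.current_thread().name)


# Analogy for local[4]: one process, one "executor", 4 task slots.
with ThreadPoolExecutor(max_workers=4, thread_name_prefix="task-worker") as pool:
    results = list(pool.map(run_task, range(16)))

threads_used = {name for _, name in results}
print(f"{len(results)} tasks ran on {len(threads_used)} thread(s)")
```

If the tasks are too few or finish too quickly, the pool may never need more than one thread — which is the same reason a trivial job under local[32] can log only "Executor task launch worker-0": the slot count is an upper bound, not a guarantee of 32 busy threads.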