The Initial job has not accepted any resources error; can't seem to set

2015-06-18 Thread dgoldenberg
Hi, I'm running Spark Standalone on a single node with 16 cores. The master and 4 workers are running. I'm trying to submit two applications via spark-submit, and I get the following error when submitting the second one: Initial job has not accepted any resources; check your cluster UI to ensure
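
A minimal sketch of the situation described above, with placeholder master URL, jar paths, and class names: when the first app is submitted without a per-app core cap, a standalone master can grant it every core, leaving the second app stuck waiting.

```shell
# Hypothetical reproduction on a 16-core standalone cluster.
# Master URL, jar names, and main classes are placeholders.

# First app: no core cap, so the master may grant it all 16 cores.
spark-submit \
  --master spark://localhost:7077 \
  --class com.example.AppOne \
  app-one.jar &

# Second app: no free cores remain, so its scheduler logs
# "Initial job has not accepted any resources" until AppOne finishes.
spark-submit \
  --master spark://localhost:7077 \
  --class com.example.AppTwo \
  app-two.jar
```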

Re: The Initial job has not accepted any resources error; can't seem to set

2015-06-18 Thread dgoldenberg
I just realized that --conf takes one key=value pair per flag. What worked was passing --conf spark.cores.max=2. However, when I instead passed --conf spark.deploy.defaultCores=2, one job would still take up all 16 cores on the box. What's the actual model here? We've got 10 apps
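
One reading of the behavior above, sketched below with placeholder names: spark.cores.max is an application-side setting, so it takes effect when passed via --conf on spark-submit, while spark.deploy.defaultCores is a master-side default (applied only to apps that don't set spark.cores.max), so passing it per-app has no effect and the app grabs every core.

```shell
# Cap each app at 2 cores, so 16 cores can serve up to 8 concurrent apps.
# spark.cores.max is read by the submitting application, so --conf works:
spark-submit \
  --master spark://localhost:7077 \
  --conf spark.cores.max=2 \
  --class com.example.MyApp \
  my-app.jar

# spark.deploy.defaultCores, by contrast, must be set on the master process
# before it starts, e.g. in conf/spark-env.sh (illustrative value):
#   SPARK_MASTER_OPTS="-Dspark.deploy.defaultCores=2"
```

With either approach, the master shares the 16 cores across apps instead of letting the first submission starve the rest.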