What's in the conf/spark-defaults.conf file inside your executor package (that .tgz file)?
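
For reference, entries in that file are whitespace-separated key/value pairs, e.g. "spark.executor.memory 512m" (the value here is just a placeholder to show the format).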

Thanks
Best Regards

On Mon, Jun 15, 2015 at 7:14 PM, Gary Ogden <gog...@gmail.com> wrote:

> I'm loading these settings from a properties file:
> spark.executor.memory=256M
> spark.cores.max=1
> spark.shuffle.consolidateFiles=true
> spark.task.cpus=1
> spark.deploy.defaultCores=1
> spark.driver.cores=1
> spark.scheduler.mode=FAIR
>
> Once the job is submitted to Mesos, I can go to the Spark UI for that job
> (hostname:4040), and on the Environment tab I see that those settings are
> there.
>
> If I then comment out all those settings and allow Spark to use its
> defaults, it still appears to use the same settings in Mesos.
>
> Under both runs, it still shows 1 task, 3 CPUs, and 1 GB of memory.
>
> Nothing seems to change no matter what is put in that properties file, even
> though the settings show up in the Spark Environment tab.
>
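
One quick way to see which values the driver actually resolved, independent of
what the Mesos UI reports, is to dump them from the SparkContext. A minimal
sketch, assuming a Scala driver (the app name is made up):

    import org.apache.spark.{SparkConf, SparkContext}

    // With loadDefaults=true (the default), SparkConf picks up spark.* system
    // properties, which is how spark-submit forwards entries from
    // --properties-file / conf/spark-defaults.conf to the driver.
    val conf = new SparkConf().setAppName("config-check")  // hypothetical name
    val sc = new SparkContext(conf)

    // Print every setting that was set explicitly or loaded from those files.
    sc.getConf.getAll.sortBy(_._1).foreach { case (k, v) => println(k + "=" + v) }

Settings set explicitly on the SparkConf in code take precedence over anything
loaded from a properties file, so that is worth ruling out as well.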
