Hi John,
I don't think this is specific to Mesos.
Note that the settings in `spark-defaults.conf` are only defaults. Normally
you'd pass your job-specific options using `--conf`. Does that work?
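For example, something like this (the master host, memory value, and file
names are placeholders, not taken from your setup) should override whatever
is baked into `spark-defaults.conf`:

```shell
# Hypothetical invocation: host names, ports, and paths are placeholders.
# Values given with --conf take precedence over spark-defaults.conf.
spark-submit \
  --master mesos://mesos-master.example.com:5050 \
  --conf spark.executor.memory=4g \
  --conf spark.executor.uri=http://files.example.com/spark-bin.tgz \
  my_job.py
```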
iulian
On Thu, Nov 12, 2015 at 3:05 PM, John Omernik wrote:
I have run into a related issue, I think: args passed to spark-submit to my
cluster dispatcher get lost in translation when launching the driver from
Mesos. I'm suggesting this patch:
https://github.com/jayv/spark/commit/b2025ddc1d565d1cc3036200fc3b3046578f4b02
- Jo Voordeckers
On Thu, Nov 12,
Hey all,
I noticed today that if I take a tgz as my URI for Mesos, I have to
repackage it with my conf settings from where I execute, say, pyspark, for
the executors to have the right configuration settings.
That is...
If I take a "stock" tgz from make-distribution.sh, unpack it, and then set