Thanks, Akhil.
We're trying the conf.setExecutorEnv() approach since we've already got
the environment variables set. For system properties we'd go the
conf.set("spark.*") route.
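Roughly what we have in mind, as a sketch (the app name, variable name,
and property are placeholders):

  import org.apache.spark.SparkConf

  val conf = new SparkConf()
    .setAppName("MyApp")  // placeholder app name
    // forward an already-exported variable to every executor
    .setExecutorEnv("MY_ENV_VAR", sys.env.getOrElse("MY_ENV_VAR", ""))
    // system properties go through spark.* config keys instead
    .set("spark.executor.extraJavaOptions", "-Dmy.prop=value")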
We were concerned that the sort of thing shown below would not work, which
this blog post seems to confirm (
I have about 20 environment variables to pass to my Spark workers. Even
though they're set in the init scripts on the Linux box, the workers don't
see these variables.
Does Spark do something to shield itself from what may be defined in the
environment?
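For reference, here's roughly what we're attempting (a sketch; the
variable names are placeholders):

  // Forward a list of already-exported environment variables to the
  // executors, since the workers don't inherit the driver's shell env.
  val varsToForward = Seq("VAR_ONE", "VAR_TWO")  // ~20 names in practice
  for (name <- varsToForward; value <- sys.env.get(name))
    conf.setExecutorEnv(name, value)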
I see multiple pieces of info on how to pass