One approach would be to set these environment variables in spark-env.sh on
all the workers; you can then read them in your code with
System.getenv("WHATEVER")
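For example (just a sketch, assuming a standalone cluster where each worker
sources conf/spark-env.sh before launching executors, and reusing the AWS
variable names from your mail):

# conf/spark-env.sh on each worker
export AWS_ACCESS_KEY_ID=<your key id>
export AWS_SECRET_ACCESS_KEY=<your secret key>

// then in spark-shell, this map runs on the executors and should see them
sc.parallelize(1 to 4).map(_ => System.getenv("AWS_ACCESS_KEY_ID")).collect()

You'd need to restart the workers after editing spark-env.sh so the new
environment is picked up.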

Thanks
Best Regards


On Wed, Aug 20, 2014 at 9:49 PM, Darin McBeath <ddmcbe...@yahoo.com.invalid>
wrote:

> Can't seem to figure this out.  I've tried several different approaches
> without success. For example, I've tried
> setting spark.executor.extraJavaOptions in spark-defaults.conf (prior to
> starting the spark-shell) but this seems to have no effect.
>
> Outside of spark-shell (within a java application I wrote), I successfully
> do the following:
>
> // Set environment variables for the executors
> conf.setExecutorEnv("AWS_ACCESS_KEY_ID", System.getenv("AWS_ACCESS_KEY_ID"
> ));
> conf.setExecutorEnv("AWS_SECRET_ACCESS_KEY", System.getenv(
> "AWS_SECRET_ACCESS_KEY"));
>
>
> But, because my SparkContext already exists within spark-shell, this
> really isn't an option (unless I'm missing something).
>
> Thanks.
>
> Darin.
>
>
>