Can't seem to figure this out.  I've tried several different approaches without 
success. For example, I've tried setting spark.executor.extraJavaOptions in 
spark-defaults.conf (prior to starting the spark-shell), but this seems to have 
no effect.
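
For reference, the line I had in spark-defaults.conf looked roughly like the 
following (the -D values here are just placeholders, not what I actually used):

spark.executor.extraJavaOptions  -DAWS_ACCESS_KEY_ID=XXXX -DAWS_SECRET_ACCESS_KEY=XXXX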

Outside of spark-shell (within a Java application I wrote), I successfully do 
the following:

// Set environment variables for the executors
conf.setExecutorEnv("AWS_ACCESS_KEY_ID", System.getenv("AWS_ACCESS_KEY_ID"));
conf.setExecutorEnv("AWS_SECRET_ACCESS_KEY", System.getenv("AWS_SECRET_ACCESS_KEY"));

But because the SparkContext already exists within spark-shell, this really 
isn't an option (unless I'm missing something).

Thanks.

Darin.
