On Tue, Oct 27, 2015 at 10:43 AM, Jerry Lam <chiling...@gmail.com> wrote:
> Has anyone experienced issues setting Hadoop configuration properties
> after the SparkContext is initialized? I'm using Spark 1.5.1.
>
> I'm trying to use s3a, which requires the access and secret keys to be
> set in the Hadoop configuration. I tried to set the properties on the
> Hadoop configuration from the SparkContext:
>
> sc.hadoopConfiguration.set("fs.s3a.access.key", AWSAccessKeyId)
> sc.hadoopConfiguration.set("fs.s3a.secret.key", AWSSecretKey)

Try setting "spark.hadoop.fs.s3a.access.key" and
"spark.hadoop.fs.s3a.secret.key" in your SparkConf before creating the
SparkContext. Spark copies any property prefixed with "spark.hadoop."
from the SparkConf into the Hadoop Configuration when the context is
created, so the keys are in place before anything reads them.
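
For example, a minimal sketch in Scala, assuming AWSAccessKeyId and
AWSSecretKey hold your credentials as in your snippet (the app name is
just a placeholder):

import org.apache.spark.{SparkConf, SparkContext}

// Properties prefixed with "spark.hadoop." are copied into the Hadoop
// Configuration when the SparkContext is created.
val conf = new SparkConf()
  .setAppName("s3a-example") // hypothetical app name
  .set("spark.hadoop.fs.s3a.access.key", AWSAccessKeyId)
  .set("spark.hadoop.fs.s3a.secret.key", AWSSecretKey)

val sc = new SparkContext(conf)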

-- 
Marcelo
