I see that SparkContext has a hadoopConfiguration() method, which can
be used like this sample I found:

sc.hadoopConfiguration().set("fs.s3.awsAccessKeyId", "XXXXXX");
sc.hadoopConfiguration().set("fs.s3.awsSecretAccessKey", "XXXXXX");


But StreamingContext doesn't expose the same method.  I want to use a
StreamingContext with s3n: text file input, but I can't find a way to set the
AWS credentials.  I also tried the following, with no success:


   - adding the properties to conf/spark-defaults.conf
   - $HADOOP_HOME/conf/hdfs-site.xml
   - ENV variables
   - Embedded as user:password in s3n://user:password@... (w/ url encoding)
   - Setting the conf as above on a new SparkContext and passing that to the
   StreamingContext constructor: StreamingContext(sparkContext: SparkContext,
   batchDuration: Duration)
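For reference, here is roughly what I attempted for that last option, as an
untested sketch.  One detail I'm unsure about: for s3n: paths, Hadoop's
NativeS3FileSystem reads the fs.s3n.* keys rather than the fs.s3.* keys from
the sample above, so I used those here (bucket/prefix and batch interval are
made up):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Build the SparkContext first so its Hadoop configuration can be set
// before the StreamingContext wraps it.
val sc = new SparkContext("local[2]", "S3nStreamingExample")

// For s3n: URLs the property keys are fs.s3n.* (fs.s3.* applies to s3: URLs).
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "XXXXXX")
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "XXXXXX")

// Pass the pre-configured SparkContext to the StreamingContext constructor.
val ssc = new StreamingContext(sc, Seconds(30))

// Hypothetical bucket and prefix, just to illustrate the input path.
val lines = ssc.textFileStream("s3n://my-bucket/my-prefix/")
```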

Can someone point me in the right direction for setting AWS creds (hadoop
conf options) for StreamingContext?

thanks,
Marc Limotte
Climate Corporation
