Solution:

sc._jsc.hadoopConfiguration().set("fs.s3a.awsAccessKeyId", "...")
sc._jsc.hadoopConfiguration().set("fs.s3a.awsSecretAccessKey", "...")

One caveat: the awsAccessKeyId/awsSecretAccessKey property names come from the older s3n connector; for s3a, Hadoop's documented property names are fs.s3a.access.key and fs.s3a.secret.key, so if the lines above don't take effect, try those instead.
Got this solution from a Cloudera lady. Thanks, Neerja.
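If you'd rather not hard-code credentials inside the job, the same settings can be supplied through Spark configuration instead: any property prefixed with spark.hadoop. is forwarded into the Hadoop configuration. A sketch of the equivalent spark-defaults.conf entries, assuming the s3a connector and placeholder values:

```
# spark-defaults.conf — placeholder credentials, assuming the s3a connector
# (the spark.hadoop. prefix forwards these into hadoopConfiguration())
spark.hadoop.fs.s3a.access.key   ...
spark.hadoop.fs.s3a.secret.key   ...
```

The same pairs can also be passed on the command line via --conf spark.hadoop.fs.s3a.access.key=... at spark-submit time, which keeps secrets out of source code.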