Making a guess here: you need to add s3:ListBucket?
http://stackoverflow.com/questions/35803808/spark-saveastextfile-to-s3-fails
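Per the linked answer, reading a single object needs only s3:GetObject, but pointing Spark at a folder makes it list the prefix first, which requires s3:ListBucket on the bucket itself (not on the keys). A minimal policy sketch, assuming the bucket is literally named bucket_name:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::bucket_name"]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::bucket_name/*"]
    }
  ]
}
```

Note the resource split: ListBucket applies to the bucket ARN, GetObject to the object ARNs under it.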
On Thu, Nov 17, 2016 at 2:11 PM, Jain, Nishit wrote:
When I read a specific file it works:
val filePath = "s3n://bucket_name/f1/f2/avro/dt=2016-10-19/hr=19/00"
val df = spark.read.avro(filePath)
But if I point it at a folder to read date-partitioned data, it fails:
val filePath = "s3n://bucket_name/f1/f2/avro/dt=2016-10-19/"
I get this error: