Spark is inventing its own AWS secret key

2017-03-08 Thread Jonhy Stack
Hi, I'm trying to read an S3 bucket from Spark, and up until today Spark has always complained that the request returned a 403:

    hadoopConf = spark_context._jsc.hadoopConfiguration()
    hadoopConf.set("fs.s3a.access.key", "ACCESSKEY")
    hadoopConf.set("fs.s3a.secret.key", "SECRETKEY")
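For reference, a minimal end-to-end sketch of that setup (the bucket name, path, and key values here are placeholders, and it assumes the matching hadoop-aws and AWS SDK jars are on the classpath):

    from pyspark import SparkContext

    spark_context = SparkContext(appName="s3a-read-check")

    # Set the credentials on the Hadoop configuration before the first
    # s3a:// read; the filesystem instance is cached once it is created.
    hadoopConf = spark_context._jsc.hadoopConfiguration()
    hadoopConf.set("fs.s3a.access.key", "ACCESSKEY")
    hadoopConf.set("fs.s3a.secret.key", "SECRETKEY")

    # Read through the s3a connector and force an action so the
    # request to S3 is actually issued.
    lines = spark_context.textFile("s3a://my-bucket/some/prefix/data.txt")
    print(lines.count())

One thing worth checking: the fs.s3a.* keys only apply to s3a:// URLs. Paths using the s3:// or s3n:// schemes read their credentials from fs.s3.* or fs.s3n.* keys instead, so it's worth making sure the URL scheme matches the configuration keys being set.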

(python) Spark .textFile(s3://…) access denied 403 with valid credentials

2017-03-07 Thread Jonhy Stack
In order to access my S3 bucket I have exported my creds:

    export AWS_SECRET_ACCESS_KEY=
    export AWS_ACCESS_KEY_ID=

I can verify that everything works by doing

    aws s3 ls mybucket

I can also verify with boto3 that it works in Python:

    resource = boto3.resource("s3",
                              aws_access_key_id="ACCESSKEY",
                              aws_secret_access_key="SECRETKEY")
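For what it's worth, with AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY exported, boto3 picks the credentials up from the environment without them being passed explicitly, so the same check can be written as follows (a minimal sketch; "mybucket" is the bucket from the aws s3 ls test above):

    import boto3

    # boto3 reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the
    # environment automatically, so no explicit credentials are needed.
    resource = boto3.resource("s3")
    bucket = resource.Bucket("mybucket")

    # List a few keys to confirm the credentials really grant access.
    for obj in bucket.objects.limit(5):
        print(obj.key)

Note that, depending on the Hadoop version, the S3A connector may not consult these environment variables at all, and the driver's shell environment is not propagated to executors in any case; that is why the fs.s3a.* keys are set explicitly in the follow-up message above.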