Hi, recompiled and retried, now it's looking like this with s3a:
com.amazonaws.AmazonClientException: Unable to load AWS credentials
from any provider in the chain
S3n is working fine (the only remaining problem is still the endpoint).
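A possible cause worth checking (an assumption on my part, not confirmed in this thread): in hadoop-aws 2.6.x the s3a connector reads its credentials from fs.s3a.access.key / fs.s3a.secret.key, which are different property names than the s3n-style fs.s3n.awsAccessKeyId / fs.s3n.awsSecretAccessKey. A minimal sketch, with placeholder values:

```scala
// Sketch, assuming Hadoop 2.6.x s3a property names; key values are placeholders.
sc.hadoopConfiguration.set("fs.s3a.access.key", "YOUR_ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "YOUR_SECRET_KEY")
```

If the s3n-style names were used for s3a, the credential provider chain would find nothing, which matches the exception above.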
at 1:54 PM, Schmirr Wurst schmirrwu...@gmail.com
wrote:
...@sigmoidanalytics.com:
So you are able to access your AWS S3 with s3a now? What is the error that
you are getting when you try to access the custom storage with
fs.s3a.endpoint?
Thanks
Best Regards
On Mon, Jul 27, 2015 at 2:44 PM, Schmirr Wurst schmirrwu...@gmail.com
wrote:
I was able to access
(fs.s3a.awsSecretAccessKey,)
Any idea why it doesn't work?
2015-07-20 18:11 GMT+02:00 Schmirr Wurst schmirrwu...@gmail.com:
Thanks, that is what I was looking for...
Any idea where I have to store and reference the corresponding
hadoop-aws-2.6.0.jar?:
java.io.IOException: No FileSystem for scheme: s3n
your spark job add --jars path/to/thejar
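For reference, the --jars suggestion above would look something like this (the jar locations are hypothetical placeholders, and the AWS SDK version is an assumption based on what hadoop-aws 2.6.0 typically depends on):

```shell
# Sketch: make the s3a/s3n filesystem classes visible to the driver and executors.
# Paths and the SDK version are placeholders, not taken from the thread.
spark-shell --jars /path/to/hadoop-aws-2.6.0.jar,/path/to/aws-java-sdk-1.7.4.jar
```

The same --jars flag works with spark-submit; without the AWS SDK jar on the classpath, the s3a classes fail to load.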
From: Schmirr Wurst schmirrwu...@gmail.com
Sent: Wednesday, July 22, 2015 12:06 PM
To: Thomas Demoor
Subject: Re: use S3-Compatible Storage with spark
Hi Thomas, thanks, could you just tell me what exactly I
it's more like an issue with Hadoop.
Thanks
Best Regards
On Tue, Jul 21, 2015 at 2:31 PM, Schmirr Wurst schmirrwu...@gmail.com
wrote:
It seems to work for the credentials, but the endpoint is ignored...:
I've changed it to
sc.hadoopConfiguration.set("fs.s3n.endpoint", "test.com")
And I continue
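One plausible explanation for the endpoint being ignored (my assumption, not stated in the thread): Hadoop's s3n connector is backed by the JetS3t library, which reads its endpoint from a jets3t.properties file on the classpath rather than from a fs.s3n.* Hadoop property. A sketch, assuming JetS3t's standard property names:

```properties
# jets3t.properties (place on the driver/executor classpath)
# Endpoint value reuses the placeholder from the message above.
s3service.s3-endpoint=test.com
s3service.https-only=false
```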
...@sigmoidanalytics.com:
You can add the jar in the classpath, and you can set the property like:
sc.hadoopConfiguration.set("fs.s3a.endpoint", "storage.sigmoid.com")
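Put together, a sketch of the s3a setup being discussed (property names assumed from hadoop-aws; the endpoint value is the thread's placeholder, and the bucket name and keys are hypothetical):

```scala
// Sketch: point s3a at a non-AWS, S3-compatible endpoint.
sc.hadoopConfiguration.set("fs.s3a.endpoint", "storage.sigmoid.com")
sc.hadoopConfiguration.set("fs.s3a.access.key", "YOUR_ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "YOUR_SECRET_KEY")
// Newer hadoop-aws versions also offer fs.s3a.path.style.access for
// endpoints that do not support virtual-host-style bucket addressing.
val rdd = sc.textFile("s3a://my-bucket/some/path") // bucket is hypothetical
```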
Thanks
Best Regards
On Mon, Jul 20, 2015 at 9:41 PM, Schmirr Wurst schmirrwu...@gmail.com
wrote:
Thanks
Best Regards
On Sun, Jul 19, 2015 at 9:13 PM, Schmirr Wurst schmirrwu...@gmail.com
wrote:
I want to use Pithos, where can I specify that endpoint, is it
possible in the URL?
2015-07-19 17:22 GMT+02:00 Akhil Das ak...@sigmoidanalytics.com:
Could you name the Storage service that you
On Fri, Jul 17, 2015 at 2:06 PM, Schmirr Wurst schmirrwu...@gmail.com
wrote:
Hi,
I wonder how to use S3-compatible storage in Spark?
If I'm using the s3n:// URL scheme, then it will point to Amazon; is there
a way I can specify the host somewhere?
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org