Which version do you have?

- I tried Spark 1.4.1 for Hadoop 2.6, but there the aws module is
somehow missing:
java.io.IOException: No FileSystem for scheme: s3n
The same for s3a:
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class
org.apache.hadoop.fs.s3a.S3AFileSystem not found

- On Spark 1.4.1 for Hadoop 2.4, the module is there and s3n works out
of the box (except for the endpoint), but I get
"java.io.IOException: No FileSystem for scheme: s3a"
(what I run is sketched below)
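
For reference, a minimal sketch of what I run in spark-shell. The jar
versions, endpoint host, bucket and keys below are placeholders and my
own assumptions, not something I have confirmed anywhere:

// assumes the shell was started with the aws jars on the classpath, e.g.:
//   spark-shell --jars hadoop-aws-2.6.0.jar,aws-java-sdk-1.7.4.jar
sc.hadoopConfiguration.set("fs.s3a.impl",
  "org.apache.hadoop.fs.s3a.S3AFileSystem")  // register s3a explicitly
sc.hadoopConfiguration.set("fs.s3a.endpoint", "storage.example.com")
sc.hadoopConfiguration.set("fs.s3a.access.key", "MY_ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "MY_SECRET_KEY")
val lines = sc.textFile("s3a://some-bucket/some-file.txt")
lines.count()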

:-|

2015-07-21 11:09 GMT+02:00 Akhil Das <ak...@sigmoidanalytics.com>:
> Did you try with s3a? It seems it's more like an issue with Hadoop.
>
> Thanks
> Best Regards
>
> On Tue, Jul 21, 2015 at 2:31 PM, Schmirr Wurst <schmirrwu...@gmail.com>
> wrote:
>>
>> It seems to work for the credentials, but the endpoint is ignored:
>> I've changed it to
>> sc.hadoopConfiguration.set("fs.s3n.endpoint","test.com")
>>
>> and I still get my data from Amazon. How can that be? (I also use
>> s3n in my text URL.)
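>>
>> From what I've read, plain s3n is backed by the JetS3t library, so
>> the endpoint may have to go into a jets3t.properties file on the
>> classpath instead of the Hadoop configuration. A sketch, not yet
>> verified on my side:
>>
>> s3service.s3-endpoint=test.com
>> s3service.https-only=false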
>>
>> 2015-07-21 9:30 GMT+02:00 Akhil Das <ak...@sigmoidanalytics.com>:
>> > You can add the jar to the classpath, and you can set the property like:
>> >
>> > sc.hadoopConfiguration.set("fs.s3a.endpoint","storage.sigmoid.com")
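>> >
>> > With that set, reads should then go against your endpoint, e.g.
>> > (the bucket and file here are just placeholders):
>> >
>> > val data = sc.textFile("s3a://some-bucket/some-file.txt")
>> > data.take(5).foreach(println)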
>> >
>> >
>> >
>> > Thanks
>> > Best Regards
>> >
>> > On Mon, Jul 20, 2015 at 9:41 PM, Schmirr Wurst <schmirrwu...@gmail.com>
>> > wrote:
>> >>
>> >> Thanks, that is what I was looking for...
>> >>
>> >> Any idea where I have to store and reference the corresponding
>> >> hadoop-aws-2.6.0.jar? I still get:
>> >>
>> >> java.io.IOException: No FileSystem for scheme: s3n
>> >>
>> >> 2015-07-20 8:33 GMT+02:00 Akhil Das <ak...@sigmoidanalytics.com>:
>> >> > Not in the URI, but you can specify it in the Hadoop configuration:
>> >> >
>> >> > <property>
>> >> >   <name>fs.s3a.endpoint</name>
>> >> >   <description>AWS S3 endpoint to connect to. An up-to-date list is
>> >> >     provided in the AWS Documentation: regions and endpoints.
>> >> >     Without this property, the standard region (s3.amazonaws.com)
>> >> >     is assumed.
>> >> >   </description>
>> >> > </property>
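>> >> >
>> >> > For a non-AWS store you would also give it a value, e.g. in
>> >> > core-site.xml (the host here is just a placeholder):
>> >> >
>> >> > <property>
>> >> >   <name>fs.s3a.endpoint</name>
>> >> >   <value>storage.example.com</value>
>> >> > </property>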
>> >> >
>> >> >
>> >> > Thanks
>> >> > Best Regards
>> >> >
>> >> > On Sun, Jul 19, 2015 at 9:13 PM, Schmirr Wurst
>> >> > <schmirrwu...@gmail.com>
>> >> > wrote:
>> >> >>
>> >> >> I want to use Pithos. Where can I specify that endpoint? Is it
>> >> >> possible in the URL?
>> >> >>
>> >> >> 2015-07-19 17:22 GMT+02:00 Akhil Das <ak...@sigmoidanalytics.com>:
>> >> >> > Could you name the storage service that you are using? Most of
>> >> >> > them provide an S3-like REST API endpoint for you to hit.
>> >> >> >
>> >> >> > Thanks
>> >> >> > Best Regards
>> >> >> >
>> >> >> > On Fri, Jul 17, 2015 at 2:06 PM, Schmirr Wurst
>> >> >> > <schmirrwu...@gmail.com>
>> >> >> > wrote:
>> >> >> >>
>> >> >> >> Hi,
>> >> >> >>
>> >> >> >> I wonder how to use S3-compatible storage in Spark.
>> >> >> >> If I'm using the s3n:// URL scheme, it will point to Amazon; is
>> >> >> >> there a way I can specify the host somewhere?
>> >> >> >>
>> >> >> >
>> >> >>
>> >> >
>> >
>> >
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
