Hi,

Try starting your clusters with IAM roles, and you will not have to configure
or hard-code any credentials at all.

Let me know if you need any help with this.
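For what it is worth, the "Invalid hostname in URI" message in the original
post is consistent with the shape of the URI itself: in
s3n://KEY:SECRET@/yasemindeneme/deneme.txt there is a "/" immediately after
the "@", so the host (bucket) part is empty, and a secret key containing "/"
truncates the authority in the same way. Hadoop parses the URI with Java's
URI class, not Python, but the failure mode can be illustrated with Python's
urllib.parse (MYKEY/MYSECRET below are placeholders, not real credentials):

```python
from urllib.parse import quote, urlparse

# The URI from the error has a "/" right after the "@", so the host
# (the bucket name) is empty -- hence "Invalid hostname in URI".
broken = "s3n://MYKEY:MYSECRET@/yasemindeneme/deneme.txt"
print(urlparse(broken).hostname)   # None: no host between "@" and "/"

# The bucket name must come directly after the "@", with no slash:
fixed = "s3n://MYKEY:MYSECRET@yasemindeneme/deneme.txt"
print(urlparse(fixed).hostname)    # yasemindeneme

# A secret containing "/" cuts the authority short, so the parsed
# "host" is a fragment of the key, not the bucket:
slashy = "s3n://MYKEY:abc/def@yasemindeneme/deneme.txt"
print(urlparse(slashy).hostname)   # mykey -- part of the key, not the bucket

# Percent-encoding the secret keeps the authority intact:
encoded = "s3n://MYKEY:" + quote("abc/def", safe="") + "@yasemindeneme/deneme.txt"
print(urlparse(encoded).hostname)  # yasemindeneme
```

The cleaner route, rather than embedding keys in the URI at all, is to set
fs.s3n.awsAccessKeyId and fs.s3n.awsSecretAccessKey on the Hadoop
configuration and use a plain s3n://yasemindeneme/deneme.txt path.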


Regards,
Gourav Sengupta

On Tue, Mar 15, 2016 at 11:32 AM, Yasemin Kaya <godo...@gmail.com> wrote:

> Hi Safak,
>
> I changed the keys, but the error is unchanged.
>
> Best,
> yasemin
>
>
> 2016-03-15 12:46 GMT+02:00 Şafak Serdar Kapçı <sska...@gmail.com>:
>
>> Hello Yasemin,
>> Maybe your access key ID or secret access key contains special characters,
>> such as a slash or a backslash. If so, you need to regenerate or escape them.
>> Best Regards,
>> Safak.
>>
>> 2016-03-15 12:33 GMT+02:00 Yasemin Kaya <godo...@gmail.com>:
>>
>>> Hi,
>>>
>>> I am using Spark 1.6.0 standalone, and I want to read a text file named
>>> deneme.txt from an S3 bucket named yasemindeneme, but I am getting the
>>> error below. Here is the simple code
>>> <https://gist.github.com/anonymous/6d174f8587f0f3fd2334>
>>> Exception in thread "main" java.lang.IllegalArgumentException: Invalid
>>> hostname in URI s3n://AWS_ACCESS_KEY_ID:AWS_SECRET_ACCESS_KEY@
>>> /yasemindeneme/deneme.txt
>>> at
>>> org.apache.hadoop.fs.s3.S3Credentials.initialize(S3Credentials.java:45)
>>> at
>>> org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.initialize(Jets3tNativeFileSystemStore.java:55)
>>>
>>>
>>> I tried two options:
>>> *sc.hadoopConfiguration() *and
>>> *sc.textFile("s3n://AWS_ACCESS_KEY_ID:AWS_SECRET_ACCESS_KEY@/yasemindeneme/deneme.txt/");*
>>>
>>> I also did export AWS_ACCESS_KEY_ID= .....
>>>  export AWS_SECRET_ACCESS_KEY=
>>> But the error did not change.
>>>
>>> Could you please help me with this issue?
>>>
>>>
>>> --
>>> hiç ender hiç
>>>
>>
>>
>
>
> --
> hiç ender hiç
>
