Please see this related thread:

http://search-hadoop.com/m/q3RTtSYa3F1OT6H&subj=DirectFileOutputCommiter
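If speculative execution is enabled, multiple task attempts can write to the same S3 output path, and the committer's rename step can then fail. A minimal sketch of disabling it, assuming a standard SparkConf setup (the app name and credential variables below are illustrative, following the config in the original post):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class S3ModelSave {
    public static void main(String[] args) {
        // Disable speculative execution so only one task attempt
        // writes each output file to S3.
        SparkConf conf = new SparkConf()
            .setAppName("s3-model-save")        // illustrative name
            .set("spark.speculation", "false");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // s3n credentials, as in the original post; AWS_ACCESS_KEY_ID and
        // AWS_SECRET_ACCESS_KEY are placeholders for your own values.
        String AWS_ACCESS_KEY_ID = "...";
        String AWS_SECRET_ACCESS_KEY = "...";
        sc.hadoopConfiguration().set("fs.s3n.awsAccessKeyId", AWS_ACCESS_KEY_ID);
        sc.hadoopConfiguration().set("fs.s3n.awsSecretAccessKey", AWS_SECRET_ACCESS_KEY);
    }
}
```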

On Mon, Mar 21, 2016 at 7:45 AM, Yasemin Kaya <godo...@gmail.com> wrote:

> Hi Ted,
>
> I'm not sure what you're asking. Could you be more specific, please?
>
>
> 2016-03-21 15:24 GMT+02:00 Ted Yu <yuzhih...@gmail.com>:
>
>> Was speculative execution enabled ?
>>
>> Thanks
>>
>> On Mar 21, 2016, at 6:19 AM, Yasemin Kaya <godo...@gmail.com> wrote:
>>
>> Hi,
>>
>> I am reading data from S3 and I also want to save my model to S3. The
>> reading part works fine, but when I save the model I get this error
>> <https://gist.github.com/yaseminn/a22808a9a69a95fbf741>. I tried
>> switching from s3n to s3a, but that only produced different errors.
>>
>> *reading path*
>> s3n://tani-online/weblog/
>>
>> *model saving path*
>> s3n://tani-online/model/
>>
>> *configuration*
>>
>> sc.hadoopConfiguration().set("fs.s3.impl","org.apache.hadoop.fs.s3native.NativeS3FileSystem");
>> sc.hadoopConfiguration().set("fs.s3n.awsAccessKeyId", AWS_ACCESS_KEY_ID);
>> sc.hadoopConfiguration().set("fs.s3n.awsSecretAccessKey",
>> AWS_SECRET_ACCESS_KEY);
>>
>>
>> ps: I am using spark-1.6.0-bin-hadoop2.4
>>
>> Best,
>> yasemin
>>
>> --
>> hiç ender hiç
>>
>>
>
>
> --
> hiç ender hiç
>
