Hi Vinay,

Flink's file systems are self-contained and won't respect the core-site.xml,
if I'm not mistaken. Instead, you have to set the credentials in the Flink
configuration (flink-conf.yaml) via `fs.s3a.access.key: access_key`,
`fs.s3a.secret.key: secret_key` and so on [1]. Have you tried this out?
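For illustration, a minimal sketch of the relevant flink-conf.yaml entries
(the values are placeholders; other fs.s3a.* options should be forwarded the
same way):

  fs.s3a.access.key: <your-access-key>
  fs.s3a.secret.key: <your-secret-key>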

This has been fixed in Flink 1.6.2 and 1.7.0 [2].

[1]
https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems.html#built-in-file-systems
[2] https://issues.apache.org/jira/browse/FLINK-10383

Cheers,
Till

On Wed, Jan 16, 2019 at 10:10 AM Kostas Kloudas <k.klou...@da-platform.com>
wrote:

> Hi Taher,
>
> So you are using the same configuration files and everything, and the only
> thing you change is "s3://" to "s3a://", and the sink cannot find the
> credentials?
> Could you please provide the logs of the Task Managers?
>
> Cheers,
> Kostas
>
> On Wed, Jan 16, 2019 at 9:13 AM Dawid Wysakowicz <dwysakow...@apache.org>
> wrote:
>
>> Forgot to cc ;)
>> On 16/01/2019 08:51, Vinay Patil wrote:
>>
>> Hi,
>>
>> Can someone please help with this issue? We have even tried to set
>> fs.s3a.impl in core-site.xml, but it is still not working.
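>>
>> For reference, a minimal sketch of the core-site.xml property in question
>> (assuming the standard Hadoop S3A implementation class):
>>
>>   <property>
>>     <name>fs.s3a.impl</name>
>>     <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
>>   </property>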
>>
>> Regards,
>> Vinay Patil
>>
>>
>> On Fri, Jan 11, 2019 at 5:03 PM Taher Koitawala [via Apache Flink User
>> Mailing List archive.] <ml+s2336050n25464...@n4.nabble.com> wrote:
>>
>>> Hi All,
>>>          We have implemented an S3 sink in the following way:
>>>
>>> StreamingFileSink sink = StreamingFileSink
>>>     .forBulkFormat(
>>>         new Path("s3a://mybucket/myfolder/output/"),
>>>         ParquetAvroWriters.forGenericRecord(schema))
>>>     .withBucketCheckInterval(50L)
>>>     .withBucketAssigner(new CustomBucketAssigner())
>>>     .build();
>>>
>>> The problem we are facing is that StreamingFileSink initializes the
>>> S3AFileSystem class to write to S3 but cannot find the S3 credentials
>>> to write data. However, other Flink applications on the same cluster
>>> that use "s3://" paths are able to write data to the same S3 bucket and
>>> folders; we are only facing this issue with StreamingFileSink.
>>>
>>> Regards,
>>> Taher Koitawala
>>> GS Lab Pune
>>> +91 8407979163
>>>
>>>
>>
