You should be able to follow this:

http://mail-archives.apache.org/mod_mbox/drill-user/201512.mbox/%3CCAAL5oQJQRgqO5LjhG_=YFLyHuZUNqEvm3VX3C=2d9uxnbto...@mail.gmail.com%3E

It's similar to the AWS S3 config
(https://ci.apache.org/projects/flink/flink-docs-master/setup/aws.html).

Add the Azure JARs to Flink (drop them into the lib folder), configure
the fs.hdfs.hadoopconf key to point to your Hadoop config directory,
and update core-site.xml as in the mailing list thread.
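
Roughly, the pieces could look like this (untested sketch; the account
name, key and paths are placeholders, and the exact JAR names depend on
your Hadoop version, typically hadoop-azure-*.jar plus the azure-storage
SDK JAR it depends on).

In flink-conf.yaml:

    fs.hdfs.hadoopconf: /path/to/hadoop/conf

In core-site.xml (the credential property the hadoop-azure module expects):

    <configuration>
      <property>
        <name>fs.azure.account.key.YOUR_ACCOUNT.blob.core.windows.net</name>
        <value>YOUR_STORAGE_ACCESS_KEY</value>
      </property>
    </configuration>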

Then you should be able to access your data via azure://...
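
For example, with the DataSet API (untested sketch; the path is a
placeholder, and the exact URI scheme depends on which filesystem
implementation your core-site.xml registers):

    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;

    public class AzureBlobReadSketch {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // Placeholder URI: scheme, container and path depend on your
            // Azure filesystem configuration.
            DataSet<String> lines = env.readTextFile("azure://my-container/path/to/data.txt");

            // print() triggers execution and shows a few records.
            lines.first(10).print();
        }
    }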

Would appreciate some feedback on whether this works as expected.


On Tue, Aug 16, 2016 at 2:37 PM, MIkkel Islay <my.inputstr...@gmail.com> wrote:
> Hello,
>
> I would like to access data in Azure blob storage from Flink, via the Azure
> storage HDFS-compatibility interface.
> That is feasible from Apache Drill, and I am thinking something similar
> should be doable from Flink. A documentation page on external storage
> connectors for Flink exists, but it was written pre 1.0.
> Does anyone have experience with setting up an Azure blob connector?
>
> Mikkel
