Hi Steve,
I don't think I fully understand your answer; please pardon my naivety on the
subject. From what I understand, the actual read happens in the executors, so
the executors need access to the Data Lake. Given that, how do I make sure that
I can programmatically pass the Azure credentials through to the executors?
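For illustration, here is a minimal sketch (Scala) of the pattern being asked
about, under a couple of assumptions: fetchAdlsCredentials is a hypothetical
stand-in for the credential REST API described in the original question, and
the fs.adl.oauth2.* keys are the ones used by the hadoop-azure-datalake adl://
connector (older releases used dfs.adls.oauth2.* instead). The idea is to fetch
the credentials in the driver and push them into the Hadoop configuration
before triggering the read, so the executors' filesystem instances can
authenticate.

    // Sketch only; not from this thread. Assumes the adl:// connector.
    import org.apache.spark.sql.SparkSession

    case class AdlsCredentials(clientId: String, clientSecret: String, refreshUrl: String)

    // Hypothetical placeholder for the credential REST API (driver-side call).
    def fetchAdlsCredentials(dataset: String): AdlsCredentials = {
      ???  // call the REST API here and parse the response
    }

    val spark = SparkSession.builder().appName("adls-auth-sketch").getOrCreate()

    val creds = fetchAdlsCredentials("my-dataset")  // runs in the driver

    // The Hadoop configuration is shipped to the executors with each job, so
    // their FileSystem instances pick these values up when opening adl:// paths.
    val hadoopConf = spark.sparkContext.hadoopConfiguration
    hadoopConf.set("fs.adl.oauth2.access.token.provider.type", "ClientCredential")
    hadoopConf.set("fs.adl.oauth2.client.id", creds.clientId)
    hadoopConf.set("fs.adl.oauth2.credential", creds.clientSecret)
    hadoopConf.set("fs.adl.oauth2.refresh.url", creds.refreshUrl)

    val df = spark.read.parquet("adl://youraccount.azuredatalakestore.net/path/to/data")
    df.show()

The catch, as noted further down the thread, is that these settings are global
to the application, so this only works cleanly when every dataset read in the
job uses the same credentials.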
This might help; I’ve built a REST API with the Livy server:
https://livy.incubator.apache.org/
From: Steve Loughran
Date: Saturday, August 19, 2017 at 7:05 AM
To: Imtiaz Ahmed
Cc: "user@spark.apache.org"
Subject: Re: How to authenticate to ADLS from within spark job on the fly
On 19 Aug 2017, at 02:42, Imtiaz Ahmed <emtiazah...@gmail.com> wrote:
> Hi All,
>
> I am building a Spark library which developers will use when writing their
> Spark jobs to get access to data on Azure Data Lake. But the authentication
> will depend on the dataset they ask for. I need to call [...]
It may not be as easy as you think. The REST call will happen in the driver,
but the reads will happen in the executors.
On Sat, 19 Aug 2017 at 11:42 am, Imtiaz Ahmed wrote:
> Hi All,
>
> I am building a Spark library which developers will use when writing their
> Spark jobs to get access to data on Azure Data Lake. [...]
Hi All,
I am building a Spark library which developers will use when writing their
Spark jobs to get access to data on Azure Data Lake. But the authentication
will depend on the dataset they ask for. I need to call a REST API from within
the Spark job to get credentials and authenticate to read data from ADLS.
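Since the credentials are meant to vary per dataset, a variant worth sketching
is to fetch them in the driver and attach them to each individual read as
DataFrameReader options; for Spark's built-in file sources those options are
folded into the Hadoop configuration used for that particular scan, so they
reach the executors without touching the global configuration. This is only a
sketch under those assumptions: fetchCredentialsFor is a hypothetical stand-in
for the credential REST API, and the fs.adl.oauth2.* keys again assume the
adl:// connector.

    // Sketch only; per-dataset credentials passed as options on the read itself.
    import org.apache.spark.sql.{DataFrame, SparkSession}

    case class AdlsCredentials(clientId: String, clientSecret: String, refreshUrl: String)

    // Hypothetical placeholder for the credential REST API (driver-side call).
    def fetchCredentialsFor(dataset: String): AdlsCredentials = {
      ???  // call the REST API here and parse the response
    }

    def readDataset(spark: SparkSession, dataset: String, path: String): DataFrame = {
      val creds = fetchCredentialsFor(dataset)  // driver-side, before the read is planned
      spark.read
        // For built-in file sources these options are merged into the Hadoop
        // configuration used for this scan, so the executors see them as well.
        .option("fs.adl.oauth2.access.token.provider.type", "ClientCredential")
        .option("fs.adl.oauth2.client.id", creds.clientId)
        .option("fs.adl.oauth2.credential", creds.clientSecret)
        .option("fs.adl.oauth2.refresh.url", creds.refreshUrl)
        .parquet(path)
    }

    // e.g. val sales = readDataset(spark, "sales", "adl://youraccount.azuredatalakestore.net/data/sales")

One caveat: Hadoop caches FileSystem instances per scheme, authority and user,
so an adl:// filesystem created earlier with different credentials may be
reused; setting fs.adl.impl.disable.cache=true is the usual, if costly,
workaround when that matters.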