Re: How to submit a job via REST API?

2020-11-24 Thread vaquar khan
Hi Yang,

Please find the following link:

https://stackoverflow.com/questions/63677736/spark-application-as-a-rest-service/63678337#63678337
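
Dennis mentions Livy in the quoted thread below; for illustration, a minimal
sketch of submitting a batch through Livy's REST API could look like the
following (the Livy host, jar path, and class name are hypothetical
placeholders), with Spark settings passed via the "conf" field:

import requests

livy_url = "http://livy-host:8998/batches"    # hypothetical Livy endpoint

payload = {
    "file": "hdfs:///jobs/my-app.jar",        # application jar (placeholder)
    "className": "com.example.MyApp",         # main class (placeholder)
    "args": ["2020-11-24"],                   # arguments passed to main()
    # equivalent of spark-submit --conf key=value pairs:
    "conf": {
        "spark.executor.memory": "2g",
        "spark.executor.cores": "2",
    },
}

resp = requests.post(livy_url, json=payload)
print(resp.status_code, resp.json())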

Regards,
Vaquar khan

On Wed, Nov 25, 2020 at 12:40 AM Sonal Goyal  wrote:

> You should be able to supply --conf and its values as part of the appArgs
> argument.
>
> Cheers,
> Sonal
> Nube Technologies 
> Join me at
> Data Con LA Oct 23 | Big Data Conference Europe. Nov 24 | GIDS AI/ML Dec 3
>
>
>
>
> On Tue, Nov 24, 2020 at 11:31 AM Dennis Suhari 
> wrote:
>
>> Hi Yang,
>>
>> I am using Livy Server for submitting jobs.
>>
>> Br,
>>
>> Dennis
>>
>>
>>
>> Sent from my iPhone
>>
>> On 24.11.2020 at 03:34, Zhou Yang wrote:
>>
>>
>> Dear experts,
>>
>> I found a convenient way to submit a job via the REST API at
>> https://gist.github.com/arturmkrtchyan/5d8559b2911ac951d34a#file-submit_job-sh.
>> But I do not know whether I can append the `--conf` parameter like I do with
>> spark-submit. Can someone help me with this issue?
>>
>> *Regards, Yang*
>>
>>

-- 
Regards,
Vaquar Khan
+1 -224-436-0783
Greater Chicago


Re: How to submit a job via REST API?

2020-11-24 Thread Sonal Goyal
You should be able to supply --conf and its values as part of the appArgs
argument.
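
For illustration, here is a minimal sketch of the CreateSubmissionRequest
payload accepted by the standalone master's REST endpoint (the shape follows
the gist linked in Yang's message below). spark-submit-style --conf settings
can also be expressed in the sparkProperties map, while appArgs carries the
application's own arguments; hostnames, paths, class name, and versions are
placeholders:

import requests

submit_url = "http://spark-master:6066/v1/submissions/create"  # placeholder host

payload = {
    "action": "CreateSubmissionRequest",
    "appResource": "hdfs:///jobs/my-app.jar",     # application jar (placeholder)
    "mainClass": "com.example.MyApp",             # entry point (placeholder)
    "appArgs": ["2020-11-24"],                    # arguments passed to main()
    "clientSparkVersion": "2.4.7",                # match your cluster version
    "environmentVariables": {"SPARK_ENV_LOADED": "1"},
    # equivalent of spark-submit --conf key=value pairs:
    "sparkProperties": {
        "spark.app.name": "my-app",
        "spark.master": "spark://spark-master:7077",
        "spark.executor.memory": "2g",
        "spark.driver.supervise": "false",
    },
}

resp = requests.post(submit_url, json=payload)
print(resp.status_code, resp.json())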

Cheers,
Sonal
Nube Technologies 
Join me at
Data Con LA Oct 23 | Big Data Conference Europe. Nov 24 | GIDS AI/ML Dec 3




On Tue, Nov 24, 2020 at 11:31 AM Dennis Suhari 
wrote:

> Hi Yang,
>
> I am using Livy Server for submitting jobs.
>
> Br,
>
> Dennis
>
>
>
> Sent from my iPhone
>
> On 24.11.2020 at 03:34, Zhou Yang wrote:
>
>
> Dear experts,
>
> I found a convenient way to submit a job via the REST API at
> https://gist.github.com/arturmkrtchyan/5d8559b2911ac951d34a#file-submit_job-sh.
> But I do not know whether I can append the `--conf` parameter like I do with
> spark-submit. Can someone help me with this issue?
>
> *Regards, Yang*
>
>


Re: How to apply ranger policies on Spark

2020-11-24 Thread joyan sil
Thanks Ayan and Dennis,

@Ayan, if I use Ranger to manage HDFS ACLs, as you mentioned, it will give
coarse-grained control over files. I may have a few fine-grained use cases at
the row/column level. I was going through the JIRAs below and wondering whether
anyone has used them, and whether any user documentation for them exists in the
Spark community.

https://issues.apache.org/jira/browse/RANGER-2128
https://issues.apache.org/jira/browse/SUBMARINE-409
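
As an aside, Dennis's suggestion in the quoted thread below (point Spark at the
Hive metastore and use HiveContext, or its newer equivalent) might look roughly
like this in PySpark; the metastore URI and table name are placeholders:

from pyspark.sql import SparkSession

# Point Spark at the shared Hive metastore and enable Hive support
# (enableHiveSupport replaces the older HiveContext API).
# The metastore URI and table name below are placeholders.
spark = (SparkSession.builder
         .appName("hive-metastore-example")
         .config("hive.metastore.uris", "thrift://metastore-host:9083")
         .enableHiveSupport()
         .getOrCreate())

# Tables registered in the Hive metastore are now visible to Spark SQL.
spark.sql("SELECT * FROM mydb.mytable LIMIT 10").show()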

Regards
Joyan

On Tue, Nov 24, 2020 at 1:40 PM ayan guha  wrote:

> AFAIK, Ranger secures the Hive (JDBC) server only. Unfortunately, Spark does
> not interact with HiveServer2 but talks directly to the metastore. Hence, the
> only way to use Ranger policies is to access Hive via JDBC. Another option is
> HDFS or storage ACLs, which give coarse-grained control over file paths, etc.
> You can use Ranger to manage HDFS ACLs as well; in that scenario, Spark will
> be bound by those policies.
>
> On Tue, Nov 24, 2020 at 5:26 PM Dennis Suhari 
> wrote:
>
>> Hi Joyan,
>>
>> Spark uses its own metastore. To use Ranger, you need to use the Hive
>> metastore. For this, you need to point Spark to the Hive metastore and use
>> HiveContext in your Spark code.
>>
>> Br,
>>
>> Dennis
>>
>> Sent from my iPhone
>>
>> On 23.11.2020 at 19:04, joyan sil wrote:
>>
>>
>>
>> Hi,
>>
>> We have Ranger policies defined on the Hive table, and authorization works
>> as expected when we use the Hive CLI and Beeline. But when we access those
>> Hive tables using spark-shell or spark-submit, it does not work.
>>
>>  Any suggestions to make Ranger work with Spark?
>>
>>
>> Regards
>>
>> Joyan
>>
>>
>
> --
> Best Regards,
> Ayan Guha
>


Re: How to apply ranger policies on Spark

2020-11-24 Thread ayan guha
AFAIK, Ranger secures the Hive (JDBC) server only. Unfortunately, Spark does
not interact with HiveServer2 but talks directly to the metastore. Hence, the
only way to use Ranger policies is to access Hive via JDBC. Another option is
HDFS or storage ACLs, which give coarse-grained control over file paths, etc.
You can use Ranger to manage HDFS ACLs as well; in that scenario, Spark will be
bound by those policies.
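
To make the JDBC route concrete, here is a rough sketch of reading a Hive table
through HiveServer2 from Spark so that Ranger's Hive policies are enforced; the
host, credentials, and table name are placeholders, and the Hive JDBC driver
jar must be on the Spark classpath:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hs2-jdbc-read").getOrCreate()

# Read a Hive table through HiveServer2 over JDBC so Ranger's Hive policies
# apply. Host, credentials, and table name are placeholders.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:hive2://hiveserver2-host:10000/default")
      .option("driver", "org.apache.hive.jdbc.HiveDriver")
      .option("dbtable", "mydb.mytable")
      .option("user", "joyan")      # the identity Ranger authorizes against
      .option("password", "")       # or use Kerberos in a secured cluster
      .load())

df.show(10)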

On Tue, Nov 24, 2020 at 5:26 PM Dennis Suhari 
wrote:

> Hi Joyan,
>
> Spark uses its own metastore. To use Ranger, you need to use the Hive
> metastore. For this, you need to point Spark to the Hive metastore and use
> HiveContext in your Spark code.
>
> Br,
>
> Dennis
>
> Sent from my iPhone
>
> On 23.11.2020 at 19:04, joyan sil wrote:
>
>
>
> Hi,
>
> We have Ranger policies defined on the Hive table, and authorization works
> as expected when we use the Hive CLI and Beeline. But when we access those
> Hive tables using spark-shell or spark-submit, it does not work.
>
>  Any suggestions to make Ranger work with Spark?
>
>
> Regards
>
> Joyan
>
>

-- 
Best Regards,
Ayan Guha