Hi folks,

I have a Spark application executing various jobs for different users
simultaneously, via several Spark sessions on several threads.

My customer would like to Kerberize his Hadoop cluster. I wonder whether
there is a way to configure impersonation so that each of these jobs runs
as a different proxy user. From what I can see in the Spark configuration
and code, it's not possible to do that at runtime for a specific context,
but I'm not familiar with Kerberos, nor with this part of Spark.
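For context, the only impersonation mechanism I've found so far is the submit-time `--proxy-user` flag, which fixes the proxy user for the whole application rather than per job or per session. A minimal sketch of that usage (assuming a principal `spark-svc` with a keytab, and a hypothetical user `alice`; these names are placeholders, not from any real deployment):

```shell
# Authenticate the service principal that is allowed to impersonate others
# (requires hadoop.proxyuser.spark-svc.* settings in core-site.xml).
kinit -kt /etc/security/keytabs/spark-svc.keytab spark-svc@EXAMPLE.COM

# Submit the application as proxy user "alice".
# Note: --proxy-user applies to the entire application, so running jobs
# for several users would mean one spark-submit per user, not one shared
# application with multiple sessions.
spark-submit \
  --master yarn \
  --proxy-user alice \
  --class com.example.MyJob \
  my-job.jar
```

If that understanding is right, it would explain why per-context impersonation at runtime doesn't seem to be supported, but I'd be glad to be corrected.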

Can anyone confirm or refute this?

Mathieu

(also asked on Stack Overflow:
http://stackoverflow.com/questions/43765044/kerberos-impersonation-of-a-spark-context-at-runtime)



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Kerberos-impersonation-of-a-Spark-Context-at-runtime-tp28651.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
