You can try the "--proxy-user" command line argument for spark-submit.
That requires that your Hadoop configuration (the hadoop.proxyuser.*
settings) allows the user running your AM to "proxy" other users. And
I'm not completely sure it works without Kerberos.
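
For example, if the AM user is allowed to proxy, the command it launches
could look something like this (the user name, class, and jar path below
are placeholders, not anything from your setup):

```shell
# Submit on behalf of the requested user ("alice" is illustrative).
spark-submit \
  --master yarn \
  --proxy-user alice \
  --class com.example.MyApp \
  /path/to/app.jar
```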

See: 
https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/Superusers.html
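
The doc above boils down to core-site.xml properties of the form
hadoop.proxyuser.$superuser.hosts/groups. A minimal sketch, assuming the
AM runs as a user called "appmaster" (placeholder) -- in production you'd
restrict the host and group values rather than use wildcards:

```xml
<!-- Allow the "appmaster" user to impersonate other users.
     "appmaster" is a placeholder; substitute the actual AM user. -->
<property>
  <name>hadoop.proxyuser.appmaster.hosts</name>
  <value>am-host.example.com</value>
</property>
<property>
  <name>hadoop.proxyuser.appmaster.groups</name>
  <value>*</value>
</property>
```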

On Mon, Nov 2, 2015 at 8:02 AM, Peter Rudenko <petro.rude...@gmail.com> wrote:
> Hi, I have an ApplicationMaster which accepts requests and launches a
> container on which it runs spark-submit --master yarn. In the request I have
> a field "username" - the user I want to launch the job as. How can I set the
> user that the command on the container will run as? Currently they are all
> running as the yarn user even though the AM is running as the root user.
>
> Here's my code:
>
>
> private def setupTokens(username: String): ByteBuffer = {
>   // Create a proxy UGI for the requested user and serialize its tokens.
>   val credentials = UserGroupInformation
>     .createProxyUser(username, UserGroupInformation.getCurrentUser)
>     .getCredentials
>   val dob = new DataOutputBuffer()
>   credentials.writeTokenStorageToStream(dob)
>   ByteBuffer.wrap(dob.getData, 0, dob.getLength).duplicate()
> }
>
> val cCLC = Records.newRecord(classOf[ContainerLaunchContext])
>
> cCLC.setCommands(List("spark-submit --master yarn ..."))
>
> cCLC.setTokens(setupTokens(user))
>
> Thanks, Peter Rudenko



-- 
Marcelo
