Re: run spark job in yarn cluster mode as specified user

2018-01-22 Thread sd wang
Thanks! I finally made this work. In addition to the LinuxContainerExecutor parameter and the cache directory permissions, the following parameter also needs to be set to the specified user: yarn.nodemanager.linux-container-executor.nonsecure-mode.local-user. Thanks. 2018-01-22 22:44 GMT+08:00 Margusja
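For reference, the parameter mentioned above is set in yarn-site.xml. A minimal sketch (the user name "sparkuser" is a placeholder, not taken from the thread):

```xml
<!-- yarn-site.xml: without Kerberos, the LinuxContainerExecutor runs
     containers as this fixed local user instead of the default "nobody" -->
<property>
  <name>yarn.nodemanager.linux-container-executor.nonsecure-mode.local-user</name>
  <value>sparkuser</value>
</property>
```

The user named here must exist as a local account on every NodeManager host.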

Re: run spark job in yarn cluster mode as specified user

2018-01-22 Thread Margusja
Hi org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor requires the user to exist on each node and the right permissions to be set on the necessary directories. Br Margus > On 22 Jan 2018, at 13:41, sd wang wrote: > >
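A sketch of what "right permissions" typically means for the LinuxContainerExecutor, based on the standard Hadoop setup (paths and the "hadoop" group vary by distribution and are assumptions here, not details from the thread):

```
# the container-executor binary must be owned by root, setuid,
# and group-restricted to the NodeManager's group
chown root:hadoop $HADOOP_HOME/bin/container-executor
chmod 6050 $HADOOP_HOME/bin/container-executor

# container-executor.cfg (also root-owned) constrains which users may run:
#   yarn.nodemanager.linux-container-executor.group=hadoop
#   banned.users=hdfs,yarn,mapred,bin
#   min.user.id=1000
```

If these permissions are wrong, the NodeManager refuses to launch containers at startup or at job submission time.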

Re: run spark job in yarn cluster mode as specified user

2018-01-22 Thread Jörn Franke
Configure Kerberos > On 22. Jan 2018, at 08:28, sd wang wrote: > > Hi Advisers, > When submitting a Spark job in YARN cluster mode, the job is executed by the > "yarn" user. Can any parameter change the user? I tried setting > HADOOP_USER_NAME but it did not work. I'm
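On a kerberized cluster, the submitting principal determines the user the containers run as; a typical submission might look like the sketch below (the principal, keytab path, class, and jar names are placeholders):

```
kinit -kt /etc/security/keytabs/sparkuser.keytab sparkuser@EXAMPLE.COM

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal sparkuser@EXAMPLE.COM \
  --keytab /etc/security/keytabs/sparkuser.keytab \
  --class com.example.MyApp myapp.jar
```

--principal and --keytab let long-running jobs renew their Kerberos tickets automatically.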

Re: run spark job in yarn cluster mode as specified user

2018-01-22 Thread sd wang
Hi Margus, Appreciate your help! It seems this parameter is related to the CGroups functionality. I am using CDH without Kerberos. I set the parameter: yarn.nodemanager.container-executor.class=org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor Then I ran the Spark job again and hit the problem as

Re: run spark job in yarn cluster mode as specified user

2018-01-21 Thread Margusja
Hi One way to get it is to use the YARN configuration parameter yarn.nodemanager.container-executor.class. By default it is org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor; org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor gives you the user who executes the script. Br
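The switch Margus describes is a single property in yarn-site.xml; a minimal sketch:

```xml
<!-- yarn-site.xml: replace the default DefaultContainerExecutor so that
     containers run under a specific OS user rather than the "yarn" user -->
<property>
  <name>yarn.nodemanager.container-executor.class</name>
  <value>org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor</value>
</property>
```

Note this change alone is not sufficient: the LinuxContainerExecutor also requires a correctly installed setuid container-executor binary on each node, as discussed later in the thread.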

run spark job in yarn cluster mode as specified user

2018-01-21 Thread sd wang
Hi Advisers, When submitting a Spark job in YARN cluster mode, the job is executed by the "yarn" user. Can any parameter change the user? I tried setting HADOOP_USER_NAME but it did not work. I'm using Spark 2.2. Thanks for any help!