Thanks!
I finally made this work. Besides the LinuxContainerExecutor parameter and
the cache directory permissions, the following parameter also needs to be
set to the specified user:
yarn.nodemanager.linux-container-executor.nonsecure-mode.local-user
Thanks.
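In yarn-site.xml that last setting would look something like the sketch below; the user name "sparkuser" is a placeholder, not a value from this thread.

```xml
<!-- yarn-site.xml (sketch): pin the user that runs containers when the
     cluster is NOT kerberized. "sparkuser" is a placeholder user name. -->
<property>
  <name>yarn.nodemanager.linux-container-executor.nonsecure-mode.local-user</name>
  <value>sparkuser</value>
</property>
```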
2018-01-22 22:44 GMT+08:00 Margusja :
> Hi
>
> org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor requires the
> user to exist on each node and the right permissions set on the necessary
> directories.
> Br
> Margus
> On 22 Jan 2018, at 13:41, sd wang wrote:
>
> org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor
Configure Kerberos.
> On 22 Jan 2018, at 08:28, sd wang wrote:
>
> Hi Advisers,
> When submitting a spark job in yarn cluster mode, the job is executed by the
> "yarn" user. Are there any parameters that can change the user? I tried
> setting HADOOP_USER_NAME but it did not work. I'm using Spark 2.2.
> Thanks for any help!
Hi Margus,
Appreciate your help!
It seems this parameter is related to the CGroups functionality.
I am using CDH without Kerberos. I set the parameter:
yarn.nodemanager.container-executor.class=org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor
Then I ran the spark job again and hit the problem below:
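For anyone hitting the same wall: with LinuxContainerExecutor on a non-Kerberos cluster, the NodeManager also reads a container-executor.cfg file on each node. A sketch is below; all values are illustrative, and the group name "hadoop" is a placeholder for whatever group the NodeManager runs under.

```
# container-executor.cfg (sketch; values are illustrative placeholders)
yarn.nodemanager.linux-container-executor.group=hadoop
banned.users=hdfs,yarn,mapred,bin
min.user.id=1000
allowed.system.users=
```

The container-executor binary itself must be owned by root with the setuid bit set, which is part of the "right permissions set in necessary directories" requirement mentioned above.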
Hi
One way to get this is to use the YARN configuration parameter
yarn.nodemanager.container-executor.class.
By default it is
org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor;
org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor gives you the
user who executes the script.
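As a yarn-site.xml fragment, the switch described above would be a sketch like this:

```xml
<!-- yarn-site.xml (sketch): replace the default DefaultContainerExecutor -->
<property>
  <name>yarn.nodemanager.container-executor.class</name>
  <value>org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor</value>
</property>
```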
Br
Hi Advisers,
When submitting a spark job in yarn cluster mode, the job is executed by the
"yarn" user. Are there any parameters that can change the user? I tried
setting HADOOP_USER_NAME but it did not work. I'm using Spark 2.2.
Thanks for any help!
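A quick way to see which OS user a container actually runs as is to print it from inside the job (e.g. at the top of the driver, or inside a map task). This is a plain-Python sanity check, not a Spark or YARN API; with DefaultContainerExecutor it would typically report "yarn", and after the changes discussed in this thread it should report the configured user.

```python
import getpass

# Prints the OS-level user the current process runs as. Run this inside
# the YARN container (driver or executor) to verify which user YARN
# launched the container under.
print(getpass.getuser())
```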