Hi Kwangsun,

You may use `--proxy-user` to impersonate another user.

For example,

bin/spark-shell --proxy-user kent
21/04/12 23:31:34 WARN Utils: Your hostname, hulk.local resolves to a loopback address: 127.0.0.1; using 192.168.1.14 instead (on interface en0)
21/04/12 23:31:34 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/04/12 23:31:34 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.1.14:4040
Spark context available as 'sc' (master = local[*], app id = local-1618241499136).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.2.0-SNAPSHOT
      /_/

Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_251)
Type in expressions to have them evaluated.
Type :help for more information.

scala> spark.sparkContext.sparkUser
res0: String = kent

scala> org.apache.hadoop.security.UserGroupInformation.getCurrentUser.getShortUserName
res1: String = kent

scala> org.apache.hadoop.security.UserGroupInformation.getLoginUser.getShortUserName
res2: String = kentyao
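Note that on a secured (e.g. Kerberized) cluster, Hadoop itself must also allow the real login user to act as a proxy. A minimal sketch of the relevant core-site.xml settings, assuming the login user is kentyao as in the example above (your user name and allowed hosts/groups will differ):

<!-- core-site.xml on the cluster side -->
<!-- Allow user kentyao to impersonate users from any host... -->
<property>
  <name>hadoop.proxyuser.kentyao.hosts</name>
  <value>*</value>
</property>
<!-- ...and to impersonate members of any group. -->
<property>
  <name>hadoop.proxyuser.kentyao.groups</name>
  <value>*</value>
</property>

In production you would normally restrict these to specific hosts and groups rather than "*".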

Bests,

Kent Yao


