[ https://issues.apache.org/jira/browse/SPARK-11248?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15397882#comment-15397882 ]
Furcy Pin commented on SPARK-11248:
-----------------------------------

+1

I'm trying on Spark 2.0.0, and I've configured hive.exec.stagingdir=/tmp/spark-staging/spark-staging. The Spark Thrift Server then creates temporary files here:
/tmp/spark-staging/spark-staging_hive_2016-07-28_17-30-40_226_8664570533244973761-9/-ext-10000/_temporary/0/task_201607281730_0015_m_000000
with the following rights:
{code}
drwxrwxrwx spark:spark /tmp/spark-staging
drwxrwxrwx user:spark  /tmp/spark-staging/spark-staging_hive_2016-07-28_17-30-40_226_8664570533244973761-9
drwxrwxr-x user:spark  /tmp/spark-staging/spark-staging_hive_2016-07-28_17-30-40_226_8664570533244973761-9/-ext-10000
drwxrwxr-x user:spark  /tmp/spark-staging/spark-staging_hive_2016-07-28_17-30-40_226_8664570533244973761-9/-ext-10000/_temporary/
drwxrwxr-x user:spark  /tmp/spark-staging/spark-staging_hive_2016-07-28_17-30-40_226_8664570533244973761-9/-ext-10000/_temporary/0
drwxrwxr-x spark:spark /tmp/spark-staging/spark-staging_hive_2016-07-28_17-30-40_226_8664570533244973761-9/-ext-10000/_temporary/0/task_201607281730_0015_m_000000
{code}
I then get an error when trying to move the staging files to the Hive table:
{code}
org.apache.hadoop.security.AccessControlException: Permission denied: user=user, access=WRITE, inode="/tmp/spark-staging/spark-staging_hive_2016-07-28_17-30-40_226_8664570533244973761-9/-ext-10000/_temporary/0/task_201607281730_0015_m_000000":spark:spark:drwxrwxr-x
{code}
I tried setting the property "hive.server2.enable.doAs" to both true and false, but it didn't seem to change anything.
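The denial above follows directly from the POSIX-style permission model HDFS applies: the task directory is owned by spark:spark with mode drwxrwxr-x, so a connecting user who is not "spark" and not in the "spark" group only gets the "other" bits (r-x), hence no WRITE. A minimal sketch of that model (plain Python purely for illustration; the real check lives in org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker, and the group memberships below are assumptions):

```python
# Illustrative sketch of a POSIX-style permission check, as HDFS applies it.
# The simplified model and all user/group values are assumptions for
# illustration; this is not Hadoop's actual FSPermissionChecker code.

READ, WRITE, EXECUTE = 4, 2, 1

def mode_bits(mode_str):
    """Parse a mode string like 'drwxrwxr-x' into (owner, group, other) bit masks."""
    bits = []
    for i in range(1, 10, 3):  # skip the leading file-type character
        triple = mode_str[i:i + 3]
        bits.append((READ if triple[0] == 'r' else 0)
                    | (WRITE if triple[1] == 'w' else 0)
                    | (EXECUTE if triple[2] == 'x' else 0))
    return tuple(bits)

def check_access(user, user_groups, owner, group, mode_str, requested):
    """Return True if `user` is granted the `requested` bits on the inode."""
    owner_bits, group_bits, other_bits = mode_bits(mode_str)
    if user == owner:
        applicable = owner_bits
    elif group in user_groups:
        applicable = group_bits
    else:
        applicable = other_bits
    return applicable & requested == requested

# The task directory from the listing above: spark:spark, drwxrwxr-x.
# The connecting user "user" is neither the owner nor (assumed here) in the
# "spark" group, so only the other bits (r-x) apply and WRITE is denied —
# the AccessControlException seen when moving staging files to the table.
denied = not check_access("user", {"user"}, "spark", "spark", "drwxrwxr-x", WRITE)
print(denied)  # True
```

Note the corollary: had the final move been performed as "spark" (or had the task directory inherited the connecting user as owner, like its parents did), the WRITE would succeed.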
> Spark hivethriftserver is using the wrong user while getting HDFS permissions
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-11248
>                 URL: https://issues.apache.org/jira/browse/SPARK-11248
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0, 1.5.1
>            Reporter: Trystan Leftwich
>
> While running Spark as a hivethrift-server via YARN, Spark will use the user running the Hivethrift server rather than the user connecting via JDBC to check HDFS perms.
> i.e. in HDFS the perms are:
> rwx------ 3 testuser testuser /user/testuser/table/testtable
> and I connect via beeline as user testuser:
> beeline -u 'jdbc:hive2://localhost:10511' -n 'testuser' -p ''
> If I try to hit that table:
> select count(*) from test_table;
> I get the following error:
> Error: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table test_table. java.security.AccessControlException: Permission denied: user=hive, access=READ, inode="/user/testuser/table/testtable":testuser:testuser:drwxr-x--x
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:185)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6795)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6777)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6702)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:9529)
>         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:1516)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1433)
>         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
> (state=,code=0)
> I have the following set in hive-site.xml, so it should be using the correct user:
> <property>
>   <name>hive.server2.enable.doAs</name>
>   <value>true</value>
> </property>
> <property>
>   <name>hive.metastore.execute.setugi</name>
>   <value>true</value>
> </property>
>
> This works correctly in Hive.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
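For context on the quoted report: the intended effect of hive.server2.enable.doAs=true is that HDFS sees the JDBC session user ("testuser" from beeline) rather than the server's login user ("hive"). A toy sketch of that dispatch, with all names purely illustrative (the real mechanism is Hadoop's UserGroupInformation proxy-user doAs; none of this is Spark or Hive code):

```python
# Toy model of HiveServer2-style impersonation (hive.server2.enable.doAs).
# Function and variable names are illustrative assumptions; the real
# mechanism is UserGroupInformation.createProxyUser(...).doAs(...).

def effective_user(server_user, session_user, do_as_enabled):
    """Which identity should the HDFS permission check see for a JDBC query?"""
    return session_user if do_as_enabled else server_user

# Intended behaviour: with doAs on, the check runs as the beeline user,
# so testuser can read /user/testuser/table/testtable (mode rwx------).
print(effective_user("hive", "testuser", True))   # testuser

# The reported bug: the Spark thrift server behaves as if doAs were off,
# so the check runs as the server user ("hive") and READ is denied.
print(effective_user("hive", "testuser", False))  # hive
```

This matches the error in the report: the AccessControlException names user=hive even though the JDBC connection authenticated as testuser.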