[ https://issues.apache.org/jira/browse/SPARK-19970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-19970:
----------------------------------
    Description: 
In a kerberized Hadoop cluster, when Spark creates tables, the table owner is filled with the PRINCIPAL string instead of the USER name. This is inconsistent with Hive and causes problems when using ROLE in Hive. We should fix this.

*BEFORE*
{code}
scala> sql("create table t(a int)").show
scala> sql("desc formatted t").show(false)
...
|Owner: |sp...@example.com | |
{code}

*AFTER*
{code}
scala> sql("create table t(a int)").show
scala> sql("desc formatted t").show(false)
...
|Owner: |spark | |
{code}

  was:
In a kerberized Hadoop cluster, when Spark creates tables, the table owner is filled with the PRINCIPAL string instead of the USER name. This is inconsistent with Hive and causes problems when using ROLE in Hive. We should fix this.


> Table owner should be USER instead of PRINCIPAL in kerberized clusters
> ----------------------------------------------------------------------
>
>                 Key: SPARK-19970
>                 URL: https://issues.apache.org/jira/browse/SPARK-19970
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Dongjoon Hyun
>
> In a kerberized Hadoop cluster, when Spark creates tables, the table owner
> is filled with the PRINCIPAL string instead of the USER name. This is
> inconsistent with Hive and causes problems when using ROLE in Hive. We
> should fix this.
> *BEFORE*
> {code}
> scala> sql("create table t(a int)").show
> scala> sql("desc formatted t").show(false)
> ...
> |Owner: |sp...@example.com | |
> {code}
> *AFTER*
> {code}
> scala> sql("create table t(a int)").show
> scala> sql("desc formatted t").show(false)
> ...
> |Owner: |spark | |
> {code}


--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
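The fix amounts to mapping the full Kerberos principal (e.g. {{user/host@REALM}}) to its short user name, which is what Hadoop's {{UserGroupInformation.getShortUserName}} returns and what Hive records as the owner. A minimal sketch of that mapping, using a hypothetical helper name rather than the actual Spark patch:

{code}
// Hypothetical helper illustrating the principal-to-short-name mapping
// that the table owner should use. Hadoop's default auth_to_local rule
// reduces "user/host@REALM" to "user"; this sketch mimics only that
// default, not custom auth_to_local rules.
object OwnerName {
  // "spark/host1.example.com@EXAMPLE.COM" -> "spark"
  def shortUserName(principal: String): String =
    principal.split('@').head.split('/').head
}
{code}

With this mapping, DESC FORMATTED would show {{spark}} as the owner instead of the principal string, matching what Hive's ROLE-based authorization expects.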