[ https://issues.apache.org/jira/browse/SPARK-26794?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-26794.
---------------------------------
    Resolution: Fixed
 Fix Version/s: 3.0.0

Issue resolved by pull request 23709
[https://github.com/apache/spark/pull/23709]

> SparkSession enableHiveSupport does not point to hive but in-memory while the
> SparkContext exists
> -------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26794
>                 URL: https://issues.apache.org/jira/browse/SPARK-26794
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.2
>            Reporter: Kent Yao
>            Assignee: Kent Yao
>            Priority: Major
>             Fix For: 3.0.0
>
> {code:java}
> public class SqlDemo {
>     public static void main(final String[] args) throws Exception {
>         SparkConf conf = new SparkConf().setAppName("spark-sql-demo");
>         JavaSparkContext sc = new JavaSparkContext(conf);
>         SparkSession ss = SparkSession.builder().enableHiveSupport().getOrCreate();
>         ss.sql("show databases").show();
>     }
> }
> {code}
> Before SPARK-20946, the demo above pointed to the correct Hive metastore when
> hive-site.xml was present. Now it can only point to the default in-memory
> catalog.
> The catalog is now a variable shared across SparkSessions, and it is
> instantiated with the SparkContext's conf. After SPARK-20946, session-level
> configs are no longer passed to the SparkContext's conf, so the
> enableHiveSupport API has no effect on the catalog instance.
> You can set spark.sql.catalogImplementation=hive application-wide to solve
> the problem, or avoid creating a SparkContext before calling
> SparkSession.builder().enableHiveSupport().getOrCreate().

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
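A minimal sketch of the application-wide workaround the report suggests, assuming Spark 2.3.x APIs and a Hive metastore reachable via hive-site.xml (the class name is illustrative, not from the ticket):

{code:java}
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class HiveSqlDemo {
    public static void main(final String[] args) throws Exception {
        // Workaround: set the catalog implementation on the SparkContext's
        // conf directly, since session-level configs (including the one set
        // by enableHiveSupport) no longer reach the shared catalog.
        SparkConf conf = new SparkConf()
            .setAppName("spark-sql-demo")
            .set("spark.sql.catalogImplementation", "hive");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // The session now picks up the Hive catalog from the context's conf.
        SparkSession ss = SparkSession.builder().enableHiveSupport().getOrCreate();
        ss.sql("show databases").show();
    }
}
{code}

Equivalently, the same key can be passed at submit time with --conf spark.sql.catalogImplementation=hive, or the SparkContext creation can simply be dropped so that getOrCreate() builds the context itself with the session's options applied.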