[ https://issues.apache.org/jira/browse/SPARK-17767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15543319#comment-15543319 ]
Dongjoon Hyun commented on SPARK-17767:
---------------------------------------

It seems that supporting a custom external catalog becomes simpler in Spark 2.1.0 after SPARK-17190. For an external catalog based on Hive classes, we can override it easily by parameterizing some private string variable into SQLConf. I'll make a PR as a first attempt.

> Spark SQL ExternalCatalog API custom implementation support
> -----------------------------------------------------------
>
>                 Key: SPARK-17767
>                 URL: https://issues.apache.org/jira/browse/SPARK-17767
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Alex Liu
>
> There is no way, or at least no easy way, to configure Spark to use a customized
> ExternalCatalog. The internal source code is hardcoded to use either the Hive or
> the in-memory metastore, and the Spark SQL thriftserver is hardcoded to use
> HiveExternalCatalog. We should be able to create a custom external catalog,
> and the thriftserver should be able to use it. Potentially, the Spark SQL
> thriftserver shouldn't depend on the Hive thriftserver.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
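The parameterization the comment describes — replacing a hardcoded catalog choice with a configurable class name — can be sketched as follows. This is a hypothetical, simplified model, not Spark's actual `ExternalCatalog` API or SQLConf machinery: the trait `SimpleExternalCatalog`, the classes, and the loader are all illustrative names, and the conf value is assumed to carry a fully qualified class name that is instantiated by reflection.

```scala
// Hypothetical sketch: choose the external catalog implementation from a
// configuration string instead of a hardcoded class. Names below are
// illustrative and do not match Spark's internal API.
trait SimpleExternalCatalog {
  def name: String
}

// Stand-ins for the built-in and a user-supplied catalog.
class InMemoryCatalog extends SimpleExternalCatalog {
  val name = "in-memory"
}

class MyCustomCatalog extends SimpleExternalCatalog {
  val name = "custom"
}

object CatalogLoader {
  // Instantiate the class named by the (hypothetical) conf value via
  // reflection, requiring a public no-arg constructor.
  def load(className: String): SimpleExternalCatalog =
    Class.forName(className)
      .getDeclaredConstructor()
      .newInstance()
      .asInstanceOf[SimpleExternalCatalog]
}
```

With this shape, pointing the conf at a different class name (e.g. `CatalogLoader.load("MyCustomCatalog")`) swaps the catalog without touching the hardcoded dispatch, which is the kind of flexibility the issue asks for.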