weiwenda created SPARK-30617:
--------------------------------

             Summary: Could Spark stop restricting 
spark.sql.catalogImplementation to an enumerated set of values?
                 Key: SPARK-30617
                 URL: https://issues.apache.org/jira/browse/SPARK-30617
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 2.4.4
            Reporter: weiwenda
             Fix For: 3.1.0, 2.4.6


# We have implemented a custom ExternalCatalog that retrieves metadata from 
multiple heterogeneous databases (such as Elasticsearch and PostgreSQL), so 
that we can run mixed queries across Hive and our online data.
 # But because Spark requires the value of spark.sql.catalogImplementation to 
be either in-memory or hive, we have to modify SparkSession and rebuild Spark 
to make our project work.
 # We therefore hope Spark will remove this restriction; that would make it 
much easier for us to keep pace with new Spark versions. Thanks!
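For context, the restriction appears to live in two places in the Spark source. The sketch below paraphrases the relevant code from the 2.4 branch (declaration names such as CATALOG_IMPLEMENTATION and sessionStateClassName are quoted from memory of StaticSQLConf and SparkSession, not verbatim):

{code:scala}
// 1) The config entry is declared with a fixed set of allowed values,
//    so any other string is rejected at config-checking time:
val CATALOG_IMPLEMENTATION = buildStaticConf("spark.sql.catalogImplementation")
  .internal()
  .stringConf
  .checkValues(Set("hive", "in-memory"))
  .createWithDefault("in-memory")

// 2) SparkSession then maps the value to a hard-coded session-state
//    builder class, with no extension point for a third implementation:
private def sessionStateClassName(conf: SparkConf): String = {
  conf.get(CATALOG_IMPLEMENTATION) match {
    case "hive" => HIVE_SESSION_STATE_BUILDER_CLASS_NAME
    case "in-memory" => classOf[SessionStateBuilder].getCanonicalName
  }
}
{code}

Lifting the restriction would presumably mean relaxing the checkValues set and letting the config name a session-state builder (or ExternalCatalog) class directly, rather than pattern matching on two literals.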



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
