[ https://issues.apache.org/jira/browse/SPARK-30617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17022676#comment-17022676 ]

Dongjoon Hyun commented on SPARK-30617:
---------------------------------------

Thanks, [~994184...@qq.com]. I added links to the existing JIRAs. I'd like to 
recommend the following, per the community guide.
- https://spark.apache.org/contributing.html

1. Please don't set `Fix Versions`. That field is set by the committer when 
the PR is finally merged.
2. For `Affects Versions`, please set the master branch version number for a 
new-feature JIRA. (For now, that's 3.0.0.) Since Apache Spark backports bug 
fixes only, a new feature cannot affect already-released versions.
3. If possible, please search existing issues before creating a JIRA. People 
often think along similar lines.

> Is it possible for Spark to stop restricting the allowed values of 
> spark.sql.catalogImplementation
> ------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-30617
>                 URL: https://issues.apache.org/jira/browse/SPARK-30617
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: weiwenda
>            Priority: Minor
>
> # We have implemented a custom ExternalCatalog that retrieves metadata from 
> multiple heterogeneous databases (such as Elasticsearch and PostgreSQL), so 
> that we can run queries that mix Hive and our online data.
>  # But because Spark requires the value of spark.sql.catalogImplementation 
> to be either in-memory or hive, we had to modify SparkSession and rebuild 
> Spark to make our project work.
>  # Finally, we hope Spark will remove the above restriction, so that it 
> will be much easier for us to keep pace with new Spark versions (one 
> possible plugin-based alternative is sketched after this quote). Thanks!
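
For reference, Spark 3.0's DataSourceV2 API already exposes a catalog plugin 
point that avoids rebuilding Spark: a class implementing 
org.apache.spark.sql.connector.catalog.TableCatalog can be registered under a 
spark.sql.catalog.<name> key, independently of 
spark.sql.catalogImplementation. Below is a minimal, untested sketch; the 
class name com.example.ElasticsearchCatalog and the "nodes" option key are 
hypothetical placeholders, while the interfaces themselves are real Spark 3.0 
APIs.

{code:scala}
// Minimal, untested sketch of a Spark 3.0 DataSourceV2 catalog plugin.
// Register it per catalog name instead of changing catalogImplementation:
//   spark.sql.catalog.es=com.example.ElasticsearchCatalog
package com.example

import java.util

import org.apache.spark.sql.connector.catalog.{Identifier, Table, TableCatalog, TableChange}
import org.apache.spark.sql.connector.expressions.Transform
import org.apache.spark.sql.types.StructType
import org.apache.spark.sql.util.CaseInsensitiveStringMap

class ElasticsearchCatalog extends TableCatalog {
  private var catalogName: String = _

  // Called once with the options set under spark.sql.catalog.<name>.*
  override def initialize(name: String, options: CaseInsensitiveStringMap): Unit = {
    catalogName = name
    // e.g. options.get("nodes") -- hypothetical connection option;
    // connect to the external metadata service here.
  }

  override def name(): String = catalogName

  // Each method would delegate to the external system's metadata API.
  override def listTables(namespace: Array[String]): Array[Identifier] =
    Array.empty

  override def loadTable(ident: Identifier): Table =
    throw new UnsupportedOperationException("sketch only")

  override def createTable(
      ident: Identifier,
      schema: StructType,
      partitions: Array[Transform],
      properties: util.Map[String, String]): Table =
    throw new UnsupportedOperationException("sketch only")

  override def alterTable(ident: Identifier, changes: TableChange*): Table =
    throw new UnsupportedOperationException("sketch only")

  override def dropTable(ident: Identifier): Boolean = false

  override def renameTable(oldIdent: Identifier, newIdent: Identifier): Unit =
    throw new UnsupportedOperationException("sketch only")
}
{code}

With such a class registered, tables become addressable through the catalog 
name via multi-part identifiers, e.g. SELECT * FROM es.default.logs, and can 
be joined against Hive tables in the same query.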


