cloud-fan commented on code in PR #36418:
URL: https://github.com/apache/spark/pull/36418#discussion_r869780807


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/connector/catalog/Catalogs.scala:
##########
@@ -39,13 +39,17 @@ private[sql] object Catalogs {
    * @param conf a SQLConf
    * @return an initialized CatalogPlugin
    * @throws CatalogNotFoundException if the plugin class cannot be found
-   * @throws org.apache.spark.SparkException           if the plugin class cannot be instantiated
+   * @throws org.apache.spark.SparkException if the plugin class cannot be instantiated
    */
   @throws[CatalogNotFoundException]
   @throws[SparkException]
   def load(name: String, conf: SQLConf): CatalogPlugin = {
     val pluginClassName = try {
-      conf.getConfString("spark.sql.catalog." + name)

Review Comment:
   On second thought, this name limitation applies only to the current config-based way of registering catalogs. I now think your first version is better: we only need to forbid dots in the catalog name. In the future we may add new ways to register catalogs, and those may come with different limitations.
   
   Can you change back to the first version? Sorry for the back and forth!
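   
   For reference, a minimal sketch of the dot-only restriction being suggested (the helper name and error message here are illustrative, not the actual patch):
   
   ```scala
   object CatalogNameValidation {
     // Illustrative sketch only: config-based registration looks up the plugin class
     // via the key "spark.sql.catalog.<name>", so a dot inside the catalog name itself
     // would make that key ambiguous. Rejecting dots is the only restriction needed
     // for the current registration mechanism.
     def validateCatalogName(name: String): Unit = {
       if (name.contains(".")) {
         throw new IllegalArgumentException(
           s"Invalid catalog name '$name': catalog names must not contain '.'")
       }
     }
   }
   ```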



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

