rdblue commented on issue #25330: [SPARK-28565][SQL] DataFrameWriter saveAsTable support for V2 catalogs
URL: https://github.com/apache/spark/pull/25330#issuecomment-517806592
 
 
   @cloud-fan, I think we are saying the same thing. The identifier supplied by the user determines the catalog that should be used (treating the session catalog and the v2 session catalog as the same catalog). Then either the v1 or the v2 session catalog is used, depending on what the provider implementation requires.
   
   We already do this for `CREATE TABLE` SQL: 
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceResolution.scala#L51-L72
   
   The first rule only matches if there is no v2 catalog (none set in the identifier and no default) and the provider is a v1 provider. The second rule matches all cases where v2 will be used. `CatalogObjectIdentifier` returns a catalog if there is an explicit catalog or a default. If that is `None`, the session catalog is responsible for the identifier; because the first rule didn't match, we know the provider is v2, so the v2 session catalog is used.
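
   To make the flow concrete, here is a minimal, self-contained Scala sketch of the two rules. The helpers below (`defaultCatalog`, `isV1Provider`, and the simplified `CatalogObjectIdentifier`/`AsTableIdentifier` extractors) are illustrative stand-ins, not the actual implementations in `LookupCatalog`/`DataSourceResolution`:

```scala
object CatalogResolutionSketch {

  // Hypothetical stand-in for a configured default v2 catalog (none here).
  val defaultCatalog: Option[String] = None

  // Hypothetical set of v2 catalogs registered in the session.
  private val knownCatalogs = Set("testcat", "prod")

  // Hypothetical check for a v1 (DataSource V1) provider.
  def isV1Provider(provider: String): Boolean =
    Set("parquet", "orc", "csv", "json").contains(provider)

  // Simplified CatalogObjectIdentifier: yields Some(catalog) when the identifier
  // names a known catalog or a default catalog is configured, otherwise None.
  object CatalogObjectIdentifier {
    def unapply(parts: Seq[String]): Option[(Option[String], Seq[String])] = parts match {
      case Seq(catalog, rest @ _*) if rest.nonEmpty && knownCatalogs.contains(catalog) =>
        Some((Some(catalog), rest))
      case _ =>
        Some((defaultCatalog, parts))
    }
  }

  // Simplified AsTableIdentifier: matches only when no v2 catalog applies.
  object AsTableIdentifier {
    def unapply(parts: Seq[String]): Option[Seq[String]] = parts match {
      case CatalogObjectIdentifier(None, name) => Some(name)
      case _                                   => None
    }
  }

  def resolve(tableName: Seq[String], provider: String): String = tableName match {
    // Rule 1: no v2 catalog (not in the identifier, no default) AND a v1 provider
    // -> the v1 session catalog handles the table.
    case AsTableIdentifier(name) if isV1Provider(provider) =>
      s"v1 session catalog: ${name.mkString(".")}"

    // Rule 2: every case where v2 is used.
    case CatalogObjectIdentifier(Some(catalog), name) =>
      s"v2 catalog '$catalog': ${name.mkString(".")}" // explicit or default catalog
    case CatalogObjectIdentifier(None, name) =>
      // Rule 1 didn't match, so the provider must be v2 -> v2 session catalog.
      s"v2 session catalog: ${name.mkString(".")}"
  }

  def main(args: Array[String]): Unit = {
    println(resolve(Seq("db", "t"), "parquet"))            // v1 session catalog: db.t
    println(resolve(Seq("db", "t"), "iceberg"))            // v2 session catalog: db.t
    println(resolve(Seq("testcat", "db", "t"), "parquet")) // v2 catalog 'testcat': db.t
  }
}
```

   The ordering matters in this sketch: rule 1 is the only path into the v1 session catalog, so anything it rejects (an explicit or default catalog, or a v2 provider) falls through to the v2 cases.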
