GitHub user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22134#discussion_r210967474
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala ---
    @@ -609,7 +609,13 @@ object DataSource extends Logging {
     
       /** Given a provider name, look up the data source class definition. */
       def lookupDataSource(provider: String, conf: SQLConf): Class[_] = {
    -    val provider1 = backwardCompatibilityMap.getOrElse(provider, provider) match {
    +    val customBackwardCompatibilityMap =
    +      conf.getAllConfs
    +        .filter(_._1.startsWith("spark.sql.datasource.map"))
    +        .map{ case (k, v) => (k.replaceFirst("^spark.sql.datasource.map.", ""), v) }
    +    val compatibilityMap = backwardCompatibilityMap ++ customBackwardCompatibilityMap
    --- End diff ---
    
    If this is merged, we can remove the internal Avro mapping code and document that mapping instead before the branch cut.
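
    For reference, here is a minimal, self-contained sketch (plain Scala, no Spark required; the sample entries are made up) of how the prefixed conf keys in the diff above turn into a provider-name remapping:
    ```scala
    // Stand-in for conf.getAllConfs; only keys under "spark.sql.datasource.map." matter here.
    val allConfs = Map(
      "spark.sql.datasource.map.com.databricks.spark.avro" -> "com.databricks.spark.avro",
      "spark.sql.shuffle.partitions" -> "200")
    // Same filter/map as in the diff: keep the prefixed keys and strip the prefix.
    val customBackwardCompatibilityMap = allConfs
      .filter(_._1.startsWith("spark.sql.datasource.map"))
      .map { case (k, v) => (k.replaceFirst("^spark.sql.datasource.map.", ""), v) }
    // customBackwardCompatibilityMap: Map(com.databricks.spark.avro -> com.databricks.spark.avro)
    ```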
    
    > so at this point we are leaving com.databricks.spark.avro -> internal avro as the default and users have to set it back to com.databricks.spark.avro or do they set it empty?
    
    This would be better than an alternative such as the following `avro`-specific option (sketched here with a boolean conf to make it concrete):
    ```scala
    val ENABLE_AVRO_BACKWARD_COMPATIBILITY =
      buildConf("spark.sql.avro.backwardCompatibility")
        .booleanConf
        .createWithDefault(true)
    ```
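
    With the generic mapping above, a user who wants the old external behavior back could (hypothetically; the target value is an assumption, not something this PR prescribes) set:
    ```scala
    // Remap the provider name back to the third-party package
    // instead of the built-in Avro implementation.
    spark.conf.set(
      "spark.sql.datasource.map.com.databricks.spark.avro",
      "com.databricks.spark.avro")
    ```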

