Github user gengliangwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21878#discussion_r205348403

    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala ---
    @@ -635,12 +637,6 @@ object DataSource extends Logging {
              "Hive built-in ORC data source must be used with Hive support enabled. " +
              "Please use the native ORC data source by setting 'spark.sql.orc.impl' to " +
              "'native'")
    -      } else if (provider1.toLowerCase(Locale.ROOT) == "avro" ||
    -          provider1 == "com.databricks.spark.avro") {
    -        throw new AnalysisException(
    --- End diff --

No, I mean that by default the avro package is not loaded. For example, if we start spark-shell without loading the jar, it will show the error "Failed to find data source: avro. Please find an Avro package at http://spark.apache.org/third-party-projects.html".
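The lookup behavior described above can be sketched as follows. This is an illustrative, self-contained simplification; `ProviderLookupSketch`, `builtInSources`, and `lookup` are hypothetical names, not Spark's actual `DataSource.lookupDataSource` internals:

```scala
import java.util.Locale

// Sketch: when a provider name is not among the registered/built-in sources,
// the lookup fails with a "Failed to find data source" message; for "avro"
// the message additionally points at the third-party projects page.
object ProviderLookupSketch {
  // Illustrative subset of built-in file sources; not Spark's real registry.
  private val builtInSources = Set("parquet", "json", "csv", "orc", "text")

  def lookup(provider: String): Either[String, String] = {
    val p = provider.toLowerCase(Locale.ROOT)
    if (builtInSources.contains(p)) {
      Right(p)
    } else if (p == "avro" || provider == "com.databricks.spark.avro") {
      Left(s"Failed to find data source: $p. Please find an Avro package at " +
        "http://spark.apache.org/third-party-projects.html")
    } else {
      Left(s"Failed to find data source: $p.")
    }
  }
}
```

Under this sketch, `lookup("parquet")` succeeds, while `lookup("avro")` fails with the hint message, mirroring the spark-shell behavior quoted in the comment when the avro jar is not on the classpath.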