HyukjinKwon commented on code in PR #44305:
URL: https://github.com/apache/spark/pull/44305#discussion_r1427567309


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala:
##########
@@ -105,13 +106,14 @@ case class DataSource(
     // [[FileDataSourceV2]] will still be used if we call the load()/save() method in
     // [[DataFrameReader]]/[[DataFrameWriter]], since they use method `lookupDataSource`
     // instead of `providingClass`.
-    cls.getDeclaredConstructor().newInstance() match {
+    DataSource.newDataSourceInstance(className, cls) match {
       case f: FileDataSourceV2 => f.fallbackFileFormat

Review Comment:
   and `tables.scala` should switch to `DataSource.newDataSourceInstance` as well:
   
   ```scala
       if (DDLUtils.isDatasourceTable(catalogTable)) {
         DataSource.newDataSourceInstance(
             catalogTable.provider.get,
              DataSource.lookupDataSource(catalogTable.provider.get, conf)) match {
            // For datasource table, this command can only support the following File format.
            // TextFileFormat only default to one column "value"
            // Hive type is already considered as hive serde table, so the logic will not
            // come in here.
           case _: CSVFileFormat | _: JsonFileFormat | _: ParquetFileFormat =>
           case _: JsonDataSourceV2 | _: CSVDataSourceV2 |
                _: OrcDataSourceV2 | _: ParquetDataSourceV2 =>
           case s if s.getClass.getCanonicalName.endsWith("OrcFileFormat") =>
           case s =>
              throw QueryCompilationErrors.alterAddColNotSupportDatasourceTableError(s, table)
         }
       }
       catalogTable
   ```
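   
   For context, a minimal sketch of the shape such a helper could take, assuming it simply centralizes the `cls.getDeclaredConstructor().newInstance()` call that each call site used to inline and passes the provider name along for sources that need it. The `NamedDataSource` trait below is purely illustrative and not part of this PR; the actual helper may differ:
   
   ```scala
   // Sketch only, not the code from this PR: a helper that centralizes the
   // reflective instantiation previously inlined at each call site.
   object DataSourceInstantiation {
   
     // Hypothetical trait standing in for providers that want to be told which
     // short name they were looked up under.
     trait NamedDataSource {
       def setShortName(name: String): Unit
     }
   
     def newDataSourceInstance(provider: String, providingClass: Class[_]): Any = {
       // Same reflective, no-arg construction as before the refactor.
       val instance = providingClass.getDeclaredConstructor().newInstance()
       instance match {
         case named: NamedDataSource => named.setShortName(provider)
         case _ => // most providers only need the no-arg construction
       }
       instance
     }
   }
   ```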



