nsivabalan commented on a change in pull request #4253:
URL: https://github.com/apache/hudi/pull/4253#discussion_r768950864



##########
File path: hudi-spark-datasource/hudi-spark2/src/main/java/org/apache/hudi/internal/DefaultSource.java
##########
@@ -62,10 +68,19 @@ public DataSourceReader createReader(DataSourceOptions options) {
     String instantTime = options.get(DataSourceInternalWriterHelper.INSTANT_TIME_OPT_KEY).get();
     String path = options.get("path").get();
     String tblName = options.get(HoodieWriteConfig.TBL_NAME.key()).get();
+    Map<String, String> parameters = options.asMap();
     boolean populateMetaFields = options.getBoolean(HoodieTableConfig.POPULATE_META_FIELDS.key(),
         Boolean.parseBoolean(HoodieTableConfig.POPULATE_META_FIELDS.defaultValue()));
+    // Now by default ParquetWriteSupport will write DecimalType to parquet as int32/int64 when the scale of decimalType < Decimal.MAX_LONG_DIGITS(),
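For readers outside the thread: with spark.sql.parquet.writeLegacyFormat left at its default (false), Spark's ParquetWriteSupport encodes a DecimalType whose precision fits within Decimal.MAX_LONG_DIGITS (18 digits) as a Parquet int32/int64 rather than a fixed-length byte array. The snippet below is a minimal, hypothetical sketch of that kind of schema check; the class name, the way the schema is obtained, and where the flag ends up are assumptions for illustration, not the PR's actual code.

    // Hypothetical sketch (not the PR's code): detect decimal columns that Spark
    // would pack into int32/int64 and request the legacy Parquet decimal layout.
    import org.apache.spark.sql.types.DataTypes;
    import org.apache.spark.sql.types.Decimal;
    import org.apache.spark.sql.types.DecimalType;
    import org.apache.spark.sql.types.StructField;
    import org.apache.spark.sql.types.StructType;

    import java.util.HashMap;
    import java.util.Map;

    public class LegacyDecimalFormatSketch {

      // True if any top-level field is a decimal small enough to be encoded as int32/int64.
      static boolean hasSmallPrecisionDecimal(StructType schema) {
        for (StructField field : schema.fields()) {
          if (field.dataType() instanceof DecimalType) {
            DecimalType dt = (DecimalType) field.dataType();
            if (dt.precision() <= Decimal.MAX_LONG_DIGITS()) {
              return true;
            }
          }
        }
        return false;
      }

      public static void main(String[] args) {
        StructType schema = new StructType()
            .add("id", DataTypes.LongType)
            .add("price", DataTypes.createDecimalType(10, 2)); // precision 10 fits in int64

        Map<String, String> parameters = new HashMap<>();
        if (hasSmallPrecisionDecimal(schema)) {
          // Fall back to the fixed_len_byte_array layout so readers expecting the
          // legacy decimal encoding keep working.
          parameters.put("spark.sql.parquet.writeLegacyFormat", "true");
        }
        System.out.println(parameters);
      }
    }

A writer that needs the legacy layout would merge a flag like this into the options it passes to the Parquet write support; the sketch only shows the schema inspection and the config key.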

Review comment:
       Can we move this to spark-common rather than duplicating across spark2 and spark3?
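
A hypothetical illustration of the refactor being asked for: a small utility in hudi-spark-common that both the spark2 and spark3 DefaultSource classes delegate to, so the parameter/decimal handling lives in one place. The package, class, and method names below are placeholders, not existing Hudi APIs.

    // Hypothetical sketch of a shared helper in hudi-spark-common.
    package org.apache.hudi.spark.common; // placeholder package

    import java.util.HashMap;
    import java.util.Map;

    public final class DataSourceWriteOptionsHelper {

      private DataSourceWriteOptionsHelper() {
      }

      // Copies the incoming writer options and applies any adjustments (for example,
      // the legacy Parquet decimal format flag) that would otherwise be duplicated
      // in the spark2 and spark3 DefaultSource classes.
      public static Map<String, String> prepareWriteParameters(Map<String, String> options) {
        Map<String, String> parameters = new HashMap<>(options);
        // ... shared adjustments go here ...
        return parameters;
      }
    }

Each module's createReader/createWriter path would then call something like DataSourceWriteOptionsHelper.prepareWriteParameters(options.asMap()) instead of repeating the logic.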



