sreeram26 commented on a change in pull request #2014:
URL: https://github.com/apache/hudi/pull/2014#discussion_r475306551



##########
File path: hudi-spark/src/main/java/org/apache/hudi/DataSourceUtils.java
##########
@@ -248,15 +249,18 @@ public static HoodieWriteClient createHoodieClient(JavaSparkContext jssc, String
   }
 
   public static JavaRDD<WriteStatus> doWriteOperation(HoodieWriteClient client, JavaRDD<HoodieRecord> hoodieRecords,
-      String instantTime, String operation) throws HoodieException {
-    if (operation.equals(DataSourceWriteOptions.BULK_INSERT_OPERATION_OPT_VAL())) {
+      String instantTime, WriteOperationType operation) throws HoodieException {
+    if (operation == WriteOperationType.BULK_INSERT) {
       Option<BulkInsertPartitioner> userDefinedBulkInsertPartitioner =
           createUserDefinedBulkInsertPartitioner(client.getConfig());
       return client.bulkInsert(hoodieRecords, instantTime, userDefinedBulkInsertPartitioner);
-    } else if (operation.equals(DataSourceWriteOptions.INSERT_OPERATION_OPT_VAL())) {
+    } else if (operation == WriteOperationType.INSERT) {
       return client.insert(hoodieRecords, instantTime);
     } else {
       // default is upsert
+      if (operation != WriteOperationType.UPSERT) {

Review comment:
       Not throwing an explicit error here: based on the enum, the only other value `operation` could hold at this point is BOOTSTRAP, and the bug that exposed this issue would already have thrown an exception in WriteOperationType.fromValue itself.

       I can change this to throw a HoodieException if the reviewer feels that is necessary.
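       For reference, a minimal sketch of what that explicit guard could look like in the `else` branch above, assuming the branch ends with the usual `client.upsert(hoodieRecords, instantTime)` call (the exception message is illustrative, not committed code):

           } else {
             if (operation != WriteOperationType.UPSERT) {
               // Fail fast on any enum value this method does not handle
               // (e.g. BOOTSTRAP) instead of silently falling through to upsert.
               throw new HoodieException("Unsupported write operation: " + operation);
             }
             // default is upsert
             return client.upsert(hoodieRecords, instantTime);
           }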




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
