pvary commented on code in PR #3362:
URL: https://github.com/apache/hive/pull/3362#discussion_r907320335


##########
iceberg/iceberg-handler/src/main/java/org/apache/iceberg/mr/hive/HiveIcebergStorageHandler.java:
##########
@@ -858,19 +861,25 @@ private static boolean hasParquetListColumnSupport(Properties tableProps, Schema
    * @param overwrite If we have to overwrite the existing table or just add the new data
    * @return The generated JobContext
    */
-  private Optional<JobContext> generateJobContext(Configuration configuration, String tableName, boolean overwrite) {
+  private Optional<List<JobContext>> generateJobContext(Configuration configuration, String tableName,
+      boolean overwrite) {
     JobConf jobConf = new JobConf(configuration);
-    Optional<SessionStateUtil.CommitInfo> commitInfo = SessionStateUtil.getCommitInfo(jobConf, tableName);
-    if (commitInfo.isPresent()) {
-      JobID jobID = JobID.forName(commitInfo.get().getJobIdStr());
-      commitInfo.get().getProps().forEach(jobConf::set);
-      jobConf.setBoolean(InputFormatConfig.IS_OVERWRITE, overwrite);
-
-      // we should only commit this current table because
-      // for multi-table inserts, this hook method will be called sequentially for each target table
-      jobConf.set(InputFormatConfig.OUTPUT_TABLES, tableName);
-
-      return Optional.of(new JobContextImpl(jobConf, jobID, null));
+    Optional<Map<String, SessionStateUtil.CommitInfo>> commitInfoMap =
+        SessionStateUtil.getCommitInfo(jobConf, tableName);
+    if (commitInfoMap.isPresent()) {
+      List<JobContext> jobContextList = Lists.newLinkedList();
+      for (SessionStateUtil.CommitInfo commitInfo : commitInfoMap.get().values()) {
+        JobID jobID = JobID.forName(commitInfo.getJobIdStr());
+        commitInfo.getProps().forEach(jobConf::set);
+        jobConf.setBoolean(InputFormatConfig.IS_OVERWRITE, overwrite);
+
+        // we should only commit this current table because
+        // for multi-table inserts, this hook method will be called sequentially for each target table
+        jobConf.set(InputFormatConfig.OUTPUT_TABLES, tableName);
+
+        jobContextList.add(new JobContextImpl(jobConf, jobID, null));
+      }
+      return Optional.of(jobContextList);

Review Comment:
   Why not return an empty list instead of an `Optional`?
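   
   For illustration, a minimal sketch of that alternative: drop the `Optional` wrapper and return an empty list when no commit info is present, so callers can iterate over the result without an `isPresent()` check. The class and field names are taken from the diff above; the method rename (`generateJobContexts`) and the exact signature are my assumption, not something proposed in the PR.
   
   ```java
   // Sketch only: assumes the same imports as the surrounding class
   // (java.util.Collections, SessionStateUtil, InputFormatConfig,
   //  JobContextImpl, Guava's Lists, etc.).
   private List<JobContext> generateJobContexts(Configuration configuration, String tableName, boolean overwrite) {
     JobConf jobConf = new JobConf(configuration);
     Optional<Map<String, SessionStateUtil.CommitInfo>> commitInfoMap =
         SessionStateUtil.getCommitInfo(jobConf, tableName);
     if (!commitInfoMap.isPresent()) {
       // Nothing to commit for this table: an empty list replaces Optional.empty()
       return Collections.emptyList();
     }
     List<JobContext> jobContextList = Lists.newLinkedList();
     for (SessionStateUtil.CommitInfo commitInfo : commitInfoMap.get().values()) {
       JobID jobID = JobID.forName(commitInfo.getJobIdStr());
       commitInfo.getProps().forEach(jobConf::set);
       jobConf.setBoolean(InputFormatConfig.IS_OVERWRITE, overwrite);
       // Commit only the current table; for multi-table inserts this hook is
       // called sequentially for each target table.
       jobConf.set(InputFormatConfig.OUTPUT_TABLES, tableName);
       jobContextList.add(new JobContextImpl(jobConf, jobID, null));
     }
     return jobContextList;
   }
   ```
   
   Callers would then just loop over the returned list; an empty list naturally means "nothing to commit", which avoids the `Optional<List<...>>` double wrapping.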



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
