anishek commented on a change in pull request #951: HIVE-22997 : Copy external table to target during Repl Dump operation
URL: https://github.com/apache/hive/pull/951#discussion_r395586086
 
 

 ##########
 File path: ql/src/java/org/apache/hadoop/hive/ql/exec/repl/ReplDumpTask.java
 ##########
 @@ -662,15 +696,37 @@ void dumpTable(String dbName, String tblName, String validTxnList, Path dbRoot,
     replLogger.tableLog(tblName, tableSpec.tableHandle.getTableType());
     if (tableSpec.tableHandle.getTableType().equals(TableType.EXTERNAL_TABLE)
             || Utils.shouldDumpMetaDataOnly(conf)) {
-      return;
+      return Collections.EMPTY_LIST;
     }
-    for (ReplPathMapping replPathMapping: replPathMappings) {
-      Task<?> copyTask = ReplCopyTask.getLoadCopyTask(
-              tuple.replicationSpec, replPathMapping.getSrcPath(), replPathMapping.getTargetPath(), conf, false);
-      this.addDependentTask(copyTask);
-      LOG.info("Scheduled a repl copy task from [{}] to [{}]",
-              replPathMapping.getSrcPath(), replPathMapping.getTargetPath());
+    return replPathMappings;
+  }
+
+  private void intitiateDataCopyTasks() throws SemanticException {
+    Iterator<ExternalTableCopyTaskBuilder.DirCopyWork> extCopyWorkItr = work.getDirCopyIterator();
+    List<Task<?>> childTasks = new ArrayList<>();
+    int maxTasks = conf.getIntVar(HiveConf.ConfVars.REPL_APPROX_MAX_LOAD_TASKS);
+    TaskTracker taskTracker = new TaskTracker(maxTasks);
+    while (taskTracker.canAddMoreTasks() && hasMoreCopyWork()) {
+      if (work.replPathIteratorInitialized() && extCopyWorkItr.hasNext()) {
+        childTasks.addAll(new ExternalTableCopyTaskBuilder(work, conf).tasks(taskTracker));
+      } else {
+        childTasks.addAll(ReplPathMapping.tasks(work, taskTracker, conf));
+      }
+    }
+    if (!childTasks.isEmpty()) {
+      DAGTraversal.traverse(childTasks, new AddDependencyToLeaves(TaskFactory.get(work, conf)));
+    } else {
+      prepareReturnValues(work.getResultValues());
+      childTasks.add(TaskFactory.get(new ReplOperationCompleteAckWork(work.getDumpAckFile()), conf));
     }
 
 Review comment:
   Why is the ack work added only when the child task list is empty? Since the DAG can still grow more nodes, the ack work may instead need to be attached at the end of the whole DAG. Maybe move it out of the else clause and gate it on taskTracker.canAddMoreTasks() or something similar?
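   To illustrate the suggestion: below is a minimal, self-contained sketch (using a hypothetical `Task` class, not Hive's `Task`/`DAGTraversal`/`AddDependencyToLeaves` implementations) of attaching the ack task to every leaf of the task DAG, so it runs after all copy work finishes rather than only when `childTasks` happens to be empty.

   ```java
   import java.util.ArrayDeque;
   import java.util.ArrayList;
   import java.util.Arrays;
   import java.util.Deque;
   import java.util.HashSet;
   import java.util.List;
   import java.util.Set;

   // Hypothetical stand-in for the real task DAG classes.
   public class AckAtLeavesSketch {
       static class Task {
           final String name;
           final List<Task> children = new ArrayList<>();
           Task(String name) { this.name = name; }
       }

       // Walk the DAG from the roots; every node with no children (a leaf)
       // gets the ack task appended, so ack depends on all leaves.
       static void addDependencyToLeaves(List<Task> roots, Task ack) {
           Deque<Task> stack = new ArrayDeque<>(roots);
           Set<Task> seen = new HashSet<>();
           while (!stack.isEmpty()) {
               Task t = stack.pop();
               if (!seen.add(t)) continue;          // skip already-visited nodes
               if (t.children.isEmpty()) {
                   t.children.add(ack);             // leaf: run ack after it
               } else {
                   stack.addAll(t.children);
               }
           }
       }

       public static void main(String[] args) {
           Task copy1 = new Task("copy1");
           Task copy2 = new Task("copy2");
           Task merge = new Task("merge");
           copy1.children.add(merge);               // copy1 -> merge
           Task ack = new Task("ack");
           addDependencyToLeaves(Arrays.asList(copy1, copy2), ack);
           // Both leaves (merge and copy2) now lead into the single ack task.
           System.out.println(merge.children.get(0).name);
           System.out.println(copy2.children.get(0).name);
       }
   }
   ```

   With this shape, the ack is emitted once, after the whole DAG, regardless of whether the copy-task list was empty on this pass.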

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.