marchpure commented on a change in pull request #3793:
URL: https://github.com/apache/carbondata/pull/3793#discussion_r442954765



##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/mutation/merge/CarbonMergeDataSetCommand.scala
##########
@@ -269,11 +271,10 @@ case class CarbonMergeDataSetCommand(
       new SparkCarbonFileFormat().prepareWrite(sparkSession, job,
         Map(), schema)
     val config = SparkSQLUtil.broadCastHadoopConf(sparkSession.sparkContext, job.getConfiguration)
-    (frame.rdd.coalesce(DistributionUtil.getConfiguredExecutors(sparkSession.sparkContext)).
-      mapPartitionsWithIndex { case (index, iter) =>
+    (frame.rdd.mapPartitionsWithIndex { case (index, iter) =>
         CarbonProperties.getInstance().addProperty(CarbonLoadOptionConstants
           .ENABLE_CARBON_LOAD_DIRECT_WRITE_TO_STORE_PATH, "true")
-        val confB = config.value.value
+        val confB = new Configuration(config.value.value)

Review comment:
       Without this change, the CI throws the following exception (checked in several different envs):
   java.lang.RuntimeException: Store location not set for the key __temptable-f8c5bbbf-0b73-4288-9438-146283d442c0_1592586283728_null_e8112868-3cf4-442d-bf9c-05f90bfca8240x0
        at org.apache.carbondata.processing.loading.TableProcessingOperations.deleteLocalDataLoadFolderLocation(TableProcessingOperations.java:125)
   
   
   The root cause is the line: "val context = new TaskAttemptContextImpl(**confB**, attemptID)".
   Since confB refers to the shared broadcast Configuration, different contexts end up with the same taskno, which is strange.
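   To make the shared-object problem concrete, here is a minimal, self-contained sketch (not the CarbonData code; `demo.task.no` is a made-up key) of what goes wrong when every task writes its per-task values into the one broadcast Configuration instead of into its own copy:

```scala
import org.apache.hadoop.conf.Configuration

object SharedConfSketch {
  def main(args: Array[String]): Unit = {
    // Stands in for config.value.value: one shared mutable object per executor JVM.
    val sharedConf = new Configuration()

    // Two tasks running in the same executor both write their per-task value
    // into the shared object; the later write clobbers the earlier one.
    sharedConf.set("demo.task.no", "0") // task 0
    sharedConf.set("demo.task.no", "1") // task 1, concurrent in the same JVM
    println(sharedConf.get("demo.task.no")) // "1" -- task 0 now sees the wrong taskno

    // What the fix does: each task copies the broadcast conf before touching it.
    val conf0 = new Configuration(sharedConf); conf0.set("demo.task.no", "0")
    val conf1 = new Configuration(sharedConf); conf1.set("demo.task.no", "1")
    println(conf0.get("demo.task.no")) // "0"
    println(conf1.get("demo.task.no")) // "1" -- values stay isolated per task
  }
}
```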
   
   With this change, the exception disappears and the CI passes.
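   For reference, a minimal sketch of the pattern the change applies (a hypothetical job, using `SerializableWritable` rather than the helpers in this file, and a made-up `demo.task.index` key): broadcast the Hadoop conf once and copy it inside `mapPartitionsWithIndex`, so each task mutates only its own Configuration:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.spark.{SerializableWritable, SparkContext}

object PerTaskConfSketch {
  // Broadcast a Hadoop conf once, then give every task a private copy of it.
  def run(sc: SparkContext, hadoopConf: Configuration): Long = {
    val broadcastConf = sc.broadcast(new SerializableWritable(hadoopConf))
    sc.parallelize(0 until 4, numSlices = 4)
      .mapPartitionsWithIndex { case (index, iter) =>
        // Copy, never reuse: broadcastConf.value.value is shared by all tasks in this executor.
        val taskConf = new Configuration(broadcastConf.value.value)
        taskConf.set("demo.task.index", index.toString) // hypothetical per-task setting
        // A real task would build its writer (TaskAttemptContext / RecordWriter) from taskConf here.
        iter
      }
      .count()
  }
}
```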
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

