Github user xuanyuanking commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19941#discussion_r156659744
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/InsertIntoDataSourceDirCommand.scala ---
    @@ -67,8 +67,9 @@ case class InsertIntoDataSourceDirCommand(
     
         val saveMode = if (overwrite) SaveMode.Overwrite else SaveMode.ErrorIfExists
         try {
    -      sparkSession.sessionState.executePlan(dataSource.planForWriting(saveMode, query))
    -      dataSource.writeAndRead(saveMode, query)
    --- End diff --
    
    I implemented it that way at first, but after checking this patch 
    (https://github.com/apache/spark/pull/18064/files), I changed to the 
    current implementation. Is the wrapping with an execution id unnecessary here?
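    For context, the two alternatives under discussion can be sketched roughly as 
    follows. This is a non-authoritative sketch: `planForWriting`, `writeAndRead`, 
    and `executePlan` are the names taken from the diff above, and the `.toRdd` 
    materialization step is an assumption about how the write would be triggered, 
    not something shown in this diff.
    
    ```scala
    // Sketch only, assuming the surrounding InsertIntoDataSourceDirCommand.run
    // context from the diff (sparkSession, dataSource, saveMode, query in scope).
    
    // Alternative A (earlier draft): build a QueryExecution for the write plan
    // via executePlan, so the run goes through the usual query-execution path.
    sparkSession.sessionState
      .executePlan(dataSource.planForWriting(saveMode, query))
      .toRdd // assumption: materializing the plan performs the actual write
    
    // Alternative B (current implementation in the diff): invoke the data
    // source's write path directly, bypassing the QueryExecution wrapper.
    dataSource.writeAndRead(saveMode, query)
    ```
    
    The question above is whether alternative B still needs an explicit 
    execution-id wrapper for the write to show up in the SQL UI, or whether 
    that wrapping is unnecessary here.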


---
