[ https://issues.apache.org/jira/browse/SPARK-47899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Arjun Sahoo updated SPARK-47899:
--------------------------------
    Description: 
As part of SPARK-39195, the task is marked as failed but the exception chain is 
not propagated, so the cause ends up `null` in the resulting SparkException. 
That makes it hard to find the root cause from the detail message alone.

{code}
  /**
   * Called by the OutputCommitCoordinator to cancel the stage because data duplication may happen.
   */
  private[scheduler] def stageFailed(stageId: Int, reason: String): Unit = {
    eventProcessLoop.post(StageFailed(stageId, reason, None))
  }
{code}
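A minimal, self-contained sketch of the proposed direction (the `EventLoop`, `Scheduler`, and `Demo` names below are illustrative stand-ins for the DAGScheduler internals, not Spark's actual API): thread an optional `Throwable` through the `StageFailed` event instead of hard-coding `None`, so the cause survives into the eventual SparkException.

```scala
// Simplified model of the event: carries the failure cause, if any.
case class StageFailed(stageId: Int, reason: String, exception: Option[Throwable])

// Stand-in for eventProcessLoop: records the last posted event.
object EventLoop {
  var lastPosted: Option[StageFailed] = None
  def post(event: StageFailed): Unit = lastPosted = Some(event)
}

object Scheduler {
  // Proposed shape: accept the failure cause and forward it, instead of
  // posting StageFailed(stageId, reason, None) and dropping the chain.
  def stageFailed(stageId: Int, reason: String,
      exception: Option[Throwable] = None): Unit =
    EventLoop.post(StageFailed(stageId, reason, exception))
}

object Demo extends App {
  val cause = new java.io.IOException("task commit denied")
  Scheduler.stageFailed(1, "Authorized committer failed", Some(cause))
  // The cause is now available downstream instead of being null.
  println(EventLoop.lastPosted.flatMap(_.exception).map(_.getMessage))
}
```

With the cause attached to the event, whatever builds the SparkException downstream can pass it as the exception's cause, so applications see the full chain via `getCause`.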

  was:As part of SPARK-39195, the task is marked as failed but the exception 
chain is not propagated, so the cause becomes `null` in SparkException. 
Applications are unable to get the root cause because the cause is null.


> StageFailed event should attach the exception chain
> ---------------------------------------------------
>
>                 Key: SPARK-47899
>                 URL: https://issues.apache.org/jira/browse/SPARK-47899
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.4.0
>            Reporter: Arjun Sahoo
>            Assignee: BingKun Pan
>            Priority: Minor
>
> As part of SPARK-39195, the task is marked as failed but the exception chain is 
> not propagated, so the cause ends up `null` in the resulting SparkException. 
> That makes it hard to find the root cause from the detail message alone.
> {code}
>   /**
>    * Called by the OutputCommitCoordinator to cancel the stage because data duplication may happen.
>    */
>   private[scheduler] def stageFailed(stageId: Int, reason: String): Unit = {
>     eventProcessLoop.post(StageFailed(stageId, reason, None))
>   }
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
