[ https://issues.apache.org/jira/browse/SPARK-41099?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17632048#comment-17632048 ]
Bo Zhang edited comment on SPARK-41099 at 11/11/22 3:08 AM:
------------------------------------------------------------

To keep the exceptions exposed to users who use the RDD APIs, we will not change this. See https://github.com/apache/spark/pull/38602#issuecomment-1310755154

was (Author: bozhang): To keep the exceptions exposed to users who use the RDD APIs, we will not change this.

> Do not wrap exceptions thrown in SparkHadoopWriter.write with SparkException
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-41099
>                 URL: https://issues.apache.org/jira/browse/SPARK-41099
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Bo Zhang
>            Priority: Major
>
> This is similar to https://issues.apache.org/jira/browse/SPARK-40488.
> Exceptions thrown in SparkHadoopWriter.write are wrapped with
> SparkException("Job aborted.").
> This wrapping provides little extra information, but generates a long
> stack trace, which hinders debugging when an error occurs.
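For context, a minimal Scala sketch of the wrapping pattern the issue describes (this is an illustrative reconstruction, not the actual Spark source; `runJob` is a hypothetical stand-in for the Hadoop write job):

{code:scala}
import org.apache.spark.SparkException

object WrappingSketch {
  // Hypothetical stand-in for the underlying Hadoop write job.
  private def runJob(): Unit = throw new java.io.IOException("disk full")

  // The pattern at issue: the original exception becomes the cause of a
  // generic SparkException("Job aborted."), adding a stack-trace layer
  // without adding information about the actual failure.
  def write(): Unit =
    try runJob()
    catch {
      case cause: Throwable =>
        throw new SparkException("Job aborted.", cause)
    }

  def main(args: Array[String]): Unit = write()
}
{code}

Callers see "org.apache.spark.SparkException: Job aborted." at the top of the trace and must scroll to the nested cause (here the IOException) to find the real error; removing the wrapper would surface the cause directly, but, per the comment above, the wrapper is kept so that the exception type exposed to RDD API users does not change.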