Github user squito commented on the pull request:

    https://github.com/apache/spark/pull/7028#issuecomment-115413847
  
    It would be great to add that to the exception msg; the current `Driver stacktrace` is not very useful. But I think just munging it in with the existing stack trace might be really confusing to Spark users. Java stack traces have a very standard interpretation (part of the reason they are so useful). E.g., if I saw the line `at ===== Job Submission =====.(Native Method)` I would probably just naively assume Spark was calling some magical native method, but that it was still a normal call stack (and I wouldn't think I need to look in the docs to interpret a stack trace).
    
    How about instead just making it a separate section in the msg? E.g.:
    
    ```
    org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.RuntimeException: uh-oh!
        at org.apache.spark.scheduler.DAGSchedulerSuite$$anonfun$33$$anonfun$34$$anonfun$apply$mcJ$sp$1.apply(DAGSchedulerSuite.scala:851)
    ...
    Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1285)
    ...
    Job Submission stacktrace:
        at org.apache.spark.rdd.RDD.count(RDD.scala:1095)
    ...
    ```
    
    Then you could also just do this in `JobWaiter` or `sc.runJob`, which would 
make it much simpler.
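    
    Roughly something like this, as a totally untested sketch of the idea: capture the caller's stack at submission time and, on failure, append it to the message as a labeled section instead of splicing it into the stack trace itself. `withSubmissionSite` and `JobFailedException` here are just illustrative stand-ins for wherever this would really live (`sc.runJob` / `JobWaiter` / `SparkException`):
    
    ```scala
    import scala.util.control.NonFatal
    
    object SubmissionSiteExample {
    
      // Stand-in for SparkException, which also has a (message, cause) constructor.
      class JobFailedException(message: String, cause: Throwable)
        extends RuntimeException(message, cause)
    
      // Capture the caller's stack at "submission" time, run the body, and on failure
      // re-throw with a "Job Submission stacktrace:" section appended to the message,
      // leaving the real stack trace / cause chain untouched.
      def withSubmissionSite[T](body: => T): T = {
        // drop(2) skips the getStackTrace frame and this method's own frame.
        val submissionSite = Thread.currentThread().getStackTrace.drop(2)
        try body
        catch {
          case NonFatal(e) =>
            val section = submissionSite
              .map(frame => s"        at $frame")
              .mkString("Job Submission stacktrace:\n", "\n", "")
            throw new JobFailedException(s"${e.getMessage}\n$section", e)
        }
      }
    
      def main(args: Array[String]): Unit = {
        // Simulates a job failing after submission, e.g. rdd.count() hitting "uh-oh!".
        withSubmissionSite { throw new RuntimeException("uh-oh!") }
      }
    }
    ```
    
    That keeps the cause chain and stack traces completely standard; the only change a user sees is the extra labeled section in the message.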

