[ https://issues.apache.org/jira/browse/SPARK-5063?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14263415#comment-14263415 ]

Apache Spark commented on SPARK-5063:
-------------------------------------

User 'JoshRosen' has created a pull request for this issue:
https://github.com/apache/spark/pull/3884

> Raise more helpful errors when RDD actions or transformations are called inside of transformations
> --------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-5063
>                 URL: https://issues.apache.org/jira/browse/SPARK-5063
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>
> Spark does not support nested RDDs or performing Spark actions inside of
> transformations; this usually leads to NullPointerExceptions (see SPARK-718
> for one example). The confusing NPE is one of the most common sources of
> Spark questions on Stack Overflow:
> - https://stackoverflow.com/questions/13770218/call-of-distinct-and-map-together-throws-npe-in-spark-library/14130534#14130534
> - https://stackoverflow.com/questions/23793117/nullpointerexception-in-scala-spark-appears-to-be-caused-be-collection-type/23793399#23793399
> - https://stackoverflow.com/questions/25997558/graphx-ive-got-nullpointerexception-inside-mapvertices/26003674#26003674
> (those are just a sample of the ones that I've answered personally; there are 
> many others).
> I think we can detect these errors by adding logic to {{RDD}} to check
> whether {{sc}} is null (e.g. by turning {{sc}} into a getter function); we
> can use this check to raise a more helpful error message.
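For concreteness, the nested pattern described in the quoted issue boils down to something like the following (hypothetical RDD names; this is the shape of code behind the Stack Overflow reports above):

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

object NestedRddExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("nested-rdd").setMaster("local[2]"))
    val rdd1 = sc.parallelize(1 to 10)
    val rdd2 = sc.parallelize(1 to 10)

    // Invalid: the closure passed to map() is serialized and run as a task,
    // and the copy of rdd2 captured in it deserializes with a null
    // SparkContext, so count() fails with an opaque NullPointerException
    // rather than a message explaining the real problem.
    val broken = rdd1.map(x => rdd2.count() * x)
    broken.collect()
  }
}
{code}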
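A minimal sketch of the proposed getter-based check (the actual change is in the pull request above; the field name and message wording here are illustrative): keep the context in a {{@transient}} field and route all access through a getter that fails fast once the field has deserialized to null on an executor.

{code:scala}
import org.apache.spark.{SparkContext, SparkException}

abstract class RDD[T](@transient private var _sc: SparkContext)
  extends Serializable {

  // _sc is @transient, so when an RDD is captured in a closure and shipped
  // to an executor it deserializes to null; the getter turns the eventual
  // NPE into a SparkException with a descriptive message.
  def sc: SparkContext = {
    if (_sc == null) {
      throw new SparkException(
        "RDD transformations and actions can only be invoked by the driver, " +
        "not inside of other transformations; for example, " +
        "rdd1.map(x => rdd2.values.count() * x) is invalid because the " +
        "values transformation and count action cannot be performed inside " +
        "of the rdd1.map transformation. For more information, see SPARK-5063.")
    }
    _sc
  }
}
{code}

Call sites inside {{RDD}} would then read {{sc}} instead of the raw field, so every transformation and action picks up the check without further changes.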


