[ https://issues.apache.org/jira/browse/SPARK-3641?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14231964#comment-14231964 ]

Kapil Malik commented on SPARK-3641:
------------------------------------

Hi all,

Is this expected to be fixed in the Spark 1.2.0 release?
Is there any workaround I can use until then? A simple join on two cached tables is
a pretty common use case. Can I do anything to avoid the NPE there?

Regards,

Kapil

> Correctly populate SparkPlan.currentContext
> -------------------------------------------
>
>                 Key: SPARK-3641
>                 URL: https://issues.apache.org/jira/browse/SPARK-3641
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: Yin Huai
>            Assignee: Michael Armbrust
>            Priority: Critical
>
> After creating a new SQLContext, we need to populate SparkPlan.currentContext 
> before we create any SparkPlan. Right now, only SQLContext.createSchemaRDD 
> populates SparkPlan.currentContext. SQLContext.applySchema is missing this 
> call, so we can hit an NPE as described in 
> http://qnalist.com/questions/5162981/spark-sql-1-1-0-npe-when-join-two-cached-table.
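For context, here is a minimal plain-Scala sketch (no Spark dependency) of the failure mode the description refers to: a plan object captures its context from a thread-local at construction time, so any code path that builds a plan without first populating the thread-local produces a plan with a null context. The names `CurrentContext` and `Plan` are hypothetical stand-ins for `SparkPlan.currentContext` and `SparkPlan`, chosen only for illustration.

```scala
// Hypothetical sketch of the SPARK-3641 failure mode. `CurrentContext` and
// `Plan` are illustrative stand-ins, not real Spark classes.
object CurrentContext {
  // analogous to SparkPlan.currentContext: a thread-local holding the context
  val currentContext = new ThreadLocal[String]()
}

class Plan {
  // captured at construction time, like SparkPlan's reference to its SQLContext
  val context: String = CurrentContext.currentContext.get()
  def run(): Int = context.length // throws NPE if the thread-local was never set
}

object Demo {
  def main(args: Array[String]): Unit = {
    // a code path that sets the thread-local first (as createSchemaRDD does)
    CurrentContext.currentContext.set("ctx")
    println(new Plan().run()) // works: context was populated

    // a code path that forgets to set it (as applySchema did in 1.1.0)
    CurrentContext.currentContext.remove()
    try new Plan().run()
    catch { case _: NullPointerException => println("NPE") }
  }
}
```

The sketch only illustrates why the NPE surfaces at plan execution rather than at table creation; the actual fix tracked by this issue is to populate the thread-local on every SparkPlan-creating path.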



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
