[ https://issues.apache.org/jira/browse/SPARK-3641?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14232080#comment-14232080 ]
Michael Armbrust commented on SPARK-3641:
-----------------------------------------

This has been fixed for a while and will be in Spark 1.2; nearly final RCs are already available. Running any other Spark query that does not use applySchema in the same thread should also work around the issue.

> Correctly populate SparkPlan.currentContext
> -------------------------------------------
>
>                 Key: SPARK-3641
>                 URL: https://issues.apache.org/jira/browse/SPARK-3641
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: Yin Huai
>            Assignee: Michael Armbrust
>            Priority: Critical
>             Fix For: 1.2.0
>
>
> After creating a new SQLContext, we need to populate SparkPlan.currentContext
> before we create any SparkPlan. Right now, only SQLContext.createSchemaRDD
> populates SparkPlan.currentContext. SQLContext.applySchema is missing this
> call, and we can get an NPE as described in
> http://qnalist.com/questions/5162981/spark-sql-1-1-0-npe-when-join-two-cached-table.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
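The workaround mentioned in the comment (run some other query in the same thread before calling applySchema) can be sketched roughly as below. This is a hedged, illustrative sketch against the Spark 1.1.x Scala API, not the fix itself; the table names, the Dummy case class, and the schema are invented for illustration, and it assumes the 1.1-era imports where Row, StructType, StructField, and StringType are reachable via org.apache.spark.sql._ :

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql._

// Hypothetical record type used only to trigger createSchemaRDD.
case class Dummy(id: Int)

object ApplySchemaWorkaround {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("SPARK-3641-workaround").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD

    // Workaround: first run any query that goes through createSchemaRDD,
    // in the SAME thread. createSchemaRDD populates the thread-local
    // SparkPlan.currentContext, which applySchema (in 1.1.0) does not.
    sc.parallelize(Seq(Dummy(1))).registerTempTable("dummy")
    sqlContext.sql("SELECT * FROM dummy").count()

    // With currentContext populated, applySchema no longer hits the NPE
    // described in SPARK-3641.
    val schema = StructType(StructField("name", StringType, nullable = true) :: Nil)
    val rowRDD = sc.parallelize(Seq(Row("alice")))
    val people = sqlContext.applySchema(rowRDD, schema)
    people.registerTempTable("people")
    sqlContext.sql("SELECT name FROM people").collect()

    sc.stop()
  }
}
```

On Spark 1.2.0 and later the priming query is unnecessary, since the fix makes applySchema populate SparkPlan.currentContext itself.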