[ https://issues.apache.org/jira/browse/SPARK-3641?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14143347#comment-14143347 ]
Yin Huai commented on SPARK-3641:
---------------------------------

[~marmbrus] Can we populate SparkPlan.currentContext in the constructor of SQLContext instead of populating it every time before using ExistingRDD?

> Correctly populate SparkPlan.currentContext
> -------------------------------------------
>
>                 Key: SPARK-3641
>                 URL: https://issues.apache.org/jira/browse/SPARK-3641
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: Yin Huai
>            Priority: Critical
>
> After creating a new SQLContext, we need to populate SparkPlan.currentContext
> before we create any SparkPlan. Right now, only SQLContext.createSchemaRDD
> populates SparkPlan.currentContext. SQLContext.applySchema is missing this
> call, and we can get an NPE as described in
> http://qnalist.com/questions/5162981/spark-sql-1-1-0-npe-when-join-two-cached-table.
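For context, below is a minimal Scala sketch of the situation the comment describes. The classes here are simplified stand-ins for the Spark SQL 1.1 internals (the real SparkPlan, SQLContext, createSchemaRDD, and applySchema are more involved), and it assumes currentContext is a thread-local holding the active SQLContext. It illustrates how a thread-local that is only set by createSchemaRDD can leave plans created through applySchema with a null context, and how setting it once in the SQLContext constructor, as suggested above, would cover both paths.

{code:scala}
// Simplified stand-ins, not the actual Spark source.

object SparkPlan {
  // Thread-local holding the SQLContext that new plans should capture.
  val currentContext = new ThreadLocal[SQLContext]()
}

abstract class SparkPlan {
  // A plan captures the context at construction time; if nothing has set the
  // thread-local yet, this is null and later use throws an NPE.
  @transient protected val sqlContext: SQLContext = SparkPlan.currentContext.get()
}

class SQLContext { self =>
  // Suggestion from the comment: set the thread-local once at construction
  // time, so every code path that builds a SparkPlan sees a context.
  SparkPlan.currentContext.set(self)

  // createSchemaRDD already sets the thread-local before building plans ...
  def createSchemaRDD(/* rdd: RDD[A] */): Unit = {
    SparkPlan.currentContext.set(self)
    // ... build a SchemaRDD around ExistingRdd ...
  }

  // ... but applySchema does not, which is the gap described in the issue.
  def applySchema(/* rowRDD: RDD[Row], schema: StructType */): Unit = {
    // Without the constructor-level set above, plans created here can capture
    // a null context and fail with an NPE, e.g. when joining cached tables.
  }
}
{code}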