[ https://issues.apache.org/jira/browse/SPARK-10550?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14740959#comment-14740959 ]
shao lo commented on SPARK-10550:
---------------------------------

I do not know if this is valid Scala, but something like...

    @transient private[sql] lazy val listener = new SQLListener(this) {
      sparkContext.addSparkListener(listener)
      sparkContext.ui.foreach(new SQLTab(this, _))
    }

What I really need is a way to initialize an extended SQLContext with a reference to an already initialized SQLContext. In 1.4.1 I was able to do the following...

    public class MySQLContext extends SQLContext {
      SparkContext sc;

      public MySQLContext(JavaSparkContext sc) {
        super(sc.sc());
        this.sc = sc.sc();
      }
    }

    SparkContext ctx1 = new SparkContext(sparkConf);
    JavaSparkContext ctx = new JavaSparkContext(ctx1);
    MySQLContext sqlCtx = new MySQLContext(ctx);

> SQLListener error constructing extended SQLContext
> ---------------------------------------------------
>
>                 Key: SPARK-10550
>                 URL: https://issues.apache.org/jira/browse/SPARK-10550
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>            Reporter: shao lo
>            Priority: Minor
>
> With Spark 1.4.1 I was able to create a custom SQLContext class. With Spark
> 1.5.0, I now get an error calling the superclass constructor. The problem
> is related to this new code that was added between 1.4.1 and 1.5.0:
>
>     // `listener` should be only used in the driver
>     @transient private[sql] val listener = new SQLListener(this)
>     sparkContext.addSparkListener(listener)
>     sparkContext.ui.foreach(new SQLTab(this, _))
>
> ...which generates
>
>     Exception in thread "main" java.lang.NullPointerException
>         at org.apache.spark.sql.execution.ui.SQLListener.<init>(SQLListener.scala:34)
>         at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
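As a follow-up to the lazy val suggestion above: the anonymous-subclass form probably does not do what is intended, since the braces after new SQLListener(this) define a subclass body rather than an initializer. A minimal sketch of the same idea as a lazy initializer block, assuming it replaces the three listener lines inside SQLContext.scala; this is untested and is not necessarily how the Spark developers would choose to fix it:

    // Sketch only: create and register the listener lazily, on first access,
    // instead of during the SQLContext constructor.
    @transient private[sql] lazy val listener: SQLListener = {
      val l = new SQLListener(this)
      sparkContext.addSparkListener(l)
      sparkContext.ui.foreach(new SQLTab(this, _))
      l
    }

One trade-off with this form is that the listener and the SQL tab would only be registered the first time something touches listener, which could change when the SQL tab appears in the UI.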