[ https://issues.apache.org/jira/browse/SPARK-10550?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14743642#comment-14743642 ]

Sean Owen commented on SPARK-10550:
-----------------------------------

It's marked {{protected[sql]}}, which means it is not accessible outside 
{{org.apache.spark.sql}}. It isn't an API as such, not even an 'experimental' 
one. You're at your own risk if you access things like this, as they may 
change from version to version. (In the bytecode it ends up as merely 
"protected", though, since the JVM has no notion of "protected with respect 
to a package".) This is why I'm not sure this can be considered a 'bug', 
given what I understand you're trying to do.
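For illustration, here is a minimal sketch of how a package-qualified 
modifier restricts access in Scala (hypothetical packages, using 
{{private[sql]}} as in the snippet quoted below; not Spark's actual code):

{code:scala}
package org.example.sql {
  class Context {
    // Accessible anywhere inside package org.example.sql, nowhere else.
    private[sql] val listener: String = "listener"
  }

  class SamePackage {
    // Compiles: this class lives inside the qualifying package 'sql'.
    def peek(c: Context): String = c.listener
  }
}

package org.example.app {
  class OtherPackage {
    // Does not compile if uncommented: 'listener' is private[sql].
    // def peek(c: org.example.sql.Context): String = c.listener
  }
}
{code}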

> SQLListener error constructing extended SQLContext 
> ---------------------------------------------------
>
>                 Key: SPARK-10550
>                 URL: https://issues.apache.org/jira/browse/SPARK-10550
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>            Reporter: shao lo
>            Priority: Minor
>
> With Spark 1.4.1 I was able to create a custom SQLContext class.  With Spark 
> 1.5.0, I now get an error calling the super class constructor.  The problem 
> is related to this new code, which was added between 1.4.1 and 1.5.0:
>   // `listener` should be only used in the driver
>   @transient private[sql] val listener = new SQLListener(this)
>   sparkContext.addSparkListener(listener)
>   sparkContext.ui.foreach(new SQLTab(this, _))
> ...which generates:
> Exception in thread "main" java.lang.NullPointerException
>       at org.apache.spark.sql.execution.ui.SQLListener.<init>(SQLListener.scala:34)
>       at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
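A general Scala pitfall that produces exactly this shape of failure (an NPE 
thrown while the superclass constructor runs) is the initialization order of 
overridden vals: the super-constructor executes before the subclass's fields 
are initialized, so anything it reads through an overridden val sees null. A 
minimal sketch with hypothetical classes (not necessarily the reporter's 
exact code, nor Spark's):

{code:scala}
class Base {
  val name: String = "base"
  // Runs during Base's constructor, before Derived's fields exist.
  val length: Int = name.length
}

class Derived extends Base {
  // While Base's constructor runs, this val is still null.
  override val name: String = "derived"
}

object Demo extends App {
  new Derived()  // throws NullPointerException inside Base's constructor
}
{code}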


