[
https://issues.apache.org/jira/browse/HBASE-18570?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Nihal Jain resolved HBASE-18570.
--------------------------------
Fix Version/s: hbase-connectors-1.1.0
Hadoop Flags: Reviewed
Resolution: Fixed
> Fix NPE when HBaseContext was never initialized
> -----------------------------------------------
>
> Key: HBASE-18570
> URL: https://issues.apache.org/jira/browse/HBASE-18570
> Project: HBase
> Issue Type: Bug
> Affects Versions: 1.2.0
> Reporter: Yuexin Zhang
> Assignee: Junegunn Choi
> Priority: Minor
> Fix For: hbase-connectors-1.1.0
>
>
> I recently ran into the same issue described on Stack Overflow:
> https://stackoverflow.com/questions/38865558/sparksql-dataframes-does-not-work-in-spark-shell-and-application#
> If we don't explicitly initialize an HBaseContext and don't set the
> hbase.use.hbase.context option to false, the read fails with an NPE at:
> {code}
> val wrappedConf = new SerializableConfiguration(hbaseContext.config)
> {code}
> https://github.com/apache/hbase/blob/master/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/DefaultSource.scala#L140
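> For reference, here is a minimal way to trigger this. This is only a sketch: the table, column family, and catalog are made up, and the exact read API depends on the Spark and hbase-spark versions in use.
> {code}
> import org.apache.spark.sql.SparkSession
> import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog
>
> // Hypothetical catalog; table/column names are illustrative only.
> val catalog =
>   s"""{
>      |  "table": {"namespace": "default", "name": "t1"},
>      |  "rowkey": "key",
>      |  "columns": {
>      |    "col0": {"cf": "rowkey", "col": "key", "type": "string"},
>      |    "col1": {"cf": "cf1", "col": "c1", "type": "string"}
>      |  }
>      |}""".stripMargin
>
> val spark = SparkSession.builder().appName("hbase-npe-repro").getOrCreate()
>
> // No HBaseContext has been created, and hbase.use.hbase.context defaults to
> // true, so DefaultSource reaches for LatestHBaseContextCache.latest, which is
> // null; the NPE fires at new SerializableConfiguration(hbaseContext.config).
> val df = spark.read
>   .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
>   .format("org.apache.hadoop.hbase.spark")
>   .load()
> df.show()
> {code}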
> Should we safeguard with a null check on hbaseContext?
> Something like:
> {code}
> // Reuse the latest cached HBaseContext if one exists; otherwise build a
> // fresh one from the configured resources.
> val hbaseContext: HBaseContext =
>   if (useHBaseContext && LatestHBaseContextCache.latest != null) {
>     LatestHBaseContextCache.latest
>   } else {
>     val config = HBaseConfiguration.create()
>     configResources.split(",").foreach(r => config.addResource(r))
>     new HBaseContext(sqlContext.sparkContext, config)
>   }
> {code}
> Or maybe it's better to make sure the HBaseContext is instantiated properly.
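> For what it's worth, the workaround on the user side is to construct an HBaseContext once up front, since its constructor caches itself in LatestHBaseContextCache. A sketch, assuming an existing SparkSession named spark:
> {code}
> import org.apache.hadoop.hbase.HBaseConfiguration
> import org.apache.hadoop.hbase.spark.HBaseContext
>
> // Creating the context registers it in LatestHBaseContextCache, so
> // subsequent DefaultSource reads and writes find a non-null hbaseContext.
> val conf = HBaseConfiguration.create()
> new HBaseContext(spark.sparkContext, conf)
> {code}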