[jira] [Updated] (HBASE-18570) use hbase-spark without HBaseContext runs into NPE

2017-08-11 Thread Yuexin Zhang (JIRA)

 [ https://issues.apache.org/jira/browse/HBASE-18570?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuexin Zhang updated HBASE-18570:
-
Description: 
I recently ran into the same issue described on Stack Overflow:

https://stackoverflow.com/questions/38865558/sparksql-dataframes-does-not-work-in-spark-shell-and-application#

If we don't explicitly initialize an HBaseContext and don't set the 
hbase.use.hbase.context option to false, the datasource runs into an NPE at:
{code}
val wrappedConf = new SerializableConfiguration(hbaseContext.config)
{code}

https://github.com/apache/hbase/blob/master/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/DefaultSource.scala#L140
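
For context, a minimal way to trigger it from a Spark application (a sketch: the table name and catalog JSON below are made up for illustration, and "catalog" is the option key DefaultSource reads):

{code}
// Hypothetical repro: use the hbase-spark datasource without ever
// constructing an HBaseContext. LatestHBaseContextCache.latest is still
// null, so the SerializableConfiguration line above throws an NPE.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("hbase-npe-repro").getOrCreate()

// Illustrative catalog mapping a made-up table "test_table".
val catalog =
  """{
    |  "table": {"namespace": "default", "name": "test_table"},
    |  "rowkey": "key",
    |  "columns": {
    |    "col0": {"cf": "rowkey", "col": "key", "type": "string"},
    |    "col1": {"cf": "cf1", "col": "col1", "type": "string"}
    |  }
    |}""".stripMargin

// No HBaseContext has been created, and hbase.use.hbase.context
// defaults to true, so DefaultSource picks up the null cached context.
val df = spark.read
  .option("catalog", catalog)
  .format("org.apache.hadoop.hbase.spark")
  .load()  // NPE here, before any HBase scan starts
{code}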

Should we safeguard with a null check on hbaseContext?

Something like: 
{code}
// Reuse the latest cached HBaseContext if one exists, otherwise create one.
val hbaseContext: HBaseContext =
  if (useHBaseContext && null != LatestHBaseContextCache.latest) {
    LatestHBaseContextCache.latest
  } else {
    val config = HBaseConfiguration.create()
    configResources.split(",").foreach(r => config.addResource(r))
    new HBaseContext(sqlContext.sparkContext, config)
  }
{code}

Or maybe it's better to make sure an HBaseContext is always instantiated properly in the first place.
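
For anyone hitting this in the meantime, explicitly creating an HBaseContext before touching the datasource avoids the NPE, since (as far as I can tell) the HBaseContext constructor caches itself in LatestHBaseContextCache. A minimal sketch; the resource name is illustrative:

{code}
// Workaround sketch: build an HBaseContext once up front. Its constructor
// records itself as LatestHBaseContextCache.latest, so DefaultSource later
// finds a non-null context instead of NPE-ing.
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.spark.HBaseContext

val config = HBaseConfiguration.create()
config.addResource("hbase-site.xml")  // illustrative config resource

// spark is the active SparkSession, as in the repro above.
new HBaseContext(spark.sparkContext, config)

// Reads through format("org.apache.hadoop.hbase.spark") now succeed.
{code}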


> use hbase-spark without HBaseContext runs into NPE
> --
>
> Key: HBASE-18570
> URL: https://issues.apache.org/jira/browse/HBASE-18570
> Project: HBase
>  Issue Type: Improvement
>  Components: hbase
>Affects Versions: 1.2.0
>Reporter: Yuexin Zhang
>Priority: Minor
>



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

