Hi all, I am not sure whether this email is reaching the community members.
Could somebody please acknowledge?

> On 30-Sep-2017, at 5:02 PM, Debabrata Ghosh <mailford...@gmail.com> wrote:
> 
> Dear All,
>                Greetings! I am repeatedly hitting a NullPointerException
> while saving a Scala DataFrame to HBase. Could you please help me resolve
> this? Here is the code snippet:
> 
> scala> def catalog = s"""{
>      |        |"table":{"namespace":"default", "name":"table1"},
>      |        |"rowkey":"key",
>      |        |"columns":{
>      |          |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
>      |          |"col1":{"cf":"cf1", "col":"col1", "type":"string"}
>      |        |}
>      |      |}""".stripMargin
> catalog: String
> 
> scala> case class HBaseRecord(
>      |    col0: String,
>      |    col1: String)
> defined class HBaseRecord
> 
> scala> val data = (0 to 255).map { i =>  HBaseRecord(i.toString, "extra")}
> data: scala.collection.immutable.IndexedSeq[HBaseRecord] =
> Vector(HBaseRecord(0,extra), HBaseRecord(1,extra), HBaseRecord(2,extra),
> HBaseRecord(3,extra), HBaseRecord(4,extra), HBaseRecord(5,extra),
> HBaseRecord(6,extra), HBaseRecord(7,extra), HBaseRecord(8,extra),
> HBaseRecord(9,extra), HBaseRecord(10,extra), HBaseRecord(11,extra),
> HBaseRecord(12,extra), HBaseRecord(13,extra), HBaseRecord(14,extra),
> HBaseRecord(15,extra), HBaseRecord(16,extra), HBaseRecord(17,extra),
> HBaseRecord(18,extra), HBaseRecord(19,extra), HBaseRecord(20,extra),
> HBaseRecord(21,extra), HBaseRecord(22,extra), HBaseRecord(23,extra),
> HBaseRecord(24,extra), HBaseRecord(25,extra), HBaseRecord(26,extra),
> HBaseRecord(27,extra), HBaseRecord(28,extra), HBaseRecord(29,extra),
> HBaseRecord(30,extra), HBaseRecord(31,extra), HBase...
> 
> scala> import org.apache.spark.sql.datasources.hbase
> import org.apache.spark.sql.datasources.hbase
>
> scala> import org.apache.spark.sql.datasources.hbase.{HBaseTableCatalog}
> import org.apache.spark.sql.datasources.hbase.HBaseTableCatalog
> 
> scala> sc.parallelize(data).toDF.write.options(Map(HBaseTableCatalog.tableCatalog -> catalog, HBaseTableCatalog.newTable -> "5")).format("org.apache.hadoop.hbase.spark").save()
> 
> java.lang.NullPointerException
>   at org.apache.hadoop.hbase.spark.HBaseRelation.<init>(DefaultSource.scala:134)
>   at org.apache.hadoop.hbase.spark.DefaultSource.createRelation(DefaultSource.scala:75)
>   at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:426)
>   at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
>   ... 56 elided
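>
> In case it helps with reproducing, here is the same sequence written as a
> standalone job rather than a shell session. This is only a minimal sketch of
> what I am running: HBaseWriteRepro and the appName are placeholder names, and
> it assumes the connector jar that provides HBaseTableCatalog, plus an
> hbase-site.xml, are already on the classpath.
>
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.datasources.hbase.HBaseTableCatalog
>
> // Same record layout as in the shell session above.
> case class HBaseRecord(col0: String, col1: String)
>
> object HBaseWriteRepro {
>
>   // Same catalog as above: col0 maps to the row key, col1 to column
>   // "col1" in column family "cf1" of table "table1".
>   val catalog: String = s"""{
>     |"table":{"namespace":"default", "name":"table1"},
>     |"rowkey":"key",
>     |"columns":{
>       |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
>       |"col1":{"cf":"cf1", "col":"col1", "type":"string"}
>     |}
>   |}""".stripMargin
>
>   def main(args: Array[String]): Unit = {
>     val spark = SparkSession.builder().appName("HBaseWriteRepro").getOrCreate()
>     import spark.implicits._
>
>     val data = (0 to 255).map(i => HBaseRecord(i.toString, "extra"))
>
>     // Same write call that fails in the shell: the NullPointerException
>     // is thrown from HBaseRelation's constructor when save() runs.
>     spark.sparkContext.parallelize(data).toDF
>       .write
>       .options(Map(HBaseTableCatalog.tableCatalog -> catalog,
>                    HBaseTableCatalog.newTable -> "5"))
>       .format("org.apache.hadoop.hbase.spark")
>       .save()
>
>     spark.stop()
>   }
> }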
> 
> 
> Thanks in advance!
> 
> Debu
> 
