Ben,

That looks like a Scala version mismatch. scala.runtime.ObjectRef.create was only added in Scala 2.11, so this usually means a jar built against 2.11 is running on a 2.10 runtime (or vice versa). Have you checked your dependency tree?

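As a quick sanity check (just a sketch, assuming you are running this in spark-shell), you can print the Scala version the driver is actually using and compare it against the Scala suffix (_2.10 / _2.11) of the shc jar you built:

// prints something like "version 2.10.5" or "version 2.11.8"
println(scala.util.Properties.versionString)

If the connector was compiled against 2.11 but the shell reports 2.10 (or the other way around), rebuilding shc against your cluster's Scala version should make the NoSuchMethodError go away.
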
Asher Krim
Senior Software Engineer

On Thu, Feb 2, 2017 at 1:28 PM, Benjamin Kim <bbuil...@gmail.com> wrote:

> Elek,
>
> Can you give me some sample code? I can’t get mine to work.
>
> import org.apache.spark.sql.{SQLContext, _}
> import org.apache.spark.sql.execution.datasources.hbase._
> import org.apache.spark.{SparkConf, SparkContext}
>
> def cat = s"""{
>     |"table":{"namespace":"ben", "name":"dmp_test",
> "tableCoder":"PrimitiveType"},
>     |"rowkey":"key",
>     |"columns":{
>         |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
>         |"col1":{"cf":"d", "col":"google_gid", "type":"string"}
>     |}
> |}""".stripMargin
>
> import sqlContext.implicits._
>
> def withCatalog(cat: String): DataFrame = {
>     sqlContext
>         .read
>         .options(Map(HBaseTableCatalog.tableCatalog->cat))
>         .format("org.apache.spark.sql.execution.datasources.hbase")
>         .load()
> }
>
> val df = withCatalog(cat)
> df.show
>
>
> It gives me this error:
>
> java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
> at org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog$.apply(HBaseTableCatalog.scala:232)
> at org.apache.spark.sql.execution.datasources.hbase.HBaseRelation.<init>(HBaseRelation.scala:77)
> at org.apache.spark.sql.execution.datasources.hbase.DefaultSource.createRelation(HBaseRelation.scala:51)
> at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
> at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
>
>
> If you could help, I would be grateful.
>
> Cheers,
> Ben
>
>
> On Jan 31, 2017, at 1:02 PM, Marton, Elek <h...@anzix.net> wrote:
>
>
> I tested this one with HBase 1.2.4:
>
> https://github.com/hortonworks-spark/shc
>
> Marton
>
> On 01/31/2017 09:17 PM, Benjamin Kim wrote:
>
> Does anyone know how to backport the HBase Spark module to HBase 1.2.0? I
> tried to build it from source, but I cannot get it to work.
>
> Thanks,
> Ben
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
