Sorry for my persistence, but did you actually run "mvn dependency:tree
-Dverbose=true"? And did you see only scala 2.10.5 being pulled in?
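For context, the NoSuchMethodError in this thread points at scala.runtime.ObjectRef.create, a factory method that only exists in scala-library 2.11+, so a jar compiled against Scala 2.11 is almost certainly ending up on the 2.10.5 classpath. A sketch of what to look for in the verbose tree output (the 2.11.7 entry below is a hypothetical example of a conflicting transitive dependency, not actual output from Ben's build):

```shell
# Run in the module directory:
#   mvn dependency:tree -Dverbose | grep -i scala-library
# With verbose output, Maven also prints the "omitted for conflict" entries,
# so a transitive scala-library 2.11.x shows up even when 2.10.5 wins
# dependency resolution. The same grep works offline on saved output:
printf '%s\n%s\n' \
  'org.scala-lang:scala-library:jar:2.10.5:compile' \
  '(org.scala-lang:scala-library:jar:2.11.7:compile - omitted for conflict with 2.10.5)' \
  | grep -i scala-library
```

If a 2.11.x line appears, adding an exclusion for scala-library on the offending dependency (or using an artifact built for 2.10) should resolve the error.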

On Fri, Feb 3, 2017 at 12:33 PM, Benjamin Kim <bbuil...@gmail.com> wrote:

> Asher,
>
> It’s still the same. Do you have any other ideas?
>
> Cheers,
> Ben
>
>
> On Feb 3, 2017, at 8:16 AM, Asher Krim <ak...@hubspot.com> wrote:
>
> Did you check the actual Maven dep tree? Something might be pulling in a
> different version. Also, if you're seeing this locally, you might want to
> check which version of the Scala SDK your IDE is using.
>
> Asher Krim
> Senior Software Engineer
>
> On Thu, Feb 2, 2017 at 5:43 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>
>> Hi Asher,
>>
>> I modified the pom to be the same Spark (1.6.0), HBase (1.2.0), and Java
>> (1.8) version as our installation. The Scala (2.10.5) version is already
>> the same as ours. But I’m still getting the same error. Can you think of
>> anything else?
>>
>> Cheers,
>> Ben
>>
>>
>> On Feb 2, 2017, at 11:06 AM, Asher Krim <ak...@hubspot.com> wrote:
>>
>> Ben,
>>
>> That looks like a Scala version mismatch. Have you checked your dep tree?
>>
>> Asher Krim
>> Senior Software Engineer
>>
>> On Thu, Feb 2, 2017 at 1:28 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>>
>>> Elek,
>>>
>>> Can you give me some sample code? I can’t get mine to work.
>>>
>>> import org.apache.spark.sql.{SQLContext, _}
>>> import org.apache.spark.sql.execution.datasources.hbase._
>>> import org.apache.spark.{SparkConf, SparkContext}
>>>
>>> def cat = s"""{
>>>     |"table":{"namespace":"ben", "name":"dmp_test", "tableCoder":"PrimitiveType"},
>>>     |"rowkey":"key",
>>>     |"columns":{
>>>         |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
>>>         |"col1":{"cf":"d", "col":"google_gid", "type":"string"}
>>>     |}
>>> |}""".stripMargin
>>>
>>> import sqlContext.implicits._
>>>
>>> def withCatalog(cat: String): DataFrame = {
>>>     sqlContext
>>>         .read
>>>         .options(Map(HBaseTableCatalog.tableCatalog->cat))
>>>         .format("org.apache.spark.sql.execution.datasources.hbase")
>>>         .load()
>>> }
>>>
>>> val df = withCatalog(cat)
>>> df.show
>>>
>>>
>>> It gives me this error.
>>>
>>> java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
>>> at org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog$.apply(HBaseTableCatalog.scala:232)
>>> at org.apache.spark.sql.execution.datasources.hbase.HBaseRelation.<init>(HBaseRelation.scala:77)
>>> at org.apache.spark.sql.execution.datasources.hbase.DefaultSource.createRelation(HBaseRelation.scala:51)
>>> at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
>>>
>>>
>>> If you can please help, I would be grateful.
>>>
>>> Cheers,
>>> Ben
>>>
>>>
>>> On Jan 31, 2017, at 1:02 PM, Marton, Elek <h...@anzix.net> wrote:
>>>
>>>
>>> I tested this one with hbase 1.2.4:
>>>
>>> https://github.com/hortonworks-spark/shc
>>>
>>> Marton
>>>
>>> On 01/31/2017 09:17 PM, Benjamin Kim wrote:
>>>
>>> Does anyone know how to backport the HBase Spark module to HBase 1.2.0?
>>> I tried to build it from source, but I cannot get it to work.
>>>
>>> Thanks,
>>> Ben
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>>
>>>
>>>
>>>
>>>
>>
>>
>
>
