Re: HBase Spark

2017-02-03 Thread Benjamin Kim
Asher, I found a profile for Scala 2.11 and removed it. Now it brings in 2.10. I ran some code and got further. Now I get the error below when I do a “df.show”: java.lang.AbstractMethodError at org.apache.spark.Logging$class.log(Logging.scala:50) at
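An `AbstractMethodError` out of `org.apache.spark.Logging` is the classic symptom of bytecode compiled against one Scala binary version running on a classpath that carries another. A minimal, hedged sketch (pure Scala, no Spark needed) for confirming which Scala runtime actually wins on the classpath:

```scala
// Print the Scala runtime actually loaded on the classpath. If this reports
// 2.11.x while the pom declares 2.10.5, a transitive dependency is winning.
val scalaRuntime = scala.util.Properties.versionNumberString
println(s"Scala runtime on classpath: $scalaRuntime")
```

Running this inside the same JVM as the failing job (e.g. from spark-shell) shows the version Spark itself sees, which is the one that matters.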

Re: HBase Spark

2017-02-03 Thread Asher Krim
You can see in the tree what's pulling in 2.11. Your options then will be to either shade them or add an explicit dependency on 2.10.5 in your pom. Alternatively, you can explore upgrading your project to 2.11 (which will require using a 2.11 build of Spark). On Fri, Feb 3, 2017 at 2:03 PM,
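A hedged sketch of the pom change Asher describes. The offending artifact below is a placeholder; substitute whatever `mvn dependency:tree -Dverbose=true` actually shows pulling in 2.11:

```xml
<!-- Hypothetical offender: exclude the transitive 2.11 scala-library from
     whichever dependency the tree flags, then pin 2.10.5 explicitly. -->
<dependency>
  <groupId>some.group</groupId>
  <artifactId>lib-pulling-in-2.11</artifactId>
  <version>X.Y.Z</version>
  <exclusions>
    <exclusion>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.10.5</version>
</dependency>
```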

Re: HBase Spark

2017-02-03 Thread Benjamin Kim
Asher, You’re right. I don’t see anything but 2.11 being pulled in. Do you know where I can change this? Cheers, Ben > On Feb 3, 2017, at 10:50 AM, Asher Krim wrote: > > Sorry for my persistence, but did you actually run "mvn dependency:tree > -Dverbose=true"? And did

Re: HBase Spark

2017-02-03 Thread Asher Krim
Sorry for my persistence, but did you actually run "mvn dependency:tree -Dverbose=true"? And did you see only scala 2.10.5 being pulled in? On Fri, Feb 3, 2017 at 12:33 PM, Benjamin Kim wrote: > Asher, > > It’s still the same. Do you have any other ideas? > > Cheers, > Ben >

Re: HBase Spark

2017-02-03 Thread Benjamin Kim
Asher, It’s still the same. Do you have any other ideas? Cheers, Ben > On Feb 3, 2017, at 8:16 AM, Asher Krim wrote: > > Did you check the actual maven dep tree? Something might be pulling in a > different version. Also, if you're seeing this locally, you might want to >

Re: HBase Spark

2017-02-03 Thread Benjamin Kim
I'll clean up any .m2 or .ivy directories and try again. I ran this on our lab cluster for testing. Cheers, Ben On Fri, Feb 3, 2017 at 8:16 AM Asher Krim wrote: > Did you check the actual maven dep tree? Something might be pulling in a > different version. Also, if you're

Re: HBase Spark

2017-02-03 Thread Asher Krim
Did you check the actual maven dep tree? Something might be pulling in a different version. Also, if you're seeing this locally, you might want to check which version of the scala sdk your IDE is using Asher Krim Senior Software Engineer On Thu, Feb 2, 2017 at 5:43 PM, Benjamin Kim

Re: HBase Spark

2017-02-02 Thread Benjamin Kim
Hi Asher, I modified the pom to be the same Spark (1.6.0), HBase (1.2.0), and Java (1.8) version as our installation. The Scala (2.10.5) version is already the same as ours. But I’m still getting the same error. Can you think of anything else? Cheers, Ben > On Feb 2, 2017, at 11:06 AM, Asher

Re: HBase Spark

2017-02-02 Thread Asher Krim
Ben, That looks like a Scala version mismatch. Have you checked your dep tree? Asher Krim Senior Software Engineer On Thu, Feb 2, 2017 at 1:28 PM, Benjamin Kim wrote: > Elek, > > Can you give me some sample code? I can’t get mine to work. > > import

Re: HBase Spark

2017-02-02 Thread Benjamin Kim
Elek, Can you give me some sample code? I can’t get mine to work. import org.apache.spark.sql.{SQLContext, _} import org.apache.spark.sql.execution.datasources.hbase._ import org.apache.spark.{SparkConf, SparkContext} def cat = s"""{ |"table":{"namespace":"ben", "name":"dmp_test",
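A hedged, fuller version of the snippet Ben is assembling, following the shc (`hortonworks-spark/shc`) catalog format. The column family and qualifier names below are illustrative placeholders, not from the original post, and the commented read call assumes shc-core on the classpath:

```scala
// shc-style catalog: maps the HBase table "ben:dmp_test" onto DataFrame
// columns. The "rowkey" column family is shc's convention for the row key;
// the "d:payload" column is a hypothetical example qualifier.
def cat = s"""{
  |"table":{"namespace":"ben", "name":"dmp_test"},
  |"rowkey":"key",
  |"columns":{
  |  "col0":{"cf":"rowkey", "col":"key", "type":"string"},
  |  "col1":{"cf":"d", "col":"payload", "type":"string"}
  |}
}""".stripMargin

// With shc-core available, the read side would be (assumption):
//   import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog
//   val df = sqlContext.read
//     .options(Map(HBaseTableCatalog.tableCatalog -> cat))
//     .format("org.apache.spark.sql.execution.datasources.hbase")
//     .load()
//   df.show()
```

`stripMargin` drops the `|` guards, so `cat` evaluates to plain JSON that the connector parses at load time.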

Re: HBase Spark

2017-01-31 Thread Benjamin Kim
Elek, If I cannot use the HBase Spark module, then I’ll give it a try. Thanks, Ben > On Jan 31, 2017, at 1:02 PM, Marton, Elek wrote: > > > I tested this one with hbase 1.2.4: > > https://github.com/hortonworks-spark/shc > > Marton > > On 01/31/2017 09:17 PM, Benjamin Kim

Re: HBase Spark

2017-01-31 Thread Marton, Elek
I tested this one with hbase 1.2.4: https://github.com/hortonworks-spark/shc Marton On 01/31/2017 09:17 PM, Benjamin Kim wrote: Does anyone know how to backport the HBase Spark module to HBase 1.2.0? I tried to build it from source, but I cannot get it to work. Thanks, Ben

RE: HBase-Spark Module

2016-07-29 Thread David Newberger
Hi Ben, This seems more like a question for community.cloudera.com. However, I believe it would be under HBase, not Spark. https://repository.cloudera.com/artifactory/webapp/#/artifacts/browse/tree/General/cloudera-release-repo/org/apache/hbase/hbase-spark David Newberger -Original

Re: HBase / Spark Kerberos problem

2016-05-19 Thread Arun Natva
in/scala/org/apache/spark/deploy/yarn/Client.scala > [2] > https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnSparkHadoopUtil.scala > > > From: John Trengrove [mailto:john.trengr...@servian.com.au] > Sent: 19 May 2016 08:09 > To

RE: HBase / Spark Kerberos problem

2016-05-19 Thread philipp.meyerhoefer
to credentials” and the .count() on my HBase RDD works fine. From: Ellis, Tom (Financial Markets IT) [mailto:tom.el...@lloydsbanking.com] Sent: 19 May 2016 09:51 To: 'John Trengrove'; Meyerhoefer, Philipp (TR Technology & Ops) Cc: user Subject: RE: HBase / Spark Kerberos problem Yeah we ran in

RE: HBase / Spark Kerberos problem

2016-05-19 Thread Ellis, Tom (Financial Markets IT)
ngrove [mailto:john.trengr...@servian.com.au] Sent: 19 May 2016 08:09 To: philipp.meyerhoe...@thomsonreuters.com Cc: user Subject: Re: HBase / Spark Kerberos problem -- This email has reached the Bank via an external source -- Have you had a look at this issue? https://issues.apache.org/jira/browse/SPARK

Re: HBase / Spark Kerberos problem

2016-05-19 Thread John Trengrove
Have you had a look at this issue? https://issues.apache.org/jira/browse/SPARK-12279 There is a comment by Y Bodnar on how they successfully got Kerberos and HBase working. 2016-05-18 18:13 GMT+10:00 : > Hi all, > > I have been puzzling over a Kerberos

Re: HBase Spark Streaming giving error after restore

2015-10-17 Thread Amit Hora
Hi, Regrets for the delayed response; please find the full stack trace below: java.lang.ClassCastException: scala.runtime.BoxedUnit cannot be cast to org.apache.hadoop.hbase.client.Mutation at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:85) at

Re: HBase Spark Streaming giving error after restore

2015-10-17 Thread Aniket Bhatnagar
Can you try changing classOf[OutputFormat[String, BoxedUnit]] to classOf[OutputFormat[String, Put]] while configuring hconf? On Sat, Oct 17, 2015, 11:44 AM Amit Hora wrote: > Hi, > > Regresta for delayed resoonse > please find below full stack trace > >
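Aniket's fix works because `TableOutputFormat` only casts each value to `Mutation` at write time; the type arguments in `classOf[...]` are erased, so wiring in `BoxedUnit` instead of `Put` compiles cleanly and fails later inside `write()`. A minimal pure-Scala sketch of that erasure (no HBase classes required):

```scala
// Type arguments on classOf are erased: both expressions below yield the
// very same runtime Class object. That is why a wrong value type in
// classOf[OutputFormat[String, BoxedUnit]] is not caught at compile time
// and only surfaces as the ClassCastException seen in the stack trace.
val wrongTyped = classOf[Map[String, scala.runtime.BoxedUnit]]
val rightTyped = classOf[Map[String, String]]
assert(wrongTyped eq rightTyped) // identical classes after erasure
```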

Re: HBase Spark Streaming giving error after restore

2015-10-16 Thread Ted Yu
Can you show the complete stack trace? A subclass of Mutation is expected, and Put is a subclass. Have you tried replacing BoxedUnit with Put in your code? Cheers On Fri, Oct 16, 2015 at 6:02 AM, Amit Singh Hora wrote: > Hi All, > > I am using below code to stream data from

Re: Hbase Spark streaming issue.

2015-09-24 Thread Shixiong Zhu
Looks like you have an incompatible hbase-default.xml somewhere on the classpath. You can use the following code to find the location of "hbase-default.xml": println(Thread.currentThread().getContextClassLoader().getResource("hbase-default.xml")) Best Regards, Shixiong Zhu 2015-09-21 15:46 GMT+08:00 Siva
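Shixiong's one-liner, expanded into a self-contained sketch; the URL it prints tells you which jar is shipping the conflicting file (it prints null when no HBase jar is on the classpath at all):

```scala
// Locate which hbase-default.xml the context classloader resolves first.
// If the printed URL points inside an unexpected jar (e.g. a fat jar
// bundling a different HBase version), that jar ships the incompatible file.
val hbaseDefaultUrl = Thread.currentThread().getContextClassLoader()
  .getResource("hbase-default.xml")
println(s"hbase-default.xml resolved from: $hbaseDefaultUrl")
```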