Asher,
I found a profile for Spark 2.11 and removed it; now it brings in 2.10. I ran
some code and got further, but I get the error below when I do a “df.show”.
java.lang.AbstractMethodError
at org.apache.spark.Logging$class.log(Logging.scala:50)
at
You can see in the tree what's pulling in 2.11. Your options then are to
either shade those dependencies and add an explicit dependency on 2.10.5 in
your pom, or to explore upgrading your project to 2.11 (which will require
using a 2.11 build of Spark).
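For reference, pinning Scala in the pom might look like the sketch below. This is illustrative, not from the thread: "some.group:offending-artifact" is a placeholder for whatever your `mvn dependency:tree` output shows pulling in _2.11 jars.

```xml
<!-- Sketch: pin the Scala runtime explicitly -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.10.5</version>
</dependency>

<!-- Sketch: exclude the transitive 2.11 pull from the offending dependency.
     Replace group/artifact/version with what your dependency tree shows. -->
<dependency>
  <groupId>some.group</groupId>
  <artifactId>offending-artifact</artifactId>
  <version>x.y.z</version>
  <exclusions>
    <exclusion>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```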
On Fri, Feb 3, 2017 at 2:03 PM,
Asher,
You’re right. I don’t see anything but 2.11 being pulled in. Do you know where
I can change this?
Cheers,
Ben
> On Feb 3, 2017, at 10:50 AM, Asher Krim wrote:
Sorry for my persistence, but did you actually run "mvn dependency:tree
-Dverbose=true"? And did you see only scala 2.10.5 being pulled in?
On Fri, Feb 3, 2017 at 12:33 PM, Benjamin Kim wrote:
Asher,
It’s still the same. Do you have any other ideas?
Cheers,
Ben
> On Feb 3, 2017, at 8:16 AM, Asher Krim wrote:
I'll clean up any .m2 and .ivy directories and try again.
I ran this on our lab cluster for testing.
Cheers,
Ben
On Fri, Feb 3, 2017 at 8:16 AM Asher Krim wrote:
Did you check the actual maven dep tree? Something might be pulling in a
different version. Also, if you're seeing this locally, you might want to
check which version of the Scala SDK your IDE is using.
Asher Krim
Senior Software Engineer
On Thu, Feb 2, 2017 at 5:43 PM, Benjamin Kim wrote:
Hi Asher,
I modified the pom to be the same Spark (1.6.0), HBase (1.2.0), and Java (1.8)
version as our installation. The Scala (2.10.5) version is already the same as
ours. But I’m still getting the same error. Can you think of anything else?
Cheers,
Ben
> On Feb 2, 2017, at 11:06 AM, Asher Krim wrote:
Ben,
That looks like a scala version mismatch. Have you checked your dep tree?
Asher Krim
Senior Software Engineer
On Thu, Feb 2, 2017 at 1:28 PM, Benjamin Kim wrote:
Elek,
Can you give me some sample code? I can’t get mine to work.
import org.apache.spark.sql.{SQLContext, _}
import org.apache.spark.sql.execution.datasources.hbase._
import org.apache.spark.{SparkConf, SparkContext}
def cat = s"""{
|"table":{"namespace":"ben", "name":"dmp_test",
Elek,
If I cannot use the HBase Spark module, then I’ll give it a try.
Thanks,
Ben
> On Jan 31, 2017, at 1:02 PM, Marton, Elek wrote:
I tested this one with hbase 1.2.4:
https://github.com/hortonworks-spark/shc
Marton
On 01/31/2017 09:17 PM, Benjamin Kim wrote:
Does anyone know how to backport the HBase Spark module to HBase 1.2.0? I tried
to build it from source, but I cannot get it to work.
Thanks,
Ben
Hi Ben,
This seems more like a question for community.cloudera.com. However, I believe
it would fall under hbase rather than spark.
https://repository.cloudera.com/artifactory/webapp/#/artifacts/browse/tree/General/cloudera-release-repo/org/apache/hbase/hbase-spark
David Newberger
in/scala/org/apache/spark/deploy/yarn/Client.scala
> [2]
> https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnSparkHadoopUtil.scala
>
>
to credentials” and the
.count() on my HBase RDD works fine.
From: Ellis, Tom (Financial Markets IT) [mailto:tom.el...@lloydsbanking.com]
Sent: 19 May 2016 09:51
To: 'John Trengrove'; Meyerhoefer, Philipp (TR Technology & Ops)
Cc: user
Subject: RE: HBase / Spark Kerberos problem
Yeah we ran in
From: John Trengrove [mailto:john.trengr...@servian.com.au]
Sent: 19 May 2016 08:09
To: philipp.meyerhoe...@thomsonreuters.com
Cc: user
Subject: Re: HBase / Spark Kerberos problem
Have you had a look at this issue?
https://issues.apache.org/jira/browse/SPARK-12279
There is a comment by Y Bodnar on how they successfully got Kerberos and
HBase working.
2016-05-18 18:13 GMT+10:00 :
> Hi all,
>
> I have been puzzling over a Kerberos
Hi,
Regrets for the delayed response. Please find the full stack trace below:
java.lang.ClassCastException: scala.runtime.BoxedUnit cannot be cast to
org.apache.hadoop.hbase.client.Mutation
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:85)
at
Can you try changing classOf[OutputFormat[String, BoxedUnit]] to
classOf[OutputFormat[String, Put]] while configuring hconf?
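A sketch of what that configuration might look like (my illustration, not the original poster's code; the table name and RDD name are hypothetical, and the job setup follows the usual TableOutputFormat pattern):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
import org.apache.hadoop.mapreduce.Job

val hconf = HBaseConfiguration.create()
hconf.set(TableOutputFormat.OUTPUT_TABLE, "my_table") // hypothetical table name

val job = Job.getInstance(hconf)
job.setOutputKeyClass(classOf[ImmutableBytesWritable])
// The fix: the value class must be a Mutation subclass such as Put,
// not BoxedUnit -- TableRecordWriter casts each value to Mutation.
job.setOutputValueClass(classOf[Put])
job.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])

// Then write an RDD[(ImmutableBytesWritable, Put)], e.g.:
// rddOfPuts.saveAsNewAPIHadoopDataset(job.getConfiguration)
```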
On Sat, Oct 17, 2015, 11:44 AM Amit Hora wrote:
Can you show the complete stack trace?
Subclass of Mutation is expected. Put is a subclass.
Have you tried replacing BoxedUnit with Put in your code ?
Cheers
On Fri, Oct 16, 2015 at 6:02 AM, Amit Singh Hora wrote:
> Hi All,
>
> I am using below code to stream data from
Looks like you have an incompatible hbase-default.xml in some place. You
can use the following code to find the location of "hbase-default.xml"
println(Thread.currentThread().getContextClassLoader().getResource("hbase-default.xml"))
Best Regards,
Shixiong Zhu
2015-09-21 15:46 GMT+08:00 Siva
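A small variant of that check (my addition, not from the thread) lists every copy of hbase-default.xml visible on the classpath, which makes a jar conflict immediately obvious:

```scala
import scala.collection.JavaConverters._

// getResources (plural) returns every match, not just the first one.
// More than one line of output usually means two conflicting HBase jars.
Thread.currentThread().getContextClassLoader()
  .getResources("hbase-default.xml")
  .asScala
  .foreach(println)
```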