org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizerFactory ClassNotFoundException

2023-11-07 Thread Yi Zheng
Hi, The problem I’ve encountered is: after “spark-shell” command, when I first enter “spark.sql("select * from test.test_3 ").show(false)” command, it throws “ERROR session.SessionState: Error setting up authorization: java.lang.ClassNotFoundException:

Help with ClassNotFoundException: org.apache.spark.internal.io.cloud.PathOutputCommitProtocol

2022-12-30 Thread Meharji Arumilli
Dear community members, I am using Apache PySpark for the first time and have done all the configuration. However, I am not able to write files to local storage. I have described the issue here: https://stackoverflow.com/questions/74962675/how-to-fix-java-lang-classnotfoundexception-org-apache
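
The missing class lives in Spark's optional cloud-integration module, which the standard distribution does not bundle. A minimal sketch of pulling it in with sbt; the coordinates are correct for recent Spark 3.x releases, but the version shown is an assumption to match against the installed Spark:

    // build.sbt (sketch): the version must match your Spark installation
    libraryDependencies += "org.apache.spark" %% "spark-hadoop-cloud" % "3.3.1"

For PySpark, the same module can be supplied at submit time, e.g. via --packages with the coordinate above.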

Java Generic T makes ClassNotFoundException

2019-06-27 Thread big data
final T obj = (T) in.readObject(); return obj; } catch (final ClassNotFoundException | IOException ex) { throw new SerializationException(ex); } } In Spark local mode the code runs OK, but in cluster-on-YARN mode it fails with an error like this: org.apache
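
The snippet above deserializes with a plain ObjectInputStream, which on YARN resolves classes against a classloader that does not see the user jar. A Scala sketch of the usual workaround, resolving against the thread context classloader instead (the helper names are hypothetical):

    import java.io.{ByteArrayInputStream, InputStream, ObjectInputStream, ObjectStreamClass}

    // Resolves classes via the context classloader, which Spark points at the user jar.
    class ContextClassLoaderObjectInputStream(in: InputStream) extends ObjectInputStream(in) {
      override def resolveClass(desc: ObjectStreamClass): Class[_] =
        Class.forName(desc.getName, false, Thread.currentThread().getContextClassLoader)
    }

    def deserialize[T](bytes: Array[Byte]): T = {
      val in = new ContextClassLoaderObjectInputStream(new ByteArrayInputStream(bytes))
      try in.readObject().asInstanceOf[T]
      finally in.close()
    }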

RE: ClassNotFoundException while unmarshalling a remote RDD on Spark 1.5.1

2017-09-12 Thread PICARD Damien
From: PICARD Damien (EXT) AssuResPriSmsAts Sent: Monday, 11 September 2017 08:53 To: 'user@spark.apache.org' Subject: ClassNotFoundException while unmarshalling a remote RDD on Spark 1.5.1 Hi! I'm facing a classloader problem using Spark 1.5.1. I use javax.validation and hibernate validation annotations

ClassNotFoundException while unmarshalling a remote RDD on Spark 1.5.1

2017-09-11 Thread PICARD Damien
I get the ClassNotFoundException: 17/09/07 09:19:25 INFO storage.BlockManager: Found block rdd_8_1 remotely 17/09/07 09:19:25 ERROR executor.Executor: Exception in task 3.0 in stage 2.0 (TID 6) java.lang.ClassNotFoundException: org.hibernate.validator.constraints.NotBlank

Re: --jars from spark-submit on master on YARN don't get added properly to the executors - ClassNotFoundException

2017-08-09 Thread Mikhailau, Alex
Thanks, Marcelo. Will give it a shot tomorrow. -Alex On 8/9/17, 5:59 PM, "Marcelo Vanzin" wrote: Jars distributed using --jars are not added to the system classpath, so log4j cannot see them. To work around that, you need to manually add the jar to

Re: --jars from spark-submit on master on YARN don't get added properly to the executors - ClassNotFoundException

2017-08-09 Thread Marcelo Vanzin
Jars distributed using --jars are not added to the system classpath, so log4j cannot see them. To work around that, you need to manually add the jar to the driver and executor classpaths: spark.driver.extraClassPath=some.jar spark.executor.extraClassPath=some.jar In client mode you should
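
A sketch of that workaround from application code, using the jar path from the original post below. Note the asymmetry: executors are launched after the conf is read, but the driver JVM is already running, so the driver-side property only works when passed at launch time:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("log4j-layout-classpath")
      // Takes effect: executors start after this conf is read.
      .set("spark.executor.extraClassPath", "/home/hadoop/lib/jsonevent-layout-1.7.jar")
    // spark.driver.extraClassPath is too late here; pass it on spark-submit instead:
    //   --conf spark.driver.extraClassPath=/home/hadoop/lib/jsonevent-layout-1.7.jar
    val sc = new SparkContext(conf)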

--jars from spark-submit on master on YARN don't get added properly to the executors - ClassNotFoundException

2017-08-09 Thread Mikhailau, Alex
I have log4j json layout jars added via spark-submit on EMR /usr/lib/spark/bin/spark-submit --deploy-mode cluster --master yarn --jars /home/hadoop/lib/jsonevent-layout-1.7.jar,/home/hadoop/lib/json-smart-1.1.1.jar --driver-java-options "-XX:+AlwaysPreTouch -XX:MaxPermSize=6G" --class

Re: ClassNotFoundException for Workers

2017-07-31 Thread Noppanit Charassinvichai
I've included that in my build file for the fat jar already. libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.11.155" libraryDependencies += "com.amazonaws" % "aws-java-sdk-s3" % "1.11.155" libraryDependencies += "com.amazonaws" % "aws-java-sdk-core" % "1.11.155" Not sure if I need
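
For reference, a common shape for such a build: declare the AWS SDK modules normally (aws-java-sdk-s3 already pulls in aws-java-sdk-core) and mark Spark itself as provided so the assembly doesn't shadow the cluster's copy. A sketch only; the Spark version is an assumption:

    // build.sbt (sketch)
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"      % "2.1.0" % "provided",
      "com.amazonaws"    %  "aws-java-sdk-s3" % "1.11.155"
    )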

Re: ClassNotFoundException for Workers

2017-07-25 Thread 周康
Ensure com.amazonaws.services.s3.AmazonS3ClientBuilder is on your classpath, which includes your application jar and attached executor jars. 2017-07-20 6:12 GMT+08:00 Noppanit Charassinvichai: I have this spark job which is using the S3 client in mapPartition. And I get this

ClassNotFoundException for Workers

2017-07-19 Thread Noppanit Charassinvichai
I have this spark job which is using S3 client in mapPartition. And I get this error Job aborted due to stage failure: Task 0 in stage 3.0 failed 4 times, most recent failure: Lost task 0.3 in stage 3.0 (TID 74, ip-10-90-78-177.ec2.internal, executor 11): java.lang.NoClassDefFoundError: Could not

Re: Sporadic ClassNotFoundException with Kryo

2017-01-12 Thread Nirmal Fernando
The jar has been deployed on every machine on the local file system in the same location. I would be very grateful for any help or ideas :)

Sporadic ClassNotFoundException with Kryo

2016-11-18 Thread chrism
Thread archive: http://apache-spark-user-list.1001560.n3.nabble.com/Sporadic-ClassNotFoundException-with-Kryo-tp28104.html

Re: Spark streaming 2, giving error ClassNotFoundException: scala.collection.GenTraversableOnce$class

2016-08-19 Thread Mich Talebzadeh
Thanks, --jars /home/hduser/jars/spark-streaming-kafka-assembly_2.11-1.6.1.jar sorted it out. Dr Mich Talebzadeh

Re: Spark streaming 2, giving error ClassNotFoundException: scala.collection.GenTraversableOnce$class

2016-08-19 Thread Tathagata Das
You seem to be combining Scala 2.10 and 2.11 libraries: your sbt project is 2.11, whereas you are trying to pull in spark-streaming-kafka-assembly_2.10-1.6.1.jar. On Fri, Aug 19, 2016 at 11:24 AM, Mich Talebzadeh wrote: Hi, My spark streaming app with 1.6.1
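
Concretely, the _2.10/_2.11 suffix of every Spark artifact has to match the project's scalaVersion; using %% lets sbt append the right suffix so the two can't drift apart. A sketch:

    // build.sbt (sketch), for a Scala 2.11 project
    scalaVersion := "2.11.8"
    // %% appends _2.11 automatically, matching scalaVersion:
    libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-assembly" % "1.6.1"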

Spark streaming 2, giving error ClassNotFoundException: scala.collection.GenTraversableOnce$class

2016-08-19 Thread Mich Talebzadeh
Hi, My spark streaming app with 1.6.1 used to work. Now with: scala> sc.version res0: String = 2.0.0 Compiling with sbt assembly as before, with the following: version := "1.0", scalaVersion := "2.11.8", mainClass in Compile := Some("myPackage.${APPLICATION}") )

Re: [Spark 2.0] ClassNotFoundException is thrown when using Hive

2016-08-18 Thread Aditya
: "颜发才(Yan Facai)" <yaf...@gmail.com> Date:18/08/2016 15:17 (GMT+05:30) To: "user.spark" <user@spark.apache.org> Cc: Subject: [Spark 2.0] ClassNotFoundException is thrown when using Hive Hi, all. I copied hdfs-site.xml, core-site.xml and hive-site.xml to $SPARK_HOM

RE: [Spark 2.0] ClassNotFoundException is thrown when using Hive

2016-08-18 Thread Diwakar Dhanuskodi
Subject: [Spark 2.0] ClassNotFoundException is thrown when using Hive Hi, all. I copied hdfs-site.xml, core-site.xml and hive-site.xml to $SPARK_HOME/conf. And spark-submit is used to submit the task to yarn, and run as **client** mode. However, ClassNotFoundException is thrown

Re: [Spark 2.0] ClassNotFoundException is thrown when using Hive

2016-08-18 Thread Mich Talebzadeh
and run as **client** mode. However, ClassNotFoundException is thrown. Some details of the logs are listed below: ``` 16/08/12 17:07:32 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 0.13.1 using file:/data0/facai/lib/hive-0.13.1/lib:file:/data0/f

[Spark 2.0] ClassNotFoundException is thrown when using Hive

2016-08-18 Thread Yan Facai
Hi, all. I copied hdfs-site.xml, core-site.xml and hive-site.xml to $SPARK_HOME/conf. And spark-submit is used to submit the task to yarn, and run as **client** mode. However, ClassNotFoundException is thrown. Some details of the logs are listed below: ``` 16/08/12 17:07:32 INFO hive.HiveUtils
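
In Spark 2.0 the Hive metastore client version and its jars are configurable per application; a sketch of pointing them at the Hive 0.13.1 install seen in the log above (whether this resolves the poster's ClassNotFoundException is an assumption):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("hive-metastore-example")
      .config("spark.sql.hive.metastore.version", "0.13.1")
      // Standard classpath syntax; the path comes from the poster's log.
      .config("spark.sql.hive.metastore.jars", "/data0/facai/lib/hive-0.13.1/lib/*")
      .enableHiveSupport()
      .getOrCreate()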

Re: ClassNotFoundException org.apache.spark.Logging

2016-08-05 Thread Carlo . Allocca
Hi Marcelo, thank you for your help. Problem solved as you suggested. Best Regards, Carlo On 5 Aug 2016, at 18:34, Marcelo Vanzin wrote: On Fri, Aug 5, 2016 at 9:53 AM, Carlo.Allocca wrote:

Re: ClassNotFoundException org.apache.spark.Logging

2016-08-05 Thread Carlo . Allocca
I have also executed: mvn dependency:tree |grep log [INFO] | | +- com.esotericsoftware:minlog:jar:1.3.0:compile [INFO] +- log4j:log4j:jar:1.2.17:compile [INFO] +- org.slf4j:slf4j-log4j12:jar:1.7.16:compile [INFO] | | +- commons-logging:commons-logging:jar:1.1.3:compile and the POM

Re: ClassNotFoundException org.apache.spark.Logging

2016-08-05 Thread Marcelo Vanzin
On Fri, Aug 5, 2016 at 9:53 AM, Carlo.Allocca wrote (quoting his pom.xml):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>2.0.0</version>
      <type>jar</type>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>2.0.0</version>
    </dependency>

Re: ClassNotFoundException org.apache.spark.Logging

2016-08-05 Thread Carlo . Allocca
Please Sean, could you detail the version mismatch? Many thanks, Carlo On 5 Aug 2016, at 18:11, Sean Owen wrote: You also seem to have a version mismatch here.

Re: ClassNotFoundException org.apache.spark.Logging

2016-08-05 Thread Ted Yu
One option is to clone the class in your own project. Experts may have a better solution. Cheers On Fri, Aug 5, 2016 at 10:10 AM, Carlo.Allocca wrote: Hi Ted, Thanks for the prompt answer. It is not yet clear to me what I should do. How to fix it?

Re: ClassNotFoundException org.apache.spark.Logging

2016-08-05 Thread Carlo . Allocca
Hi Ted, Thanks for the prompt answer. It is not yet clear to me what I should do. How to fix it? Many thanks, Carlo On 5 Aug 2016, at 17:58, Ted Yu wrote: private[spark] trait Logging {

Re: ClassNotFoundException org.apache.spark.Logging

2016-08-05 Thread Ted Yu
In 2.0, Logging became private: private[spark] trait Logging { FYI On Fri, Aug 5, 2016 at 9:53 AM, Carlo.Allocca wrote: > Dear All, > > I would like to ask for your help about the following issue: > java.lang.ClassNotFoundException: > org.apache.spark.Logging > > I

ClassNotFoundException org.apache.spark.Logging

2016-08-05 Thread Carlo . Allocca
Dear All, I would like to ask for your help with the following issue: java.lang.ClassNotFoundException: org.apache.spark.Logging I checked and the class Logging is not present. Moreover, the line of code where the exception is thrown is: final

Re: ClassNotFoundException: org.apache.parquet.hadoop.ParquetOutputCommitter

2016-07-07 Thread Bryan Cutler
Can you try running the example like this: ./bin/run-example sql.RDDRelation I know there are some jars in the example folders, and running them this way adds them to the classpath. On Jul 7, 2016 3:47 AM, "kevin" wrote:

ClassNotFoundException: org.apache.parquet.hadoop.ParquetOutputCommitter

2016-07-07 Thread kevin
Hi all, I built Spark with: ./make-distribution.sh --name "hadoop2.7.1" --tgz "-Pyarn,hadoop-2.6,parquet-provided,hive,hive-thriftserver" -DskipTests -Dhadoop.version=2.7.1 I can run the example: ./bin/spark-submit --class org.apache.spark.examples.SparkPi \ --master spark://master1:7077 \

Re: Custom Log4j layout on YARN = ClassNotFoundException

2016-04-22 Thread andrew.rowson
Subject: Re: Custom Log4j layout on YARN = ClassNotFoundException There is not much in the body of the email. Can you elaborate on the issue you encountered? Thanks On Fri, Apr 22, 2016 at 2:27 AM, Rowson, Andrew G. (TR Technology & O

Re: Custom Log4j layout on YARN = ClassNotFoundException

2016-04-22 Thread Ted Yu

Custom Log4j layout on YARN = ClassNotFoundException

2016-04-22 Thread Rowson, Andrew G. (TR Technology & Ops)

Re: ClassNotFoundException in RDD.map

2016-03-23 Thread Dirceu Semighini Filho
that normally the typechecker could catch can slip through. On Thu, Mar 17, 2016 at 10:25 AM, Dirceu Semighini Filho <dirceu.semigh...@gmail.com> wrote: Hi Ted, thanks for answering. The map is just that; whenever I try inside the map it throws this

ClassNotFoundException in RDD.map

2016-03-20 Thread Dirceu Semighini Filho
Hello, I found a strange behavior after executing a prediction with MLlib. My code returns an RDD[(Any, Double)] where Any is the id of my dataset, which is BigDecimal, and Double is the prediction for that line. When I run myRdd.take(10) it returns ok: res16: Array[_ >: (Double, Double) <: (Any,

Re: ClassNotFoundException in RDD.map

2016-03-20 Thread Jakob Odersky
map it throws this ClassNotFoundException; even if I do map(f => f) it throws the exception. What is bothering me is that when I do a take or a first it returns the result, which makes me conclude that the previous code isn't wrong. Kind Regards, Dirceu

Re: ClassNotFoundException in RDD.map

2016-03-19 Thread Dirceu Semighini Filho
Hi Ted, thanks for answering. The map is just that; whenever I try inside the map it throws this ClassNotFoundException, even if I do map(f => f) it throws the exception. What is bothering me is that when I do a take or a first it returns the result, which makes me conclude that the previous code isn't wrong.

Re: ClassNotFoundException in RDD.map

2016-03-19 Thread Ted Yu
bq. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1 Do you mind showing more of your code involving the map() ? On Thu, Mar 17, 2016 at 8:32 AM, Dirceu Semighini Filho < dirceu.semigh...@gmail.com> wrote: > Hello, > I found a strange behavior after executing a prediction with MLIB. > My code
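
The $iwC$$iwC...$$anonfun$1 in the message is the giveaway: the closure was compiled inside spark-shell's REPL wrapper objects, and the executors must be able to load those generated classes. One sketch of a workaround is to compile the function into a jar shipped with --jars and call it from the shell, so no REPL-generated class is serialized (the names below are hypothetical, and myRdd stands in for the poster's RDD):

    // In the application jar, not typed into the shell:
    object PredictionFormatter extends Serializable {
      def format(pair: (Any, Double)): String = s"${pair._1},${pair._2}"
    }

    // In spark-shell:
    val formatted = myRdd.map(PredictionFormatter.format _)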

spark-submit with cluster deploy mode fails with ClassNotFoundException (jars are not passed around properley?)

2016-03-11 Thread Hiroyuki Yamada
Hi, I am trying to work with spark-submit with cluster deploy mode on a single node, but I keep getting ClassNotFoundException as shown below. (In this case, snakeyaml.jar is not found by the spark cluster.) === 16/03/12 14:19:12 INFO Remoting: Starting remoting 16/03/12 14:19:12 INFO Remoting

Re: Building Spark with a Custom Version of Hadoop: HDFS ClassNotFoundException

2016-02-11 Thread Ted Yu
Date: Thu, 11 Feb 2016 17:29:00 -0800 Subject: Re: Building Spark with a Custom Version of Hadoop: HDFS ClassNotFoundException From: yuzhih...@gmail.com To: charliewri...@live.ca CC: d...@spark.apache.org Hdfs class is in hadoop-hdfs

Re: Building Spark with a Custom Version of Hadoop: HDFS ClassNotFoundException

2016-02-11 Thread Ted Yu
I am using the 1.6.0 release. Charles. Date: Thu, 11 Feb 2016 17:41:54 -0800 Subject: Re: Building Spark with a Custom Version of Hadoop: HDFS ClassNotFoundException From: yuzhih...@gmail.com To: char

ClassNotFoundException interpreting a Spark job

2016-01-16 Thread milad bourhani
Hi everyone, I’m trying to use the Scala interpreter, IMain, to interpret some Scala code that executes a job with Spark: @Test public void countToFive() throws ScriptException { SparkConf conf = new SparkConf().setAppName("Spark interpreter").setMaster("local[2]"); SparkContext sc =

Re: 1.6.0: Standalone application: Getting ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory

2016-01-14 Thread Egor Pahomov
My fault, I should have read the documentation more carefully: http://spark.apache.org/docs/latest/sql-programming-guide.html says precisely that I need to add these 3 jars to the class path in case I need them. We cannot include them in the fat jar because they are OSGi bundles and require plugin.xml and
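
Since the three jars must stay out of the assembly but on the runtime classpath, one sketch for an sbt-run standalone application is to declare them as ordinary dependencies so they remain separate jars; the versions below are the ones Spark 1.6 shipped in its lib directory, an assumption to verify:

    // build.sbt (sketch): kept as separate jars, never merged into the fat jar
    libraryDependencies ++= Seq(
      "org.datanucleus" % "datanucleus-core"    % "3.2.10",
      "org.datanucleus" % "datanucleus-api-jdo" % "3.2.6",
      "org.datanucleus" % "datanucleus-rdbms"   % "3.2.9"
    )

When submitting with spark-submit instead, the same jars can go on --jars or the extraClassPath properties.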

1.6.0: Standalone application: Getting ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory

2016-01-12 Thread Egor Pahomov
Hi, I'm moving my infrastructure from 1.5.2 to 1.6.0 and experiencing a serious issue. I successfully updated the Spark thrift server from 1.5.2 to 1.6.0. But I have a standalone application, which worked fine with 1.5.2 but fails on 1.6.0 with: NestedThrowables: java.lang.ClassNotFoundException:

Re: ClassNotFoundException when executing spark jobs in standalone/cluster mode on Spark 1.5.2

2015-12-29 Thread Prem Spark
You need to make sure this class is accessible to all servers, since it's cluster mode and the driver can be on any of the worker nodes. On Fri, Dec 25, 2015 at 5:57 PM, Saiph Kappa wrote: Hi, I'm submitting a spark job like this:

Re: ClassNotFoundException when executing spark jobs in standalone/cluster mode on Spark 1.5.2

2015-12-29 Thread Saiph Kappa
I found out that by commenting this line in the application code: sparkConf.set("spark.executor.extraJavaOptions", " -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+AggressiveOpts -XX:FreqInlineSize=300 -XX:MaxInlineSize=300 ") the exception does not occur anymore. Not entirely sure why, but

ClassNotFoundException when executing spark jobs in standalone/cluster mode on Spark 1.5.2

2015-12-25 Thread Saiph Kappa
Hi, I'm submitting a spark job like this: ~/spark-1.5.2-bin-hadoop2.6/bin/spark-submit --class Benchmark --master spark://machine1:6066 --deploy-mode cluster --jars target/scala-2.10/benchmark-app_2.10-0.1-SNAPSHOT.jar /home/user/bench/target/scala-2.10/benchmark-app_2.10-0.1-SNAPSHOT.jar

jdbc error, ClassNotFoundException: org.apache.hadoop.hive.schshim.FairSchedulerShim

2015-12-03 Thread zhangjp
Hi all, I downloaded the prebuilt version 1.5.2 with hadoop 2.6. When I use spark-sql there is no problem, but when I start the thriftServer and then want to query a hive table using jdbc, there are errors as follows. Caused by: java.lang.ClassNotFoundException:

ClassNotFoundException with a uber jar.

2015-11-26 Thread Marc de Palol
Is there any problem using uberjars with inner jars inside? Thanks! Thread archive: http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFoundException-with-a-uber-jar-tp25493.html

Re: ClassNotFoundException with a uber jar.

2015-11-26 Thread Ali Tajeldin EDU
guava-2.4.3.jar, which is in the uberjar. So I really don't know what I'm missing. I've tried to use --jars and SparkContext.addJar (adding the uberjar) with no luck. Is there any problem using uberjars with inner jars inside? Thanks!

log4j custom appender ClassNotFoundException with spark 1.5.2

2015-11-25 Thread lev
to spark submit) I've tried defining my appender in each one of the jars. In the uber-jar, the appender is found and created successfully; in main.jar or dep.jar, it throws ClassNotFoundException. I guess log4j tries to load the class before the assemblies were loaded; it's related to this ticket: https

Re: Re: driver ClassNotFoundException when MySQL JDBC exceptions are thrown on executor

2015-11-19 Thread Zsolt Tóth
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)

Re: Re: driver ClassNotFoundException when MySQL JDBC exceptions are thrown on executor

2015-11-19 Thread Jeff Zhang
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spa

Re: kafka streaminf 1.5.2 - ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaReceiver

2015-11-17 Thread Tathagata Das
(quoting a pom.xml: an exclusion for servlet-api; a dependency on spark-streaming_${scala.binary.version} at ${spark.version}; an exclusion for jcl-over-slf4j)

kafka streaminf 1.5.2 - ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaReceiver

2015-11-17 Thread tim_b123
(pom.xml dependencies: org.mockito:mockito-all (test), junit:junit (test), joda-time:joda-time) org.apache.spark.streaming.kafka.KafkaReceiver is inside the spark-st

Re: ClassNotFoundException even if class is present in Jarfile

2015-11-03 Thread hveiga
It turned out to be a problem with `SerializationUtils` from Apache Commons Lang. There is an open issue where the class will throw a `ClassNotFoundException` even if the class is in the classpath in a multiple-classloader environment: https://issues.apache.org/jira/browse/LANG-1049 We moved away

Re: ClassNotFoundException even if class is present in Jarfile

2015-11-03 Thread Iulian Dragoș
environment (standalone or EMR), but it works successfully if I run it locally using local[*] as master. I am getting ClassNotFoundException: com.mycompany.folder.MyObject on the slave executors. I don't really understand why this is happening since I have uncompressed the Jarfile to ma

Fwd: Getting ClassNotFoundException: scala.Some on Spark 1.5.x

2015-11-02 Thread Babar Tareen
Resending, haven't found a workaround. Any help is highly appreciated. -- Forwarded message -- From: Babar Tareen <babartar...@gmail.com> Date: Thu, Oct 22, 2015 at 2:47 PM Subject: Getting ClassNotFoundException: scala.Some on Spark 1.5.x To: user@spark.apache.org Hi,

Re: Getting ClassNotFoundException: scala.Some on Spark 1.5.x

2015-11-02 Thread Jonathan Coveney
Date: Thu, Oct 22, 2015 at 2:47 PM Subject: Getting ClassNotFoundException: scala.Some on Spark 1.5.x To: user@spark.apache.org Hi, I am getting the following exception when submitting a job to Spark 1.5.x from Scala. The same code works with Spark 1.4.1.

Re: Getting ClassNotFoundException: scala.Some on Spark 1.5.x

2015-11-02 Thread Babar Tareen

Re: Getting ClassNotFoundException: scala.Some on Spark 1.5.x

2015-11-02 Thread Jonathan Coveney

ClassNotFoundException even if class is present in Jarfile

2015-11-02 Thread hveiga
Hello, I am facing an issue where I cannot run my Spark job in a cluster environment (standalone or EMR) but it works successfully if I run it locally using local[*] as master. I am getting ClassNotFoundException: com.mycompany.folder.MyObject on the slave executors. I don't really understand

Re: driver ClassNotFoundException when MySQL JDBC exceptions are thrown on executor

2015-10-22 Thread Akhil Das
to Spark 1.5.1 from Spark 1.4.1. When I use the MySQL JDBC connector and an exception (e.g. com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException) is thrown on the executor, I get a ClassNotFoundException on the driver, which results in this error (logs are abbreviated): 15/10/16 1

Getting ClassNotFoundException: scala.Some on Spark 1.5.x

2015-10-22 Thread Babar Tareen
Hi, I am getting the following exception when submitting a job to Spark 1.5.x from Scala. The same code works with Spark 1.4.1. Any clues as to what might be causing the exception? Code (App.scala): import org.apache.spark.SparkContext object App { def main(args: Array[String]) = { val l =

Re: driver ClassNotFoundException when MySQL JDBC exceptions are thrown on executor

2015-10-22 Thread Xiao Li
to Spark 1.5.1 from Spark 1.4.1. When I use the MySQL JDBC connector and an exception (e.g. com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException) is thrown on the executor, I get a ClassNotFoundException on the driver, which results in this error (logs are abbreviat

driver ClassNotFoundException when MySQL JDBC exceptions are thrown on executor

2015-10-16 Thread Hurshal Patel
Hi all, I've been struggling with a particularly puzzling issue after upgrading to Spark 1.5.1 from Spark 1.4.1. When I use the MySQL JDBC connector and an exception (e.g. com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException) is thrown on the executor, I get a ClassNotFoundException

Spark 1.5.1 ClassNotFoundException in cluster mode.

2015-10-14 Thread Renato Perini
Hello. I have developed a Spark job using a jersey client (1.9 included with Spark) to make some service calls during data computations. Data is read and written on an Apache Cassandra 2.2.1 database. When I run the job in local mode, everything works nicely. But when I execute my job in

Re: Spark 1.5.1 ClassNotFoundException in cluster mode.

2015-10-14 Thread Dean Wampler
There is a Datastax Spark connector library jar file that you probably have on your CLASSPATH locally, but not on the cluster. If you know where it is, you could either install it on each node in some location on their CLASSPATHs, or when you submit the job, pass the jar file using the "--jars"
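
A sketch of the second option from code; spark.jars behaves like spark-submit --jars (a comma-separated list shipped to the cluster), and the connector path below is a placeholder. In client mode the driver still needs the jar on its own classpath:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("cassandra-job")
      // Equivalent to: spark-submit --jars /opt/libs/spark-cassandra-connector-assembly.jar
      .set("spark.jars", "/opt/libs/spark-cassandra-connector-assembly.jar")
    val sc = new SparkContext(conf)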

Re: ThrowableSerializationWrapper: Task exception could not be deserialized / ClassNotFoundException: org.apache.solr.common.SolrException

2015-09-30 Thread Ted Yu
bq. have tried these settings with the hbase protocol jar, to no avail In that case, HBaseZeroCopyByteString is contained in hbase-protocol.jar. In HBaseZeroCopyByteString, you can see: package com.google.protobuf; // This is a lie. If protobuf jar is loaded ahead of hbase-protocol.jar,

Re: ThrowableSerializationWrapper: Task exception could not be deserialized / ClassNotFoundException: org.apache.solr.common.SolrException

2015-09-30 Thread Dmitry Goldenberg
I believe I've had trouble with --conf spark.driver.userClassPathFirst=true --conf spark.executor.userClassPathFirst=true before, so these might not work... I was thinking of trying to add the solrj jar to spark.executor.extraClassPath... On Wed, Sep 30, 2015 at 12:01 PM, Ted Yu

Re: ThrowableSerializationWrapper: Task exception could not be deserialized / ClassNotFoundException: org.apache.solr.common.SolrException

2015-09-29 Thread Dmitry Goldenberg
Release of Spark: 1.5.0. Command line invocation: ACME_INGEST_HOME=/mnt/acme/acme-ingest ACME_INGEST_VERSION=0.0.1-SNAPSHOT ACME_BATCH_DURATION_MILLIS=5000 SPARK_MASTER_URL=spark://data1:7077 JAVA_OPTIONS="-Dspark.streaming.kafka.maxRatePerPartition=1000" JAVA_OPTIONS="$JAVA_OPTIONS

Re: ThrowableSerializationWrapper: Task exception could not be deserialized / ClassNotFoundException: org.apache.solr.common.SolrException

2015-09-29 Thread Ted Yu
Mind providing a bit more information: release of Spark; command line for running the Spark job. Cheers On Tue, Sep 29, 2015 at 1:37 PM, Dmitry Goldenberg wrote: We're seeing this occasionally. Granted, this was caused by a wrinkle in the Solr schema but this bubbled

Re: ThrowableSerializationWrapper: Task exception could not be deserialized / ClassNotFoundException: org.apache.solr.common.SolrException

2015-09-29 Thread Ted Yu
Have you tried the following? --conf spark.driver.userClassPathFirst=true --conf spark.executor.userClassPathFirst=true On Tue, Sep 29, 2015 at 4:38 PM, Dmitry Goldenberg wrote: Release of Spark: 1.5.0. Command line invocation:

ThrowableSerializationWrapper: Task exception could not be deserialized / ClassNotFoundException: org.apache.solr.common.SolrException

2015-09-29 Thread Dmitry Goldenberg
We're seeing this occasionally. Granted, this was caused by a wrinkle in the Solr schema, but this bubbled up all the way in Spark and caused job failures. I just checked, and the SolrException class is actually in the consumer job jar we use. Is there any reason why Spark cannot find the

Re: ThrowableSerializationWrapper: Task exception could not be deserialized / ClassNotFoundException: org.apache.solr.common.SolrException

2015-09-29 Thread Divya Ravichandran
This could be because org.apache.solr.common.SolrException doesn't implement Serializable. This error shows up when Spark is deserializing a class which doesn't implement Serializable. Thanks Divya On Sep 29, 2015 4:37 PM, "Dmitry Goldenberg" wrote: We're seeing this

Re: ThrowableSerializationWrapper: Task exception could not be deserialized / ClassNotFoundException: org.apache.solr.common.SolrException

2015-09-29 Thread Dmitry Goldenberg
I'm actually not sure how either one of these would possibly cause Spark to find SolrException. Whether the driver or executor class path is first, should it not matter, if the class is in the consumer job jar? On Tue, Sep 29, 2015 at 9:12 PM, Dmitry Goldenberg

Re: ThrowableSerializationWrapper: Task exception could not be deserialized / ClassNotFoundException: org.apache.solr.common.SolrException

2015-09-29 Thread Dmitry Goldenberg
Ted, I think I have tried these settings with the hbase protocol jar, to no avail. I'm going to see if I can try and use these with this SolrException issue though it now may be harder to reproduce it. Thanks for the suggestion. On Tue, Sep 29, 2015 at 8:03 PM, Ted Yu

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-18 Thread Vipul Rai
Hi Nick/Igor, Any solution for this? Even I am having the same issue, and copying the jar to each executor is not feasible if we use a lot of jars. Thanks, Vipul

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-08 Thread Igor Berman
through Yarn. Serialization is set to use Kryo. I have a large object which I send to the executors as a Broadcast. The object seems to serialize just fine. When it attempts to deserialize, though, Kryo throws a ClassNotFoundException... for a class that I include in the

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-08 Thread Nicholas R. Peterson
which I send to the executors as a Broadcast. The object seems to serialize just fine. When it attempts to deserialize, though, Kryo throws a ClassNotFoundException... for a class that I include in the fat jar that I spark-submit. What could be causing this

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-08 Thread Nick Peterson
__app__.jar and __spark__.jar. The directory itself is on the classpath, and __spark__.jar and __hadoop_conf__ are as well. When I do everything the same but switch the master to local[*], the jar I submit IS added to t

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-08 Thread Nicholas R. Peterson
On 8 September 2015 at 06:38, Nicholas R. Peterson <nrpeter...@gmail.com> wrote: I'm trying to run a Spark 1.4.1 job on my CDH5.4 cluster, through Yarn. Serialization is set to use Kryo.

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-08 Thread Igor Berman
This seems like a likely culprit. What could cause this, and how can I fix it? Best, Nick On Tue, Sep 8, 2015 at 1:14 AM Igor Berman <igor.ber...@gmail.com>

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-08 Thread Igor Berman
stack trace when it finishes. In the mean time, I've noticed something interesting: in the Spark UI, the application jar that I submit is not being included on the classpath. It has been successful

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-08 Thread Igor Berman
at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136) ... 47 more On Tue, Sep 8, 2015 at 6:01 AM Igor Berman <igor.ber.

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-08 Thread Nick Peterson
On 8 September 2015 at 15:43, Nicholas R. Peterson <nrpeter...@gmail.com> wrote: Thanks, Igor; I've got it running again right now, and can attach the stack trace when it fi

Re: Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-08 Thread Nick Peterson
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)

Spark on Yarn: Kryo throws ClassNotFoundException for class included in fat jar

2015-09-07 Thread Nicholas R. Peterson
I'm trying to run a Spark 1.4.1 job on my CDH5.4 cluster, through Yarn. Serialization is set to use Kryo. I have a large object which I send to the executors as a Broadcast. The object seems to serialize just fine. When it attempts to deserialize, though, Kryo throws a ClassNotFoundException

Strange ClassNotFoundException in spark-shell

2015-08-24 Thread Jan Algermissen
Hi, I am using spark 1.4 M1 with the Cassandra Connector and run into a strange error when using the spark shell. This works: sc.cassandraTable("events", "bid_events").select("bid", "type").take(10).foreach(println) But as soon as I put a map() in there (or filter): sc.cassandraTable("events",

Re: log4j custom appender ClassNotFoundException with spark 1.4.1

2015-08-07 Thread mlemay

Re: log4j custom appender ClassNotFoundException with spark 1.4.1

2015-08-07 Thread mlemay
Thread archive: http://apache-spark-user-list.1001560.n3.nabble.com/log4j-custom-appender-ClassNotFoundException-with-spark-1-4-1-tp24159p24171.html

Re: log4j custom appender ClassNotFoundException with spark 1.4.1

2015-08-07 Thread mlemay
Thread archive: http://apache-spark-user-list.1001560.n3.nabble.com/log4j-custom-appender-ClassNotFoundException-with-spark-1-4-1-tp24159p24168.html

Re: log4j custom appender ClassNotFoundException with spark 1.4.1

2015-08-07 Thread mlemay
(MutableURLClassLoader) Does anyone have a workaround to make this work in 1.4.1? Thread archive: http://apache-spark-user-list.1001560.n3.nabble.com/log4j-custom-appender-ClassNotFoundException-with-spark-1-4-1-tp24159p24169.html

How to use KryoSerializer : ClassNotFoundException

2015-06-24 Thread pth001
Hi, I am using spark 1.4. I wanted to serialize with KryoSerializer, but got ClassNotFoundException. The configuration and exception are below. When I submitted the job, I also provided --jars mylib.jar, which contains WRFVariableZ. conf.set("spark.serializer
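
A sketch of the usual Kryo wiring on the SparkConf; WRFVariableZ is declared here only as a stand-in for the poster's class from mylib.jar, and the executors still need that jar on their classpath (e.g. via --jars) for deserialization to succeed:

    import org.apache.spark.SparkConf

    class WRFVariableZ extends Serializable  // stand-in for the real class in mylib.jar

    val conf = new SparkConf()
      .setAppName("kryo-example")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // Optional but recommended: registered classes serialize more compactly.
      .registerKryoClasses(Array(classOf[WRFVariableZ]))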

Running spark1.4 inside intellij idea HttpServletResponse - ClassNotFoundException

2015-06-15 Thread Wwh 吴
name := "SparkLeaning" version := "1.0" scalaVersion := "2.10.4" //scalaVersion := "2.11.2" libraryDependencies ++= Seq( //"org.apache.hive" % "hive-jdbc" % "0.13.0" //"io.spray" % "spray-can" % "1.3.1", //"io.spray" % "spray-routing" % "1.3.1", "io.spray" % "spray-testkit" % "1.3.1" % "test", "io.spray" %% "spray-json" %

Re: Running spark1.4 inside intellij idea HttpServletResponse - ClassNotFoundException

2015-06-15 Thread Tarek Auel
Hey, I had some similar issues in the past when I used Java 8. Are you using Java 7 or 8? (It's just an idea, because I had a similar issue.) On Mon 15 Jun 2015 at 6:52 am Wwh 吴 wwyando...@hotmail.com wrote: name := "SparkLeaning" version := "1.0" scalaVersion := "2.10.4" //scalaVersion := "2.11.2"

Re: Apache Phoenix (4.3.1 and 4.4.0-HBase-0.98) on Spark 1.3.1 ClassNotFoundException

2015-06-11 Thread Josh Mahonin
on the sandbox in standalone mode. Phoenix only supports Spark from 4.4.0 onwards, but I thought I could use a naive implementation that creates a new connection for every RDD from the DStream in 4.3.1. This resulted in the ClassNotFoundException described in [1], so I

Re: Apache Phoenix (4.3.1 and 4.4.0-HBase-0.98) on Spark 1.3.1 ClassNotFoundException

2015-06-11 Thread Jeroen Vlek
in the ClassNotFoundException described in [1], so I switched to 4.4.0. Unfortunately the saveToPhoenix method is only available in Scala. So I did find the suggestion to try it via the saveAsNewAPIHadoopFile method [2] and an example implementation [3], which I adapted to my own needs

Re: Apache Phoenix (4.3.1 and 4.4.0-HBase-0.98) on Spark 1.3.1 ClassNotFoundException

2015-06-10 Thread Josh Mahonin
in the ClassNotFoundException described in [1], so I switched to 4.4.0. Unfortunately the saveToPhoenix method is only available in Scala. So I did find the suggestion to try it via the saveAsNewAPIHadoopFile method [2] and an example implementation [3], which I adapted to my own needs

Re: Apache Phoenix (4.3.1 and 4.4.0-HBase-0.98) on Spark 1.3.1 ClassNotFoundException

2015-06-10 Thread Jeroen Vlek
I thought I could use a naive implementation that creates a new connection for every RDD from the DStream in 4.3.1. This resulted in the ClassNotFoundException described in [1], so I switched to 4.4.0. Unfortunately the saveToPhoenix method is only available in Scala. So I did find
