Re: spark-3.2.2-bin-without-hadoop : NoClassDefFoundError: org/apache/log4j/spi/Filter when starting the master

2022-08-24 Thread Sean Owen
You have to provide your own Hadoop distro and all its dependencies. This build is intended for use on a Hadoop cluster, really. If you're running stand-alone, you should not be using it. Use a 'normal' distribution that bundles Hadoop libs. On Wed, Aug 24, 2022 at 9:35 AM FLORANCE Grégory
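
If the "without Hadoop" build must be kept and pointed at an existing Hadoop install, the usual wiring is to hand Spark the Hadoop classpath; a minimal sketch, assuming the hadoop launcher is on the PATH, with the missing log4j classes then typically supplied by the Hadoop distribution's own dependencies:

    # conf/spark-env.sh
    export SPARK_DIST_CLASSPATH=$(hadoop classpath)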

spark-3.2.2-bin-without-hadoop : NoClassDefFoundError: org/apache/log4j/spi/Filter when starting the master

2022-08-24 Thread FLORANCE Grégory
Hi, I've downloaded the 3.2.2 without-Hadoop Spark distribution in order to test it in a Hadoop-free context. I tested the with-Hadoop version and it worked well. When I wanted to start the master, I

Odd NoClassDefFoundError exception

2021-01-26 Thread Lavelle, Shawn
Hello Spark Community, I have a Spark-SQL problem where I am receiving a NoClassDefFoundError error for org.apache.spark.sql.catalyst.util.RebaseDateTime$ . This happens for any query with a filter on a Timestamp column when the query is first run programmatically but not when the query

Re: NoClassDefFoundError: scala/Product$class

2020-06-07 Thread charles_cai
The org.bdgenomics.adam library is one of the components of GATK, and I just downloaded the release version from its GitHub website. However, when I build a new docker image with Spark 2.4.5 and Scala 2.12.4, it works well, and that makes me confused. root@master2:~# pyspark Python 2.7.17 (default,

Re: NoClassDefFoundError: scala/Product$class

2020-06-06 Thread James Moore
How are you depending on that org.bdgenomics.adam library? Maybe you're pulling the 2.11 version of that.

Re: NoClassDefFoundError: scala/Product$class

2020-06-06 Thread Sean Owen
Spark 3 supports only Scala 2.12. This actually sounds like a third-party library compiled for 2.11 or something. On Fri, Jun 5, 2020 at 11:11 PM charles_cai <1620075...@qq.com> wrote: > Hi Pol, > > thanks for your suggestion, I am going to use Spark-3.0.0 for GPU > acceleration,so I update the
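
A quick way to spot such a Scala-line mismatch in a Maven build (a sketch; the artifact names below are placeholders) is to look for _2.11 suffixes among the resolved dependencies, since a Spark 3 / Scala 2.12 application must only pull _2.12 artifacts:

    mvn dependency:tree | grep _2.11
    # a library published for both lines usually differs only in the suffix,
    # e.g. somelib_2.11 (wrong here) vs somelib_2.12 (correct for Spark 3)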

Re: NoClassDefFoundError: scala/Product$class

2020-06-05 Thread charles_cai
Hi Pol, thanks for your suggestion. I am going to use Spark 3.0.0 for GPU acceleration, so I updated Scala to *version 2.12.11* and to the latest *2.13*, but the error is still there. By the way, the Spark version is *spark-3.0.0-preview2-bin-without-hadoop* Caused by:

Re: NoClassDefFoundError: scala/Product$class

2020-06-03 Thread Pol Santamaria
> Hi, > > I run the GATK MarkDuplicates in Spark mode and it throws an > *NoClassDefFoundError: scala/Product$class*. The GATK version is 4.1.7 and > 4.0.0,the environment is: spark-3.0.0, scala-2.11.12 > > *GATK commands:* > > gatk MarkDuplicatesSpark \ > -I h

NoClassDefFoundError: scala/Product$class

2020-06-02 Thread charles_cai
Hi, I run GATK MarkDuplicates in Spark mode and it throws a *NoClassDefFoundError: scala/Product$class*. The GATK versions are 4.1.7 and 4.0.0; the environment is: spark-3.0.0, scala-2.11.12 *GATK commands:* gatk MarkDuplicatesSpark \ -I hdfs://master2:9000/Drosophila/output

NoClassDefFoundError

2019-05-21 Thread Sachit Murarka
Hi All, I have simply added exception handling in my code in Scala. I am getting NoClassDefFoundError . Any leads would be appreciated. Thanks Kind Regards, Sachit Murarka

java-lang-noclassdeffounderror-org-apache-spark-streaming-api-java-javastreamin

2017-02-09 Thread sathyanarayanan mudhaliyar
Error in the highlighted line. Code, error and pom.xml included below code : final Session session = connector.openSession(); final PreparedStatement prepared = session.prepare("INSERT INTO spark_test5.messages JSON?"); JavaStreamingContext ssc = new

Re: [Spark Streaming] NoClassDefFoundError : StateSpec

2017-01-12 Thread Shixiong(Ryan) Zhu
onf > "spark.driver.extraJavaOptions=-XX:MaxPermSize=6G -XX:+UseConcMarkSweepGC" > --conf "spark.executor.extraJavaOptions=-XX:+UseConcMarkSweepGC -verbose:gc > -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" --class MY_DRIVER > ~/project-assembly-0.0.1-SNAPSHOT.j

[Spark Streaming] NoClassDefFoundError : StateSpec

2017-01-12 Thread Ramkumar Venkataraman
ize=6G -XX:+UseConcMarkSweepGC" --conf "spark.executor.extraJavaOptions=-XX:+UseConcMarkSweepGC -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" --class MY_DRIVER ~/project-assembly-0.0.1-SNAPSHOT.jar == Is there anything I am missing here? I understand that NoClass

Re: NoClassDefFoundError

2016-12-21 Thread Vadim Semenov
LContext class. my maven project doesn't have any problem during > compile and packaging phase. but when I send .jar of project to sjs and run > it "NoClassDefFoundError" will be issued. the trace of exception is : > > > job-server[ERROR] Exception in thread "pool-20-thread

NoClassDefFoundError

2016-12-21 Thread Reza zade
Hello, I've extended the JavaSparkJob (job-server-0.6.2) and created an object of the SQLContext class. My maven project doesn't have any problem during the compile and packaging phases, but when I send the project's .jar to sjs and run it, a "NoClassDefFoundError" is issued. The trace of

RE: submitting a spark job using yarn-client and getting NoClassDefFoundError: org/apache/spark/Logging

2016-11-16 Thread David Robison
I’ve gotten a little further along. It now submits the job via Yarn, but now the jobs exit immediately with the following error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging at java.lang.ClassLoader.defineClass1(Native Method) at

Re: NoClassDefFoundError: org/apache/spark/Logging in SparkSession.getOrCreate

2016-10-17 Thread Saisai Shao
Not sure why your code searches for the Logging class under org/apache/spark; it should be "org/apache/spark/internal/Logging", and it changed a long time ago. On Sun, Oct 16, 2016 at 3:25 AM, Brad Cox wrote: > I'm experimenting with Spark 2.0.1 for the first time and hitting a
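
One way to find a stale Spark 1.x-era jar that still bundles the old org/apache/spark/Logging class is to scan the application's jars for it; a rough sketch, with /path/to/lib as a placeholder for wherever the job's jars live:

    for j in /path/to/lib/*.jar; do
      unzip -l "$j" 2>/dev/null | grep -q 'org/apache/spark/Logging.class' && echo "$j"
    done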

NoClassDefFoundError: org/apache/spark/Logging in SparkSession.getOrCreate

2016-10-15 Thread Brad Cox
I'm experimenting with Spark 2.0.1 for the first time and hitting a problem right out of the gate. My main routine starts with this which I think is the standard idiom. SparkSession sparkSession = SparkSession .builder()

Re: Loading data into Hbase table throws NoClassDefFoundError: org/apache/htrace/Trace error

2016-10-02 Thread Mich Talebzadeh
Thanks Ben The thing is I am using Spark 2 and no stack from CDH! Is this approach to reading/writing to Hbase specific to Cloudera? Dr Mich Talebzadeh LinkedIn * https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Re: Loading data into Hbase table throws NoClassDefFoundError: org/apache/htrace/Trace error

2016-10-01 Thread Benjamin Kim
Mich, I know up until CDH 5.4 we had to add the HTrace jar to the classpath to make it work using the command below. But after upgrading to CDH 5.7, it became unnecessary. echo "/opt/cloudera/parcels/CDH/jars/htrace-core-3.2.0-incubating.jar" >> /etc/spark/conf/classpath.txt Hope this helps.
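
An alternative to editing classpath.txt, if only an interactive session needs it, is to pass the jar explicitly; a sketch using the parcel path quoted above, which may differ per install:

    spark-shell --jars /opt/cloudera/parcels/CDH/jars/htrace-core-3.2.0-incubating.jar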

Loading data into Hbase table throws NoClassDefFoundError: org/apache/htrace/Trace error

2016-10-01 Thread Mich Talebzadeh
Trying bulk load using Hfiles in Spark as below example: import org.apache.spark._ import org.apache.spark.rdd.NewHadoopRDD import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor} import org.apache.hadoop.hbase.client.HBaseAdmin import

Re: NoClassDefFoundError with ZonedDateTime

2016-07-24 Thread Timur Shenkao
Which version of Java 8 do you use? AFAIK, it's recommended to use Java 1.8.0_66 or later. On Fri, Jul 22, 2016 at 8:49 PM, Jacek Laskowski wrote: > On Fri, Jul 22, 2016 at 6:43 AM, Ted Yu wrote: > > You can use this command (assuming log aggregation is turned

Re: NoClassDefFoundError with ZonedDateTime

2016-07-22 Thread Jacek Laskowski
On Fri, Jul 22, 2016 at 6:43 AM, Ted Yu wrote: > You can use this command (assuming log aggregation is turned on): > > yarn logs --applicationId XX I don't think it's gonna work for already-running application (and I wish I were mistaken since I needed it just yesterday) and
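
For completeness, the aggregated-logs route only applies once the application has finished and log aggregation is enabled on YARN; a sketch, with the application id as a placeholder:

    yarn logs -applicationId application_1469000000000_0001 > app.log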

Re: NoClassDefFoundError with ZonedDateTime

2016-07-21 Thread Ted Yu
ay to get the Classpath for the spark application > itself? > > On Thu, Jul 21, 2016 at 9:37 PM Ted Yu <yuzhih...@gmail.com> wrote: > >> Might be classpath issue. >> >> Mind pastebin'ning the effective class path ? >> >> Stack trace of NoClassDefFoundError ma

Re: NoClassDefFoundError with ZonedDateTime

2016-07-21 Thread Ilya Ganelin
what's the easiest way to get the Classpath for the spark application itself? On Thu, Jul 21, 2016 at 9:37 PM Ted Yu <yuzhih...@gmail.com> wrote: > Might be classpath issue. > > Mind pastebin'ning the effective class path ? > > Stack trace of NoClassDefFoundError may also h

Re: NoClassDefFoundError with ZonedDateTime

2016-07-21 Thread Ted Yu
Might be classpath issue. Mind pastebin'ning the effective class path ? Stack trace of NoClassDefFoundError may also help provide some clue. On Thu, Jul 21, 2016 at 8:26 PM, Ilya Ganelin <ilgan...@gmail.com> wrote: > Hello - I'm trying to deploy the Spark TimeSeries library

NoClassDefFoundError with ZonedDateTime

2016-07-21 Thread Ilya Ganelin
Hello - I'm trying to deploy the Spark TimeSeries library in a new environment. I'm running Spark 1.6.1 submitted through YARN in a cluster with Java 8 installed on all nodes but I'm getting the NoClassDef at runtime when trying to create a new TimeSeriesRDD. Since ZonedDateTime is part of Java 8
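
If the cluster's default JVM is older than Java 8, one common workaround is to point the YARN containers at a Java 8 install explicitly; a sketch, in which the JAVA_HOME path, class name and jar name are placeholders:

    spark-submit --master yarn \
      --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/lib/jvm/java-8-openjdk \
      --conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-8-openjdk \
      --class com.example.TimeSeriesJob myapp.jar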

Re: spark with breeze error of NoClassDefFoundError

2015-11-19 Thread Ted Yu
*To:* Ted Yu > *Cc:* Jack Yang; Fengdong Yu; user@spark.apache.org > > *Subject:* Re: spark with breeze error of NoClassDefFoundError > > > > Dear Ted, > > I just looked at the link you provided, it is great! > > > > For my understanding, I could also dire

Re: spark with breeze error of NoClassDefFoundError

2015-11-18 Thread Fengdong Yu
> > > From: Ted Yu [mailto:yuzhih...@gmail.com] > Sent: Wednesday, 18 November 2015 4:01 PM > To: Jack Yang > Cc: user@spark.apache.org > Subject: Re: spark with breeze error of NoClassDefFoundError > > Looking in local maven repo, breeze_2.10-0.7.jar contains De

Re: spark with breeze error of NoClassDefFoundError

2015-11-18 Thread 金国栋
LClassLoader.findClass(URLClassLoader.java:354) > > at java.lang.ClassLoader.loadClass(ClassLoader.java:425) > > at java.lang.ClassLoader.loadClass(ClassLoader.java:358) > > ... 10 more > > 15/11/18 17:15:15 INFO util.Utils: Shutdown hook called > &

RE: spark with breeze error of NoClassDefFoundError

2015-11-18 Thread Jack Yang
ubject: Re: spark with breeze error of NoClassDefFoundError The simplest way is to remove all "provided" in your pom, then 'sbt assembly' to build your final package, then get rid of '--jars' because the assembly already includes all dependencies. On Nov 18, 2015, at 2:15 PM, Jack Yang <j...@uow.ed
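
A sketch of that flow (the class name, jar path and Scala version are placeholders; it assumes the sbt-assembly plugin is configured in project/plugins.sbt):

    sbt assembly
    spark-submit --class com.example.MyDriver target/scala-2.10/myapp-assembly-0.1.jar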

Re: spark with breeze error of NoClassDefFoundError

2015-11-18 Thread Zhiliang Zhu
ser@spark.apache.org Subject: Re: spark with breeze error of NoClassDefFoundError   The simplest way is remove all “provided” in your pom.   then ‘sbt assembly” to build your final package. then get rid of ‘—jars’ because assembly already includes all dependencies.             On Nov 18,

Re: spark with breeze error of NoClassDefFoundError

2015-11-18 Thread Zhiliang Zhu
ith breeze error of NoClassDefFoundError   The simplest way is remove all “provided” in your pom.   then ‘sbt assembly” to build your final package. then get rid of ‘—jars’ because assembly already includes all dependencies.             On Nov 18, 2015, at 2:15 PM, Jack Yang <j...@uow.edu.au&

Re: spark with breeze error of NoClassDefFoundError

2015-11-18 Thread Ted Yu
/11/19 10:28:29 INFO util.Utils: Shutdown hook called > > Meanwhile, I will prefer to use maven to compile the jar file rather than > sbt, although it is indeed another option. > > Best regards, > Jack > > > > From: Fengdong Yu [mailto:fengdo...@everstring.com

RE: spark with breeze error of NoClassDefFoundError

2015-11-18 Thread Jack Yang
Yu; user@spark.apache.org Subject: Re: spark with breeze error of NoClassDefFoundError Dear Ted, I just looked at the link you provided, it is great! For my understanding, I could also directly use other Breeze part (except spark mllib package linalg ) in spark (scala or java ) program after

Re: spark with breeze error of NoClassDefFoundError

2015-11-17 Thread Ted Yu
Looking in local maven repo, breeze_2.10-0.7.jar contains DefaultArrayValue : jar tvf /Users/tyu/.m2/repository//org/scalanlp/breeze_2.10/0.7/breeze_2.10-0.7.jar | grep !$ jar tvf /Users/tyu/.m2/repository//org/scalanlp/breeze_2.10/0.7/breeze_2.10-0.7.jar | grep DefaultArrayValue 369 Wed Mar

RE: spark with breeze error of NoClassDefFoundError

2015-11-17 Thread Jack Yang
Re: spark with breeze error of NoClassDefFoundError Looking in local maven repo, breeze_2.10-0.7.jar contains DefaultArrayValue : jar tvf /Users/tyu/.m2/repository//org/scalanlp/breeze_2.10/0.7/breeze_2.10-0.7.jar | grep !$ jar tvf /Users/tyu/.m2/repository//org/scalanlp/breeze_2.10/0.7/breeze_2.

Re: NoClassDefFoundError: scala/collection/GenTraversableOnce$class

2015-07-29 Thread Ted Yu
using scala 2.10.4, and spark was compiled against scala 2.10.x. Perhaps I’m missing something here. Also, the NoClassDefFoundError presents itself when debugging in eclipse, but running directly via the jar, the following error appears: Exception in thread main

NoClassDefFoundError: scala/collection/GenTraversableOnce$class

2015-07-29 Thread Benjamin Ross
version that I'm using. However, that's not the case for me. I'm using scala 2.10.4, and spark was compiled against scala 2.10.x. Perhaps I'm missing something here. Also, the NoClassDefFoundError presents itself when debugging in eclipse, but running directly via the jar, the following

RE: NoClassDefFoundError: scala/collection/GenTraversableOnce$class

2015-07-29 Thread Benjamin Ross
@spark.apache.org Subject: Re: NoClassDefFoundError: scala/collection/GenTraversableOnce$class You can generate dependency tree using: mvn dependency:tree and grep for 'org.scala-lang' in the output to see if there is any clue. Cheers On Wed, Jul 29, 2015 at 5:14 PM, Benjamin Ross br...@lattice

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-04 Thread Cheng Lian
I think this is a bug of Spark SQL dates back to at least 1.1.0. The json_tuple function is implemented as org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple. The ClassNotFoundException should complain with the class name rather than the UDTF function name. The problematic line

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-04 Thread Cheng Lian
Filed https://issues.apache.org/jira/browse/SPARK-6708 to track this. Cheng On 4/4/15 10:21 PM, Cheng Lian wrote: I think this is a bug of Spark SQL dates back to at least 1.1.0. The json_tuple function is implemented as org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple. The

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Akhil Das
How did you build Spark? Which version of Spark do you have? Doesn't this thread already explain it? https://www.mail-archive.com/user@spark.apache.org/msg25505.html Thanks Best Regards On Thu, Apr 2, 2015 at 11:10 PM, Todd Nist tsind...@gmail.com wrote: Hi Akhil, Tried your suggestion

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Todd Nist
I placed it there. It was downloaded from MySql site. On Fri, Apr 3, 2015 at 6:25 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote: Akhil you mentioned /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar . how come you got this lib into spark/lib folder. 1) did you place it there ? 2) What

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Todd Nist
Hi Deepujain, I did include the jar file, I believe it is hive-exec.jar, through the --jars option: ./bin/spark-shell --master spark://radtech.io:7077 --total-executor-cores 2 --driver-class-path /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar --jars

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread ๏̯͡๏
I think you need to include the jar file through --jars option that contains the hive definition (code) of UDF json_tuple. That should solve your problem. On Fri, Apr 3, 2015 at 3:57 PM, Todd Nist tsind...@gmail.com wrote: I placed it there. It was downloaded from MySql site. On Fri, Apr 3,

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Todd Nist
Started the spark shell with the one jar from hive suggested: ./bin/spark-shell --master spark://radtech.io:7077 --total-executor-cores 2 --driver-class-path /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar --jars /opt/apache-hive-0.13.1-bin/lib/hive-exec-0.13.1.jar Results in the same

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread Akhil Das
Copy pasted his command in the same thread. Thanks Best Regards On Fri, Apr 3, 2015 at 3:55 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote: Akhil you mentioned /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar . how come you got this lib into spark/lib folder. 1) did you place it there

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-03 Thread ๏̯͡๏
Akhil you mentioned /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar . how come you got this lib into spark/lib folder. 1) did you place it there ? 2) What is download location ? On Fri, Apr 3, 2015 at 3:42 PM, Todd Nist tsind...@gmail.com wrote: Started the spark shell with the one

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-02 Thread Akhil Das
Try adding all the jars in your $HIVE/lib directory. If you want the specific jar, you could look for jackson or json serde in it. Thanks Best Regards On Thu, Apr 2, 2015 at 12:49 AM, Todd Nist tsind...@gmail.com wrote: I have a feeling I’m missing a Jar that provides the support or could this
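
A sketch of handing everything under the Hive lib directory to the shell as a comma-separated list, assuming $HIVE_HOME points at the Hive 0.13.1 install:

    HIVE_JARS=$(ls $HIVE_HOME/lib/*.jar | tr '\n' ',')
    spark-shell --jars ${HIVE_JARS%,}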

Re: Spark Sql - Missing Jar ? json_tuple NoClassDefFoundError

2015-04-02 Thread Todd Nist
Hi Akhil, Tried your suggestion to no avail. I actually do not see any jackson or json serde jars in the $HIVE/lib directory. This is hive 0.13.1 and spark 1.2.1. Here is what I did: I have added the lib folder to the --jars option when starting the spark-shell, but the job fails. The

Spark error NoClassDefFoundError: org/apache/hadoop/mapred/InputSplit

2015-03-23 Thread , Roy
Hi, I am using CDH 5.3.2 packages installation through Cloudera Manager 5.3.2 I am trying to run one spark job with following command PYTHONPATH=~/code/utils/ spark-submit --master yarn --executor-memory 3G --num-executors 30 --driver-memory 2G --executor-cores 2 --name=analytics

Re: Spark error NoClassDefFoundError: org/apache/hadoop/mapred/InputSplit

2015-03-23 Thread Ted Yu
InputSplit is in hadoop-mapreduce-client-core jar Please check that the jar is in your classpath. Cheers On Mon, Mar 23, 2015 at 8:10 AM, , Roy rp...@njit.edu wrote: Hi, I am using CDH 5.3.2 packages installation through Cloudera Manager 5.3.2 I am trying to run one spark job with
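
A sketch of locating that jar on a CDH node so it can be added via --jars or spark.{driver,executor}.extraClassPath; the parcel path is a guess for this layout and may need adjusting:

    find /opt/cloudera/parcels/CDH -name 'hadoop-mapreduce-client-core*.jar' 2>/dev/null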

Re: NoClassDefFoundError when trying to run spark application

2015-01-02 Thread Pankaj Narang
do you assemble the uber jar ? you can use sbt assembly to build the jar and then run. It should fix the issue -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/NoClassDefFoundError-when-trying-to-run-spark-application-tp20707p20944.html Sent from the Apache

Re: Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-18 Thread Sean Owen
Adding a hadoop-2.6 profile is not necessary. Use hadoop-2.4, which already exists and is intended for 2.4+. In fact this declaration is missing things that Hadoop 2 needs. On Thu, Dec 18, 2014 at 3:46 AM, Kyle Lin kylelin2...@gmail.com wrote: Hi there The following is my steps. And got the
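
A sketch of the suggested build, reusing the existing hadoop-2.4 profile (intended for 2.4 and later) instead of adding a hadoop-2.6 profile to pom.xml:

    mvn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Pyarn -DskipTests clean package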

Re: Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-17 Thread Nicholas Chammas
Spark 1.3 does not exist. Spark 1.2 hasn't been released just yet. Which version of Spark did you mean? Also, from what I can see in the docs http://spark.apache.org/docs/1.1.1/building-with-maven.html#specifying-the-hadoop-version, I believe the latest version of Hadoop that Spark supports is

Re: Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-17 Thread Sean Owen
Spark works fine with 2.4 *and later*. The docs don't mean to imply 2.4 is the last supported version. On Wed, Dec 17, 2014 at 10:19 AM, Nicholas Chammas nicholas.cham...@gmail.com wrote: Spark 1.3 does not exist. Spark 1.2 hasn't been released just yet. Which version of Spark did you mean?

Re: Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-17 Thread Nicholas Chammas
Thanks for the correction, Sean. Do the docs need to be updated on this point, or is it safer for now just to note 2.4 specifically? On Wed Dec 17 2014 at 5:54:53 AM Sean Owen so...@cloudera.com wrote: Spark works fine with 2.4 *and later*. The docs don't mean to imply 2.4 is the last

Re: Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-17 Thread Daniel Haviv
Thanks for your replies. I was building spark from trunk. Daniel On 17 Dec 2014, at 19:49, Nicholas Chammas nicholas.cham...@gmail.com wrote: Thanks for the correction, Sean. Do the docs need to be updated on this point, or is it safer for now just to note 2.4 specifically? On Wed

Re: Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-17 Thread Kyle Lin
Hi there, the following are my steps, and I got the same exception as Daniel. Another question: how can I build a tgz file like the pre-built file I downloaded from the official website? 1. download trunk from git. 2. add the following lines in pom.xml + <profile> + <id>hadoop-2.6</id> +
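
On the tgz question: the downloadable pre-built packages are produced by make-distribution.sh, so a sketch (profile flags as above; the --name label is arbitrary) would be:

    ./make-distribution.sh --tgz --name hadoop2.6 -Phadoop-2.4 -Dhadoop.version=2.6.0 -Pyarn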

Re: Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-16 Thread Kyle Lin
I also got the same problem.. 2014-12-09 22:58 GMT+08:00 Daniel Haviv danielru...@gmail.com: Hi, I've built spark 1.3 with hadoop 2.6 but when I startup the spark-shell I get the following exception: 14/12/09 06:54:24 INFO server.AbstractConnector: Started

Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-09 Thread Daniel Haviv
Hi, I've built spark 1.3 with hadoop 2.6 but when I startup the spark-shell I get the following exception: 14/12/09 06:54:24 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040 14/12/09 06:54:24 INFO util.Utils: Successfully started service 'SparkUI' on port 4040. 14/12/09

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-12-09 Thread Judy Nash
...@cloudera.com] Sent: Tuesday, December 2, 2014 11:35 AM To: Judy Nash Cc: Patrick Wendell; Denny Lee; Cheng Lian; u...@spark.incubator.apache.org Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava On Tue, Dec 2, 2014 at 11:22 AM, Judy Nash judyn...@exchange.microsoft.com

NoClassDefFoundError

2014-12-07 Thread Julius K
Hi everyone, I am new to Spark and encountered a problem. I want to use an external library in a java project and compiling works fine with maven, but during runtime (locally) I get a NoClassDefFoundError. Do I have to put the jars somewhere, or tell spark where they are? I can send the pom.xml
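
In general the external jars either get bundled into an assembly/uber jar or are listed at submit time; a sketch of the latter, in which the class name, jar names and paths are placeholders:

    spark-submit --class com.example.Main \
      --jars /path/to/external-lib.jar,/path/to/its-dependency.jar \
      target/myapp-1.0.jar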

Re: NoClassDefFoundError

2014-12-07 Thread Ted Yu
project and compiling works fine with maven, but during runtime (locally) I get a NoClassDefFoundError. Do I have to put the jars somewhere, or tell spark where they are? I can send the pom.xml and my imports or source code, if this helps you. Best regards Julius Kolbe

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-12-02 Thread Judy Nash
thrift server fail with NoClassDefFoundError on Guava Thanks Judy. While this is not directly caused by a Spark issue, it is likely other users will run into this. This is an unfortunate consequence of the way that we've shaded Guava in this release, we rely on byte code shading of Hadoop itself

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-12-02 Thread Marcelo Vanzin
with NoClassDefFoundError on Guava Thanks Judy. While this is not directly caused by a Spark issue, it is likely other users will run into this. This is an unfortunate consequence of the way that we've shaded Guava in this release, we rely on byte code shading of Hadoop itself as well

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-30 Thread Judy Nash
- From: Patrick Wendell [mailto:pwend...@gmail.com] Sent: Wednesday, November 26, 2014 8:17 AM To: Judy Nash Cc: Denny Lee; Cheng Lian; u...@spark.incubator.apache.org Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava Just to double check - I looked at our own

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-30 Thread Patrick Wendell
: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava Just to double check - I looked at our own assembly jar and I confirmed that our Hadoop configuration class does use the correctly shaded version of Guava. My best guess here is that somehow a separate Hadoop library

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-26 Thread Patrick Wendell
:47 PM To: Judy Nash; Cheng Lian; u...@spark.incubator.apache.org Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava To determine if this is a Windows vs. other configuration, can you just try to call the Spark-class.cmd SparkSubmit without actually

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-26 Thread Patrick Wendell
; Cheng Lian; u...@spark.incubator.apache.org Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava To determine if this is a Windows vs. other configuration, can you just try to call the Spark-class.cmd SparkSubmit without actually referencing the Hadoop

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-25 Thread Judy Nash
, November 24, 2014 11:50 PM To: Cheng Lian; u...@spark.incubator.apache.org Subject: RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava This is what I got from jar tf: org/spark-project/guava/common/base/Preconditions.class org/spark-project/guava/common/math

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-25 Thread Judy Nash
AM To: Judy Nash; u...@spark.incubator.apache.org Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava Oh so you're using Windows. What command are you using to start the Thrift server then? On 11/25/14 4:25 PM, Judy Nash wrote: Made progress but still blocked. After

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-25 Thread Denny Lee
Nash; u...@spark.incubator.apache.org *Subject:* Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava Oh so you're using Windows. What command are you using to start the Thrift server then? On 11/25/14 4:25 PM, Judy Nash wrote: Made progress but still blocked

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-25 Thread Judy Nash
with NoClassDefFoundError on Guava To determine if this is a Windows vs. other configuration, can you just try to call the Spark-class.cmd SparkSubmit without actually referencing the Hadoop or Thrift server classes? On Tue Nov 25 2014 at 5:42:09 PM Judy Nash judyn

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-24 Thread Cheng Lian
SparkContext unsuccessfully. Let me know if you need anything else. *From:*Cheng Lian [mailto:lian.cs@gmail.com] *Sent:* Friday, November 21, 2014 8:02 PM *To:* Judy Nash; u...@spark.incubator.apache.org *Subject:* Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava Hi

RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-24 Thread Judy Nash
...@spark.incubator.apache.org Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava Hm, I tried exactly the same commit and the build command locally, but couldn’t reproduce this. Usually this kind of errors are caused by classpath misconfiguration. Could you please try

latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-21 Thread Judy Nash
Hi, Thrift server is failing to start for me on latest spark 1.2 branch. I got the error below when I start thrift server. Exception in thread main java.lang.NoClassDefFoundError: com/google/common/base/Preconditions at org.apache.hadoop.conf.Configuration$DeprecationDelta.init(Configur

Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

2014-11-21 Thread Cheng Lian
Hi Judy, could you please provide the commit SHA1 of the version you're using? Thanks! On 11/22/14 11:05 AM, Judy Nash wrote: Hi, Thrift server is failing to start for me on latest spark 1.2 branch. I got the error below when I start thrift server. Exception in thread main

NoClassDefFoundError encountered in Spark 1.2-snapshot build with hive-0.13.1 profile

2014-11-03 Thread Terry Siu
I just built the 1.2 snapshot current as of commit 76386e1a23c using: $ ./make-distribution.sh --tgz --name my-spark --skip-java-test -DskipTests -Phadoop-2.4 -Phive -Phive-0.13.1 -Pyarn I drop in my Hive configuration files into the conf directory, launch spark-shell, and then create my

Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build with hive-0.13.1 profile

2014-11-03 Thread Kousuke Saruta
Hi Terry, I think the issue you mentioned will be resolved by the following PR: https://github.com/apache/spark/pull/3072 - Kousuke (2014/11/03 10:42), Terry Siu wrote: I just built the 1.2 snapshot current as of commit 76386e1a23c using: $ ./make-distribution.sh --tgz --name my-spark

Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build with hive-0.13.1 profile

2014-11-03 Thread Terry Siu
@spark.apache.orgmailto:user@spark.apache.org user@spark.apache.orgmailto:user@spark.apache.org Subject: Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build with hive-0.13.1 profile Hi Terry I think the issue you mentioned will be resolved by following PR. https://github.com/apache/spark

Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build with hive-0.13.1 profile

2014-11-03 Thread Michael Armbrust
terry@smartfocus.com, user@spark.apache.org user@spark.apache.org Subject: Re: NoClassDefFoundError encountered in Spark 1.2-snapshot build with hive-0.13.1 profile Hi Terry I think the issue you mentioned will be resolved by following PR. https://github.com/apache/spark/pull/3072

spark-submit results in NoClassDefFoundError

2014-10-29 Thread Tobias Pfeiffer
Hi, I am trying to get my Spark application to run on YARN and by now I have managed to build a fat jar as described on http://markmail.org/message/c6no2nyaqjdujnkq (which is the only really usable manual on how to get such a jar file). My code runs fine using sbt test and sbt run, but when

Re: spark-submit results in NoClassDefFoundError

2014-10-29 Thread Tobias Pfeiffer
Hi again, On Thu, Oct 30, 2014 at 11:50 AM, Tobias Pfeiffer t...@preferred.jp wrote: Spark assembly has been built with Hive, including Datanucleus jars on classpath Exception in thread main java.lang.NoClassDefFoundError: com/typesafe/scalalogging/slf4j/Logger It turned out scalalogging

Re: NoClassDefFoundError on ThreadFactoryBuilder in Intellij

2014-10-28 Thread Stephen Boesch
I had an offline with Akhil, but this issue is still not resolved. 2014-10-24 0:18 GMT-07:00 Akhil Das ak...@sigmoidanalytics.com: Make sure the guava jar http://mvnrepository.com/artifact/com.google.guava/guava/12.0 is present in the classpath. Thanks Best Regards On Thu, Oct 23, 2014

Re: NoClassDefFoundError on ThreadFactoryBuilder in Intellij

2014-10-28 Thread Stephen Boesch
I have checked out from master, cleaned/rebuilt on command line in maven, then cleaned/rebuilt in intellij many times. This error persists through it all. Anyone have a solution? 2014-10-23 1:43 GMT-07:00 Stephen Boesch java...@gmail.com: After having checked out from master/head the

Re: NoClassDefFoundError on ThreadFactoryBuilder in Intellij

2014-10-24 Thread Akhil Das
Make sure the guava jar http://mvnrepository.com/artifact/com.google.guava/guava/12.0 is present in the classpath. Thanks Best Regards On Thu, Oct 23, 2014 at 2:13 PM, Stephen Boesch java...@gmail.com wrote: After having checked out from master/head the following error occurs when attempting
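
A quick check of which Guava version the Maven build actually resolves (a sketch), before adjusting the IntelliJ classpath:

    mvn dependency:tree -Dincludes=com.google.guava:guava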

NoClassDefFoundError on ThreadFactoryBuilder in Intellij

2014-10-23 Thread Stephen Boesch
After having checked out from master/head the following error occurs when attempting to run any test in Intellij Exception in thread main java.lang.NoClassDefFoundError: com/google/common/util/concurrent/ThreadFactoryBuilder at org.apache.spark.util.Utils$.init(Utils.scala:648) There appears to

Re: Spark SQL + Hive + JobConf NoClassDefFoundError

2014-10-01 Thread Patrick McGloin
FYI, in case anybody else has this problem, we switched to Spark 1.1 (outside CDH) and the same Spark application worked first time (once recompiled with Spark 1.1 libs of course). I assume this is because Spark 1.1 is compiled with Hive. On 29 September 2014 17:41, Patrick McGloin

Spark SQL + Hive + JobConf NoClassDefFoundError

2014-09-29 Thread Patrick McGloin
Hi, I have an error when submitting a Spark SQL application to our Spark cluster: 14/09/29 16:02:11 WARN scheduler.TaskSetManager: Loss was due to java.lang.NoClassDefFoundError *java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/JobConf* at

Re: NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass with spark-submit

2014-08-08 Thread Nick Pentreath
By the way, for anyone using elasticsearch-hadoop, there is a fix for this here: https://github.com/elasticsearch/elasticsearch-hadoop/issues/239 Ryan - using the nightly snapshot build of 2.1.0.BUILD-SNAPSHOT fixed this for me. On Thu, Aug 7, 2014 at 3:58 PM, Nick Pentreath

Re: NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass with spark-submit

2014-08-07 Thread Nick Pentreath
I'm also getting this - Ryan, we both seem to be running into this issue with elasticsearch-hadoop :) I tried spark.files.userClassPathFirst true on the command line and that doesn't work. If I put that line in spark/conf/spark-defaults it works, but now I'm getting: java.lang.NoClassDefFoundError:
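
A sketch of what putting that line in spark-defaults amounts to (this is the 1.x-era key the poster used; later releases split it into spark.driver.userClassPathFirst and spark.executor.userClassPathFirst):

    echo "spark.files.userClassPathFirst true" >> $SPARK_HOME/conf/spark-defaults.conf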

Re: Spark Streaming with Kafka NoClassDefFoundError

2014-07-13 Thread Tathagata Das
In case you still have issues with duplicate files in uber jar, here is a reference sbt file with assembly plugin that deals with duplicates https://github.com/databricks/training/blob/sparkSummit2014/streaming/scala/build.sbt On Fri, Jul 11, 2014 at 10:06 AM, Bill Jay

Re: Spark Streaming with Kafka NoClassDefFoundError

2014-07-13 Thread Tathagata Das
Also, the reason spark-streaming-kafka is not included in the spark assembly is that we do not want dependencies of external systems like kafka (which itself probably has a complex dependency tree) to conflict with core spark's functionality and stability. TD On Sun, Jul 13, 2014

Re: Spark Streaming with Kafka NoClassDefFoundError

2014-07-11 Thread Akhil Das
Easiest fix would be adding the kafka jars to the SparkContext while creating it. Thanks Best Regards On Fri, Jul 11, 2014 at 4:39 AM, Dilip dilip_ram...@hotmail.com wrote: Hi, I am trying to run a program with spark streaming using Kafka on a stand alone system. These are my details:
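
A sketch of passing the connector and the Kafka client jars at submit time instead, since spark-streaming-kafka is not part of the Spark assembly; the versions, paths and class name are placeholders, and Kafka's own transitive dependencies (zkclient, metrics-core) must be present as well:

    spark-submit --class com.example.StreamingJob \
      --jars spark-streaming-kafka_2.10-1.0.0.jar,kafka_2.10-0.8.1.1.jar \
      myapp.jar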

Re: Spark Streaming with Kafka NoClassDefFoundError

2014-07-11 Thread Dilip
Hi Akhil, Can you please guide me through this? Because the code I am running already has this in it: [java] SparkContext sc = new SparkContext(); sc.addJar(/usr/local/spark/external/kafka/target/scala-2.10/spark-streaming-kafka_2.10-1.1.0-SNAPSHOT.jar); Is there something I am

Re: Spark Streaming with Kafka NoClassDefFoundError

2014-07-11 Thread Bill Jay
I have met similar issues. The reason is probably because in Spark assembly, spark-streaming-kafka is not included. Currently, I am using Maven to generate a shaded package with all the dependencies. You may try to use sbt assembly to include the dependencies in your jar file. Bill On Thu, Jul

Re: Spark Streaming with Kafka NoClassDefFoundError

2014-07-11 Thread Dilip
A simple sbt assembly is not working. Is there any other way to include particular jars with assembly command? Regards, Dilip On Friday 11 July 2014 12:45 PM, Bill Jay wrote: I have met similar issues. The reason is probably because in Spark assembly, spark-streaming-kafka is not

Re: Spark Streaming with Kafka NoClassDefFoundError

2014-07-11 Thread Bill Jay
You may try to use this one: https://github.com/sbt/sbt-assembly I had an issue of duplicate files in the uber jar file. But I think this library will assemble dependencies into a single jar file. Bill On Fri, Jul 11, 2014 at 1:34 AM, Dilip dilip_ram...@hotmail.com wrote: A simple sbt
