Hi all:
I got a strange error:
bin/spark-shell --deploy-mode client
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
setLogLevel(newLevel).
22/03/21 13:51:39 WARN util.Utils: spark.executor.instances less than
I am seeing this issue when running Spark 3.0.2 on YARN.
Has a resolution been found for this? (I recently upgraded from Spark
2.x on YARN.)
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
Hi, does anyone know how to fix the error below?
java.lang.NoSuchMethodError:
Thank you, Luciano, Shixiong.
I thought the "_2.11" part referred to the Kafka version - an
unfortunate coincidence.
Indeed
spark-submit --jars spark-streaming-kafka-assembly_2.10-1.5.2.jar
my_kafka_streaming_wordcount.py
OR
spark-submit --packages
Hello,
I am trying to set up a simple example with Spark Streaming (Python) and
Kafka on a single machine deployment.
My Kafka broker/server is also on the same machine (localhost:1281) and
I am using Spark Version: spark-1.5.2-bin-hadoop2.6
Python code
...
ssc = StreamingContext(sc,
What's the Scala version of your Spark? Is it 2.10?
Best Regards,
Shixiong Zhu
2015-12-17 10:10 GMT-08:00 Christos Mantas :
> Hello,
>
> I am trying to set up a simple example with Spark Streaming (Python) and
> Kafka on a single machine deployment.
> My Kafka
Unless you built your own Spark distribution with Scala 2.11, you want to
use the 2.10 dependency:
--packages org.apache.spark:spark-streaming-kafka_2.10:1.5.2
On Thu, Dec 17, 2015 at 10:10 AM, Christos Mantas wrote:
> Hello,
>
> I am trying to set up a simple
Will rebuilding Spark help?
From: Fengdong Yu <fengdo...@everstring.com>
Sent: Monday, December 7, 2015 10:31 PM
To: Sunil Tripathy
Cc: user@spark.apache.org
Subject: Re: NoSuchMethodError:
com.fasterxml.jackson.databind.ObjectMapper.enable
Can you try like this in your sbt:
val spark_version = "1.5.2"
val excludeServletApi = ExclusionRule(organization = "javax.servlet", artifact = "servlet-api")
val excludeEclipseJetty = ExclusionRule(organization = "org.eclipse.jetty")
libraryDependencies ++= Seq(
"org.apache.spark" %%
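If the underlying conflict is on Jackson itself, as the ObjectMapper.enable error below suggests, another option is to pin the version sbt resolves. A sketch under the assumption that your Spark build ships Jackson 2.4.x (the exact version is illustrative; check the fasterxml.jackson.version in your Spark distribution's pom):

```scala
// Force a single jackson-databind version across the whole dependency
// graph, so compile-time and runtime classes match. 2.4.4 is assumed here.
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
```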
I am getting the following exception when I use spark-submit to submit a spark
streaming job.
Exception in thread "main" java.lang.NoSuchMethodError:
com.fasterxml.jackson.databind.ObjectMapper.enable([Lcom/fasterxml/jackson/core/JsonParser$Feature;)Lcom/fasterxml/jackson/databind/ObjectMapper;
Hi,
While I am trying to read a JSON file using SQLContext, I get the
following error:
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.sql.SQLContext.<init>(Lorg/apache/spark/api/java/JavaSparkContext;)V
at com.honeywell.test.testhive.HiveSpark.main(HiveSpark.java:15)
The code looks good. Can you check the ‘import’ statements in your code? Because it
calls ‘honeywell.test’?
> On Nov 16, 2015, at 3:02 PM, Yogesh Vyas wrote:
>
> Hi,
>
> While I am trying to read a JSON file using SQLContext, I get the
> following error:
>
> Exception in
Ignore my previous input; I think HiveSpark.java is where your main method is located.
Can you paste the whole pom.xml and your code?
> On Nov 16, 2015, at 3:39 PM, Fengdong Yu wrote:
>
> The code looks good. Can you check the ‘import’ statements in your code? Because it
> calls
Also, make sure the Scala version for your build is 2.11.
> On Nov 16, 2015, at 3:43 PM, Fengdong Yu wrote:
>
> Ignore my previous input; I think HiveSpark.java is where your main method is located.
>
> Can you paste the whole pom.xml and your code?
>
>
>
>
>> On Nov 16,
What’s your SQL?
> On Nov 16, 2015, at 3:02 PM, Yogesh Vyas wrote:
>
> Hi,
>
> While I am trying to read a JSON file using SQLContext, I get the
> following error:
>
> Exception in thread "main" java.lang.NoSuchMethodError:
>
I am trying to just read a JSON file in SQLContext and print the
dataframe as follows:
SparkConf conf = new SparkConf().setMaster("local").setAppName("AppName");
JavaSparkContext sc = new JavaSparkContext(conf);
SQLContext sqlContext = new SQLContext(sc);
DataFrame df =
For some reason you have two different versions of the Spark jars on your
classpath.
Thanks
Best Regards
On Tue, Aug 4, 2015 at 12:37 PM, Deepesh Maheshwari
deepesh.maheshwar...@gmail.com wrote:
Hi,
I am trying to read data from Kafka and process it using Spark.
I have attached my source
Hi,
I am trying to read data from Kafka and process it using Spark.
I have attached my source code and error log.
For integrating Kafka,
I have added this dependency in pom.xml:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
Has anyone met the same problem as me?
2015-06-12 23:40 GMT+08:00 Tao Li litao.bupt...@gmail.com:
Hi all:
I compiled the new Spark 1.4.0 version today. But when I run the WordCount demo,
it throws NoSuchMethodError *java.lang.NoSuchMethodError
Hi all:
I compiled the new Spark 1.4.0 version today. But when I run the WordCount demo, it
throws NoSuchMethodError *java.lang.NoSuchMethodError:
com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer*.
I found the default *fasterxml.jackson.version* is *2.4.4*. Is there
anything wrong
I am trying to process events from a Flume avro sink, but I keep getting this
same error. I am just running it locally using Flume's avro-client, with the
following commands to start the job and the client. It seems like it should be
a configuration problem, since it's a NoSuchMethodError
work in getting SBT to
work. However, at RUN TIME, I have the following output, which complains my
sqlCtx.esRDD() has a
NoSuchMethodError org.apache.spark.sql.catalyst.types.StructField
according to ElasticSearch.
This is a nightmare and I cannot get it to work; does anybody know how to
make
: NoSuchMethodError:
org.apache.spark.streaming.StreamingContext$.toPairDStreamFunctions
NoSuchMethodError almost always means that you have compiled some code
against one version of a library but are running against another. I wonder
if you are including different versions of Spark in your
Hi
I get this exception when I run a Spark test case on my local machine:
An exception or error caused a run to abort:
NoSuchMethodError almost always means that you have compiled some code
against one version of a library but are running against another. I
wonder if you are including different versions of Spark in your
project, or running against a cluster on an older version?
On Thu, Jan 22, 2015 at 3:57 PM
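Since NoSuchMethodError is raised at link time by the JVM, a quick diagnostic is to print which jar the offending class was actually loaded from at runtime and compare that with what you compiled against. A minimal sketch (the class name to pass in comes from your own stack trace; the helper below is hypothetical, not part of any Spark API):

```java
import java.security.CodeSource;

// Prints where a class was loaded from; classes on the JVM's bootstrap
// classpath (e.g. java.lang.String) report no code source.
public class WhichJar {
    static String locationOf(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap classpath" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // Substitute the class from your NoSuchMethodError, e.g.
        // "org.apache.spark.streaming.StreamingContext".
        String name = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(name + " -> " + locationOf(Class.forName(name)));
    }
}
```

Run it with the same classpath as the failing job (e.g. via java -cp with your assembly jar) and check whether the reported jar matches the version you built against.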
Good luck. Let me know if I can assist you further.
Regards
-Pankaj
Linkedin
https://www.linkedin.com/profile/view?id=171566646
Skype
pankaj.narang
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-com-typesafe-config-Config-getDuration
dependencyOverrides += "com.typesafe" % "config" % "1.2.1"
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-com-typesafe-config-Config-getDuration-with-akka-http-akka-stream-tp20926.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
(FlowMaterializer.scala:256)
I think there is a version mismatch in the jars you use at runtime.
If you need more help, add me on Skype: pankaj.narang
---Pankaj
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-com-typesafe-config-Config
Hi,
I am intending to save the streaming data from kafka into Cassandra, using
spark-streaming:
But there seems to be a problem with the line
javaFunctions(data).writerBuilder("testkeyspace", "test_table",
mapToRow(TestTable.class)).saveToCassandra();
I am getting NoSuchMethodError.
The code, the error
wrote:
Hi,
I am intending to save the streaming data from kafka into Cassandra, using
spark-streaming:
But there seems to be a problem with the line
javaFunctions(data).writerBuilder("testkeyspace", "test_table",
mapToRow(TestTable.class)).saveToCassandra();
I am getting NoSuchMethodError.
The code
From: Gerard Maas gerard.m...@gmail.com
Sent: Tuesday, December 9, 2014 4:39 PM
To: Sarosh, M.
Cc: spark users
Subject: Re: NoSuchMethodError: writing spark-streaming data to cassandra
You're using two conflicting versions of the connector: the Scala version at
1.1.0
Hi,
I have a Spark application which uses the Cassandra connector
spark-cassandra-connector-assembly-1.2.0-SNAPSHOT.jar to load
data from Cassandra into Spark.
Everything works fine in local mode, when I run it in my IDE. But when I
submit the application to be executed on a standalone Spark server,
Hi,
It looks like you are building from master
(spark-cassandra-connector-assembly-1.2.0).
- Append this to your com.google.guava declaration: % "provided"
- Be sure your version of the connector dependency is the same as the assembly
build. For instance, if you are using 1.1.0-beta1, build your
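Putting those two suggestions together, the sbt dependencies might look like the sketch below (both version strings are illustrative; the connector version must match the assembly jar you actually deploy):

```scala
libraryDependencies ++= Seq(
  // Must be the same version as the connector assembly on the cluster.
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0-beta1",
  // Guava is supplied by the Spark/connector runtime, so don't bundle it.
  "com.google.guava" % "guava" % "16.0" % "provided"
)
```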
Sasi
--
link. During spark-submit, we faced some JAR-related issues, and we
resolved them using the --jars option for spark-submit. However, we are stuck with
NoSuchMethodError: cassandra.thrift.ITransportFactory.openTransport()
Find enclosed image for the complete error.
http://apache-spark-user-list.1001560.n3
sample code for
connecting to cassandra using
https://github.com/datastax/spark-cassandra-connector/blob/b1.0/doc/0_quick_start.md
link. During spark-submit, we faced some JAR-related issues, and we
resolved them using the --jars option for spark-submit. However, we are stuck
with
NoSuchMethodError: cassandra.thrift.ITransportFactory.openTransport()
Find enclosed image for the complete error.
http://apache-spark-user-list.1001560.n3.nabble.com/file/n17338/Error.png
We included the following JARs using the --jars option for spark-submit:
a) apache-cassandra-thrift-1.1.10.jar
b
For posterity's sake, I solved this. The problem was that the Cloudera cluster
I was submitting to was running 1.0, and I was compiling against the latest
1.1 release. Downgrading to 1.0 for my compile got me past this.
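A general way to avoid this compile-versus-cluster drift is to mark the Spark artifacts as provided in the build, so the cluster's own jars are the only Spark classes on the runtime classpath. A Maven sketch (the version shown is illustrative and must match your cluster):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.0</version>
  <!-- compiled against, but not bundled into the application jar -->
  <scope>provided</scope>
</dependency>
```

Running `mvn dependency:tree -Dincludes=org.apache.spark` is a quick way to confirm which Spark versions the build actually resolves.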
On Tue, Oct 14, 2014 at 6:08 PM, Michael Campbell
michael.campb...@gmail.com
How did you resolve it?
On Tue, Jul 15, 2014 at 3:50 AM, SK skrishna...@gmail.com wrote:
The problem is resolved. Thanks.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/jsonRDD-NoSuchMethodError-tp9688p9742.html
Hey all, I'm trying a very basic Spark SQL job, and apologies as I'm new to
a lot of this, but I'm getting this failure:
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.sql.SchemaRDD.take(I)[Lorg/apache/spark/sql/catalyst/expressions/Row;
I've tried a variety of uber-jar
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Kryo-NoSuchMethodError-on-Spark-1-0-0-standalone-tp9746.html
!= )
val data = sqlc.jsonRDD(jrdd)
data.printSchema()
}
}
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/jsonRDD-NoSuchMethodError-tp9688.html
Have you upgraded the cluster where you are running this to 1.0.1 as
well? A NoSuchMethodError
almost always means that the class files available at runtime are different
from those that were there when you compiled your program.
On Mon, Jul 14, 2014 at 7:06 PM, SK skrishna...@gmail.com wrote
this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-tp2209p8953.html
a
bit of a scala newbie.)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-tp2209p8953.html
Try deleting the .ivy2 directory in your home directory and then doing an sbt clean
assembly; that would solve this issue, I guess.
Thanks
Best Regards
On Thu, Jun 26, 2014 at 3:10 AM, Robert James srobertja...@gmail.com
wrote:
In case anyone else is having this problem: deleting all of Ivy's cache,
then doing an sbt
Hi, Robert --
I wonder if this is an instance of SPARK-2075:
https://issues.apache.org/jira/browse/SPARK-2075
-- Paul
—
p...@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/
On Wed, Jun 25, 2014 at 6:28 AM, Robert James srobertja...@gmail.com
wrote:
On 6/24/14, Robert James
My app works fine under Spark 0.9. I just tried upgrading to Spark
1.0 by downloading the Spark distro to a dir, changing the sbt file,
and running sbt assembly, but now I get NoSuchMethodErrors when trying
to use spark-submit.
I copied in the SimpleApp example from
the project)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Upgrading-to-Spark-1-0-0-causes-NoSuchMethodError-tp8207p8220.html
On 6/24/14, Peng Cheng pc...@uow.edu.au wrote:
I got a 'NoSuchFieldError', which is of the same type. It's definitely a
dependency jar conflict. The Spark driver will load its own jars, which in
recent versions pull in many dependencies that are 1-2 years old. And if your
newer-version dependency is in
-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-tp2209p7347.html
JSON fan).
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-tp2209p7347.html
in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-tp2209p7347.html
breeze_2.10-0.7.jar
almost all the jars related to Breeze I can find, but still
NoSuchMethodError:
breeze.linalg.DenseMatrix
From the executor stderr, you can see the executor successfully fetches
these jars. What's wrong
with my method? Thank you!
14/05/14 20:36:02 INFO Executor: Fetching
Finally I fixed it. The previous failure was caused by missing jars.
I pasted the classpath from local mode to the workers, obtained using show
compile:dependencyClasspath,
and it works!
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-breeze-linalg
jars to workers using sc.addJar()
breeze jars include :
breeze-natives_2.10-0.7.jar
breeze-macros_2.10-0.3.jar
breeze-macros_2.10-0.3.1.jar
breeze_2.10-0.8-SNAPSHOT.jar
breeze_2.10-0.7.jar
almost all the jars related to Breeze I can find, but still NoSuchMethodError
NoSuchMethodError:
breeze.linalg.DenseMatrix
From the executor stderr, you can see the executor successfully fetches
these jars. What's wrong
with my method? Thank you!
14/05/14 20:36:02 INFO Executor: Fetching
http://192.168.0.106:42883/jars/breeze-natives_2.10-0.7.jar with timestamp
1400070957376
14/05
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-breeze-linalg-DenseMatrix-tp5310p5355.html
opinion, everything needed is packaged into the jar file, isn't it?
And has anyone used Breeze before? Is it good for matrix operations?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-breeze-linalg-DenseMatrix-tp5310.html
$implOpMulMatrix_DMD_DMD_eq_DMD$;
in my opinion, everything needed is packaged into the jar file, isn't it?
And has anyone used Breeze before? Is it good for matrix operations?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-breeze-linalg-DenseMatrix
to the jar file,
isn't it?
And has anyone used Breeze before? Is it good for matrix operations?
I fixed it.
I made my sbt project depend on
spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar
and it works.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-tp4937p5096.html
/;
is there something that needs to be modified?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-tp4937p5076.html
I am seeing the following exception from a very basic test project when it
runs on spark local.
java.lang.NoSuchMethodError:
org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;
The project is built with Java 1.6, Scala 2.10.3 and spark 0.9.1
this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-Akka-Props-tp2191p2375.html
-user-list.1001560.n3.nabble.com/NoSuchMethodError-Akka-Props-tp2191p2377.html
days - as I am a Java Guy and would like
to develop downstream code in Java
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-tp2209.html
Please remove me from the mailing list.
-----Original Message-----
From: Deepak Nulu [mailto:deepakn...@gmail.com]
Sent: March 7, 2014 7:45
To: u...@spark.incubator.apache.org
Subject: Re: NoSuchMethodError - Akka - Props
I see the same error. I am trying a standalone example integrated into a Play
Framework v2.2.2
81 matches