could
use a naive implementation that creates a new connection for
every RDD from the DStream in 4.3.1. This resulted in the
ClassNotFoundException described in [1], so I switched to 4.4.0.
Unfortunately the saveToPhoenix method is only available in Scala. So I did
find the suggestion to try it via the saveAsNewAPIHadoopFile
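The naive per-RDD connection mentioned above is usually avoided by opening the connection inside foreachPartition; a minimal sketch, assuming a hypothetical createConnection()/save() pair rather than any real Phoenix API:

```scala
// Sketch only: one connection per partition per batch, instead of one per
// RDD (or, worse, one per record). createConnection and save are
// hypothetical placeholders, not part of any real Phoenix API.
dstream.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    val conn = createConnection()   // hypothetical connection factory
    try {
      records.foreach(record => save(conn, record))  // hypothetical writer
    } finally {
      conn.close()
    }
  }
}
```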
So the above class is in the jar which was in the classpath?
Can you tell us a bit more about Schema$MyRow?
On Fri, May 1, 2015 at 8:05 AM, Akshat Aranya aara...@gmail.com wrote:
Hi,
I'm getting a ClassNotFoundException at the executor when trying to
register a class for Kryo serialization:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance
Caused by: java.lang.ClassNotFoundException: com.example.Schema$MyRow
org.apache.spark.sql.UserDefinedFunction, meaning:
val predict = udf((score: Double) => if (score > 0.5) true else false)
df.select(predict(df("score")))
All compiles just fine but when I run it, I get a ClassNotFoundException
(see more details below).
I am sure that I load the data correctly and that I have a field called
score with the correct data type.
Do I need to do anything else like registering
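For reference, the UDF in the question can be written as a self-contained sketch (the DataFrame df and the score column come from the thread; the if/else can be dropped since the comparison already yields a Boolean):

```scala
// Sketch of the UDF pattern from the question, simplified: score > 0.5 is
// already a Boolean, so the if/else branch is redundant.
import org.apache.spark.sql.functions.udf

val predict = udf((score: Double) => score > 0.5)
val predictions = df.select(predict(df("score")).as("prediction"))
```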
Hi Kevin,
Yes, I can test it. Does that mean I have to build Spark from the git repository?
Ralph
On 17.03.15 at 02:59, Kevin (Sangwoo) Kim wrote:
Hi Ralph,
It seems like https://issues.apache.org/jira/browse/SPARK-6299 issue,
which I'm working on.
I submitted a PR for it, would you test it?
Hi,
I want to try the JavaSparkPi example[1] on a remote Spark server but I
get a ClassNotFoundException.
When I run it locally it works, but not remotely.
I added the spark-core lib as dependency. Do I need more?
Any ideas?
Thanks Ralph
[1] ...
https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples
I'm having an issue with spark 1.2.1 and scala 2.11. I detailed the
symptoms in this stackoverflow question.
http://stackoverflow.com/questions/28612837/spark-classnotfoundexception-when-running-hello-world-example-in-scala-2-11
Has anyone experienced anything similar?
Thank you!
Already come up several times today:
https://issues.apache.org/jira/browse/SPARK-5557
On Tue, Feb 3, 2015 at 8:04 AM, Night Wolf nightwolf...@gmail.com wrote:
Hi,
I just built Spark 1.3 master using maven via make-distribution.sh;
./make-distribution.sh --name mapr3 --skip-java-test --tgz -Pmapr3 -Phive
-Phive-thriftserver -Phive-0.12.0
When trying to start the standalone spark master on a cluster I get the
following stack trace;
15/02/04 08:53:56
Here is the relevant snippet of code in my main program:
===
sparkConf.set("spark.serializer",
  "org.apache.spark.serializer.KryoSerializer")
sparkConf.set("spark.kryo.registrationRequired", "true")
val summaryDataClass = classOf[SummaryData]
val summaryViewClass
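With spark.kryo.registrationRequired set to true, every class that Kryo serializes must be registered up front. A hedged sketch of how the truncated snippet above presumably continues; SummaryData is named in the snippet, while SummaryView is an assumed name for the truncated summaryViewClass line:

```scala
// Sketch: register the application classes so Kryo's registrationRequired
// check passes. registerKryoClasses is the standard SparkConf helper
// available since Spark 1.2.
sparkConf.registerKryoClasses(Array(
  classOf[SummaryData],
  classOf[SummaryView]  // assumed class name; the original line is truncated
))
```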
Thanks for the notification!
For now, I'll use the Kryo serializer without registering classes until the
bug fix has been merged into the next version of Spark (I guess that will
be 1.3, right?).
arun
On Sun, Feb 1, 2015 at 10:58 PM, Shixiong Zhu zsxw...@gmail.com wrote:
It's a bug that has been fixed in https://github.com/apache/spark/pull/4258
but not yet been merged.
Best Regards,
Shixiong Zhu
2015-02-02 10:08 GMT+08:00 Arun Lists lists.a...@gmail.com:
Here is the relevant snippet of code in my main program:
===
that to the spark.executor.extraClassPath and
spark.driver.extraClassPath: no luck either.
At this stage I just think that the uber-jar and classpath are OK. I have no
more clues of what can be happening. Maybe some classloader issue with Spark
SQL?
The ClassNotFoundException occurs when returning data back to the driver
(because of the ResultTask seen in the stacktrace).
Does anyone have
, then running
simple test fails on the ClassNotFoundException (even if there is only one node
which hosts both the master and the worker).
If I run the workers and masters from the local drive
(c:\source\spark-1.1.0-bin-hadoop2.4), then the simple test runs ok (with one
or two nodes)
I haven’t
-3.2.2.jar
Executor classpath
is:/S:/spark-1.1.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.1.jar
Executor classpath is:/S:/spark/simple/
Would you have any idea how I could investigate further ?
Thanks !
Benoit.
PS: I could attach a debugger to the Worker where the ClassNotFoundException
happens but it is a bit painful
Hi
I am using Spark 1.1.0 configured with the STANDALONE clusterManager and
CLUSTER deployMode. I want to submit multiple jars with
spark-submit using the --jars option, but I got a ClassNotFoundException.
By the way, in my code I also use the thread context class loader to load
Hi,
I'm having problems with a ClassNotFoundException using this simple example:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import java.net.URLClassLoader
import scala.util.Marshal
class ClassToRoundTrip(val id: Int) extends
Hi,
Yes, the error still occurs when we replace the lambdas with named
functions:
(same error traces as in previous posts)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFoundException-line11-read-when-loading-an-HDFS-text-file-with-SparkQL-in-spark-shell-tp9954.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Note that running a simple map+reduce job on the same hdfs files with the
same installation works fine:
Did you call collect() on the totalLength? Otherwise nothing has actually
executed.
Oh, I'm sorry... reduce is also an operation
On Wed, Jul 16, 2014 at 3:37 PM, Michael Armbrust mich...@databricks.com
wrote:
Hi Michael,
Thanks for your reply. Yes, the reduce triggered the actual execution, I got
a total length (totalLength: 95068762, for the record).
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFoundException-line11-read-when-loading-an-HDFS
Gaurav,
I am not sure that the * expands to what you expect it to do.
Normally the bash expands * to a space-separated string, not
colon-separated. Try specifying all the jars manually, maybe?
Tobias
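Tobias's point is easy to verify in a shell: a glob expands to space-separated words, so a colon-separated classpath (or a comma-separated --jars list) has to be built explicitly. A small illustration with throwaway files in a temp directory (paths are made up for the demo):

```shell
# Show that a glob yields space-separated words, then build the
# colon-separated string a classpath actually needs.
DIR=$(mktemp -d)
touch "$DIR/a.jar" "$DIR/b.jar"

echo "$DIR"/*.jar        # space-separated: "<dir>/a.jar <dir>/b.jar"

CP=$(echo "$DIR"/*.jar | tr ' ' ':')
echo "$CP"               # colon-separated: "<dir>/a.jar:<dir>/b.jar"
```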
On Thu, Jun 5, 2014 at 6:45 PM, Gaurav Dasgupta gaurav.d...@gmail.com wrote:
Hi,
I have written my own custom Spark streaming code which connects to Kafka
server and fetch data. I have tested the code on local mode and it is
working fine. But when I am executing the same code on YARN mode, I am
getting KafkaReceiver class not found exception. I am providing the Spark
Hi,
I have set up a cluster with Mesos (backed by Zookeeper) with three
master and three slave instances. I set up Spark (git HEAD) for use
with Mesos according to this manual:
http://people.apache.org/~pwendell/catalyst-docs/running-on-mesos.html
Using the spark-shell, I can connect to this
Hi Tobias,
Regarding my comment on closure serialization:
I was discussing it with my fellow Sparkers here and I totally overlooked
the fact that you need the class files to de-serialize the closures (or
whatever) on the workers, so you always need the jar file delivered to the
workers in order
Hi Tobias,
On Wed, May 21, 2014 at 5:45 PM, Tobias Pfeiffer t...@preferred.jp wrote:
first, thanks for your explanations regarding the jar files!
No prob :-)
On Thu, May 22, 2014 at 12:32 AM, Gerard Maas gerard.m...@gmail.com
wrote:
I was discussing it with my fellow Sparkers here and I
Here's the 1.0.0rc9 version of the docs:
https://people.apache.org/~pwendell/spark-1.0.0-rc9-docs/running-on-mesos.html
I refreshed them with the goal of steering users more towards prebuilt
packages than relying on compiling from source plus improving overall
formatting and clarity, but not
Hi Andrew,
Thanks for the current doc.
I'd almost gotten to the point where I thought that my custom code needed
to be included in the SPARK_EXECUTOR_URI but that can't possibly be
correct. The Spark workers that are launched on Mesos slaves should start
with the Spark core jars and then
I just ran into the same problem. I will respond if I find how to fix.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFoundException-tp5182p5342.html
with
respect to SPARK_HOME, in Spark 0.9.1 so that the classes can be found?
thanks
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/spark-0-9-1-ClassNotFoundException-tp5256.html
)
at
scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFoundException-tp5182.html
Just out of curiosity, since you are using Cloudera Manager Hadoop and Spark:
how did you build Shark for it?
Are you able to read any file from HDFS? Did you try that out?
Regards,
Arpit Tak
On Thu, Apr 17, 2014 at 7:07 PM, ge ko koenig@gmail.com wrote:
Hi,
the error