I am invoking it from the Java application by creating the SparkContext.
On Tue, Jun 23, 2015 at 12:17 PM, Tathagata Das t...@databricks.com wrote:
How are you adding that to the classpath? Through spark-submit or
otherwise?
On Mon, Jun 22, 2015 at 5:02 PM, Murthy Chelankuri kmurt...@gmail.com wrote:
Yes, I have the producer in the classpath, and I am using standalone mode.
Sent from my iPhone
On 23-Jun-2015, at 3:31 am, Tathagata Das t...@databricks.com wrote:
So you have Kafka in your classpath in your Java application, where you are
creating the SparkContext with the Spark standalone master URL, right?
The recommended way of submitting spark applications to any cluster is
using spark-submit. See
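A minimal spark-submit sketch of the recommended approach; the class name, master host, and jar paths below are placeholders, not details from this thread:

```shell
# Submit the application together with its Kafka dependencies; --jars ships
# each listed jar to the driver and executors so they end up on the classpath.
spark-submit \
  --class com.example.MyStreamingApp \
  --master spark://master-host:7077 \
  --jars kafka_2.10-0.8.2.1.jar,my-encoders.jar \
  my-streaming-app.jar
```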
Yes, in Spark standalone mode with the master URL.
Jars are copied to the executor and the application runs fine, but it fails
at the point where Kafka tries to load the Encoder and Partitioner classes
using some reflection mechanism. Here are my findings so far.
This could be because of some subtle change in the classloaders used by
executors. I think there have been issues in the past with libraries that
use Class.forName to find classes by reflection. Because the executors load
classes dynamically using custom classloaders, libraries that use
Class.forName may not see classes added that way.
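A small self-contained sketch of that pitfall: the one-argument Class.forName(name) resolves against the caller's defining classloader, while jars shipped to Spark executors are visible only through the executor's custom loader (installed as the task thread's context classloader). A library that picks the wrong loader throws ClassNotFoundException even though the class was shipped:

```java
public class ClassLoaderDemo {
    // Resolve a class against an explicit loader, the way classloader-aware
    // libraries should (the three-argument form of Class.forName).
    public static Class<?> load(String name, ClassLoader loader)
            throws ClassNotFoundException {
        return Class.forName(name, false, loader);
    }

    public static void main(String[] args) throws Exception {
        // The context classloader is the one Spark sets on executor task threads.
        ClassLoader ctx = Thread.currentThread().getContextClassLoader();
        System.out.println(load("java.lang.String", ctx).getName());
        try {
            // Asking a loader that does not know the class reproduces the
            // failure from this thread (the class is not on our classpath):
            load("com.abc.mq.msg.ObjectEncoder", ClassLoader.getSystemClassLoader());
        } catch (ClassNotFoundException e) {
            System.out.println("not found: " + e.getMessage());
        }
    }
}
```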
I am using Spark Streaming. What I am trying to do is send a few messages
to a Kafka topic; that is where it is failing.
java.lang.ClassNotFoundException: com.abc.mq.msg.ObjectEncoder
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at
I have been using Spark for the last 6 months, on version 1.2.0. I am
trying to migrate to 1.3.0, but the same program I have written is not
working. It gives a class-not-found error when I try to load some dependent
jars from the main program.
This used to work in 1.2.0 when set
Yes.
Thanks
Best Regards
On Mon, Jun 22, 2015 at 8:33 PM, Murthy Chelankuri kmurt...@gmail.com
wrote:
I have more than one jar. Can we call sc.addJar multiple times, once for
each dependent jar?
On Mon, Jun 22, 2015 at 8:30 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
Try sc.addJar instead
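Yes, sc.addJar takes a single path, so the usual pattern is simply one call per dependent jar. A stand-in sketch without a Spark dependency; addAll mimics repeated sc.addJar calls and the jar names are made up:

```java
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class AddJarsDemo {
    // Records what repeated sc.addJar(jar) calls would distribute to
    // executors; on a real SparkContext each call ships one jar.
    static Set<String> addAll(List<String> jars) {
        Set<String> distributed = new LinkedHashSet<>();
        for (String jar : jars) {
            distributed.add(jar); // stand-in for sc.addJar(jar)
        }
        return distributed;
    }

    public static void main(String[] args) {
        Set<String> shipped = addAll(Arrays.asList("kafka-clients.jar", "my-encoders.jar"));
        System.out.println(shipped.size());
    }
}
```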
On 23-Jun-2015, at 3:31 am, Tathagata Das t...@databricks.com wrote:
Do you have the Kafka producer in your classpath? If so, how are you adding
that library? Are you running on YARN, Mesos, Standalone, or local? These
details will be very useful.
On Mon, Jun 22, 2015 at 8:34 AM, Murthy Chelankuri kmurt...@gmail.com
wrote:
I am using Spark Streaming. What I am trying to do is send a few messages
to a Kafka topic; that is where it is failing.