Hi,
Just wondering if you've seen the following error when reading from Kafka:

ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.lang.NoClassDefFoundError: scala/reflect/ClassManifest
    at kafka.utils.Log4jController$.init(Log4jController.scala:29)
    at ...
Looks like the Kafka jar that you are using isn't compatible with your Scala version.
Thanks
Best Regards
Thanks! How do I find out which Kafka jar to use for Scala 2.10.4?
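For reference, the Scala version is encoded in the artifact name, so the kafka_2.10-*.jar family is the one built for Scala 2.10.x (which includes 2.10.4). A sketch of the matching sbt dependency, with an illustrative 0.8.x version from that era rather than one prescribed by the thread:

    // Scala 2.10.x (e.g. 2.10.4) matches the _2.10 artifact suffix;
    // the version number below is illustrative only
    libraryDependencies += "org.apache.kafka" % "kafka_2.10" % "0.8.1.1"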
I am using kafka_2.10-1.1.0.jar on Spark 1.1.0.
On Fri, Aug 29, 2014 at 1:30 PM, Aris wrote:
Hi folks,
I am trying to use Kafka with Spark Streaming, and it appears I cannot do the normal 'sbt package' as I do with other Spark applications, such as Spark alone or Spark with MLlib. I learned I have to build with the sbt-assembly plugin.
OK, so here is my build.sbt file for my extremely
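The file itself is cut off above; what follows is a minimal sketch of such a build.sbt, with illustrative names and versions from the Spark 1.x era rather than Aris's actual file. sbt-assembly also needs a line in project/plugins.sbt, shown here as a comment:

    // project/plugins.sbt would carry, e.g.:
    //   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

    name := "kafka-streaming-app"

    version := "0.1.0"

    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      // provided by the Spark runtime, so keep these out of the fat jar
      "org.apache.spark" %% "spark-core"      % "1.1.0" % "provided",
      "org.apache.spark" %% "spark-streaming" % "1.1.0" % "provided",
      // NOT part of the Spark assembly, so it must be bundled
      "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
    )

With spark-core and spark-streaming marked "provided", the assembled jar carries only the Kafka connector and its dependency tree, which is exactly the part the cluster does not already have.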
{
  case e: Exception => println("exception caught: " + e)
}
}
myself but I can't find any documentation specifically for building Spark Streaming.
A simple

    sbt assembly

is not working. Is there any other way to include particular jars with the assembly command?
Regards,
Dilip
Also, the reason spark-streaming-kafka is not included in the Spark assembly is that we do not want dependencies of external systems like Kafka (which itself probably has a complex dependency tree) to cause conflicts with core Spark's functionality and stability.
TD
On Sun, Jul 13, 2014
The easiest fix would be adding the Kafka jars to the SparkContext while creating it.
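A sketch of that approach with the stock Spark 1.x API; the jar paths below are illustrative placeholders, not taken from the original mail:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // ship the Kafka jars to the executors by listing them on the conf
    val conf = new SparkConf()
      .setAppName("KafkaStreamingApp")
      .setJars(Seq(
        "/path/to/kafka_2.10-0.8.0.jar",
        "/path/to/spark-streaming-kafka_2.10-1.0.0.jar"))
    val ssc = new StreamingContext(conf, Seconds(10))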
Thanks
Best Regards
On Fri, Jul 11, 2014 at 4:39 AM, Dilip dilip_ram...@hotmail.com wrote:
Hi,
I am trying to run a program with Spark Streaming using Kafka on a standalone system. These are my details:
Spark 1.0.0 hadoop2
Scala 2.10.3
I am trying a simple program using my custom sbt project but this is the error I am getting:
Exception in thread main
I have met similar issues. The reason is probably that spark-streaming-kafka is not included in the Spark assembly. Currently, I am using Maven to generate a shaded package with all the dependencies. You may try to use sbt assembly to include the dependencies in your jar file.
Bill
On Thu, Jul
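For the sbt assembly route, the usual sticking point is duplicate files under META-INF from overlapping jars; a sketch of the blunt merge strategy commonly appended to build.sbt with the sbt-assembly 0.11.x plugin API (assumed here, not quoted from the thread):

    import AssemblyKeys._

    assemblySettings

    // discard conflicting jar metadata, keep the first copy of anything else
    mergeStrategy in assembly := {
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case _                             => MergeStrategy.first
    }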
I closed that PR for other reasons. This change is still proposed by itself:
https://issues.apache.org/jira/browse/SPARK-2034
https://github.com/apache/spark/pull/980
On Fri, Jun 6, 2014 at 1:35 AM, Tobias Pfeiffer t...@preferred.jp wrote:
Sean,
your patch fixes the issue, thank you so much!
Hi,
I am trying to use Spark Streaming with Kafka, which works like a
charm -- except for shutdown. When I run my program with sbt
run-main, sbt will never exit, because there are two non-daemon
threads left that don't die.
I created a minimal example at
https://gist.github.com/tgpfeiffer
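For context, the explicit shutdown sequence in the Spark 1.x API is sketched below; in this thread the leaked threads lived inside a network library, so user-side code like this did not cure it, and the patch linked above was the actual fix. Here ssc stands in for the StreamingContext from the gist, and the timeout is arbitrary:

    ssc.start()
    // wait up to 60 seconds in this sketch, then shut down the streaming
    // computation together with the underlying SparkContext
    ssc.awaitTermination(60000)
    ssc.stop(stopSparkContext = true, stopGracefully = true)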
Sean,
your patch fixes the issue, thank you so much! (This is the second time within one week I've run into network libraries not shutting down threads properly; I'm really glad your code fixes the issue.)
I saw your pull request is closed but not merged yet. Can I do anything to get your fix into
Hi All,
I am using Spark Streaming with Kafka. I receive messages, and after minor processing I write them to HDFS; as of now I am using the saveAsTextFiles() / saveAsHadoopFiles() Java methods.
- Is there some default way of writing the stream to Hadoop, like the HDFS sink concept we have in Flume? I mean
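In the Spark 1.x releases discussed here, saveAsTextFiles() / saveAsHadoopFiles() were the stock way to land a DStream in HDFS; there was no separate sink abstraction like Flume's. A sketch of the call, where processedStream and the HDFS path are illustrative stand-ins:

    // every batch interval produces one directory named
    //   <prefix>-<batch time in ms>.<suffix>
    // containing the usual Hadoop part-* files for that batch
    processedStream.saveAsTextFiles(
      "hdfs://namenode:8020/user/spark/out/batch", "txt")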