Hi,
Is there any code to implement a Kafka output for Spark Streaming? My use case is that all the output needs to be dumped back to the Kafka cluster again after the data is processed. What would be the guideline for implementing such a function? I heard foreachRDD will create one instance of the producer per batch?
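There is no built-in Kafka output operation at this point, so the usual pattern is foreachRDD plus foreachPartition, so that one producer is created per partition (per task) rather than per record or on the driver. A minimal sketch, assuming the Kafka 0.8 Scala producer API and a DStream[String] named `lines`; the broker list and topic name are placeholders:

```scala
import java.util.Properties
import kafka.producer.{KeyedMessage, Producer, ProducerConfig}

lines.foreachRDD { rdd =>
  rdd.foreachPartition { partition =>
    // Create one producer per partition, not per record, and not on the
    // driver: the producer holds sockets and is not serializable.
    val props = new Properties()
    props.put("metadata.broker.list", "kafka01:9092,kafka02:9092") // placeholder brokers
    props.put("serializer.class", "kafka.serializer.StringEncoder")
    val producer = new Producer[String, String](new ProducerConfig(props))
    partition.foreach { msg =>
      producer.send(new KeyedMessage[String, String]("output-topic", msg)) // placeholder topic
    }
    producer.close()
  }
}
```

Creating the producer inside foreachPartition keeps the per-batch cost to one connection per task instead of one per record.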
Hi,
It might be a very general question to ask here, but I'm curious to know why Spark Streaming can achieve better throughput than Storm, as claimed in the Spark Streaming paper. Does it depend on certain use cases and/or the data source? What drives the better performance in the Spark Streaming case?
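Not an authoritative answer, but the explanation usually given is micro-batching: Spark Streaming pays fixed per-message costs (scheduling, acknowledgement, serialization setup) once per batch, while a record-at-a-time system pays them per record. A toy amortization calculation, with made-up overhead numbers purely for illustration:

```scala
// Toy model: each unit of work costs a fixed overhead plus a per-record cost.
val perRecordCostUs = 1.0   // hypothetical processing cost per record (microseconds)
val fixedOverheadUs = 100.0 // hypothetical fixed cost per unit of work (microseconds)

// Record-at-a-time: the fixed cost is paid for every single record.
val recordAtATimeUs = fixedOverheadUs + perRecordCostUs

// Micro-batching with 1000 records per batch: the fixed cost is amortized.
val batchSize = 1000
val microBatchPerRecordUs = fixedOverheadUs / batchSize + perRecordCostUs

println(f"record-at-a-time: $recordAtATimeUs%.1f us/record")
println(f"micro-batch:      $microBatchPerRecordUs%.1f us/record")
```

The trade-off is latency: records wait up to one batch interval before being processed, which is why Storm can still win on per-record latency even when Spark Streaming wins on throughput.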
In the Spark Kafka example, it says:
`./bin/run-example org.apache.spark.streaming.examples.KafkaWordCount local[2] zoo01,zoo02,zoo03 my-consumer-group topic1,topic2 1`
Can anyone tell me what local[2] represents? I thought the master URL should be something like spark://hostname:port.
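For what it's worth, "local[2]" is itself a valid master URL: it runs Spark in local mode inside the example's own JVM with 2 worker threads, while spark://hostname:port is only used to connect to a standalone cluster. Streaming needs at least 2 local threads so the receiver does not starve the processing. A sketch of the two forms (host name and port are placeholders):

```scala
import org.apache.spark.SparkConf

// Local mode: run in-process with 2 threads -- at least 2 for streaming,
// so the receiver thread and the processing thread can run concurrently.
val localConf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("KafkaWordCount")

// Standalone cluster mode: connect to a running Spark master.
val clusterConf = new SparkConf()
  .setMaster("spark://master01:7077") // placeholder host:port
  .setAppName("KafkaWordCount")
```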
Hi, I'm trying to run the kafka-word-count example in spark2.9.1. I encountered some exceptions when initializing the Kafka consumer/producer config. I'm using Scala 2.10.3 and used the Maven build of the spark-streaming-kafka library that comes with spark2.9.1. Has anyone seen this exception before?
Thanks,
Are you using Spark 0.9.1, which uses Scala 2.10.3?
TD
On Sat, May 3, 2014 at 6:16 PM, Weide Zhang weo...@gmail.com wrote:
Weide Zhang weo...@gmail.com wrote:
Hi Tathagata,
I figured out the reason. I was adding a wrong Kafka lib alongside the version Spark uses. Sorry for spamming.
Weide
On Sat, May 3, 2014 at 7:04 PM, Tathagata Das tathagata.das1...@gmail.com
wrote:
I am a little confused about
Hi, I tried to build the Docker image for Spark 0.9.1 but got the following error. Has anyone had experience resolving this issue?
The following packages have unmet dependencies:
 tzdata-java : Depends: tzdata (= 2012b-1) but 2013g-0ubuntu0.12.04 is to be installed
E: Unable to correct problems, you have held broken packages.
On May 2, 2014 at 5:17 PM, Weide Zhang weo...@gmail.com wrote: