Hi,

Released latest version of Receiver based Kafka Consumer for Spark Streaming.

Receiver is compatible with Kafka versions 0.8.x, 0.9.x and 0.10.x and all
Spark versions.

Available at Spark Packages : https://spark-packages.org/package/dibbhatt/kafka-spark-consumer
Also at github : https://github.com/dibbhatt/kafka-spark-consumer

Some key features
- Tuned for better
*Sent:* Wednesday, November 05, 2014 2:28 PM
*To:* Shao, Saisai
*Cc:* user@spark.apache.org
*Subject:* Re: Kafka Consumer in Spark Streaming

The Kafka broker definitely has messages coming in. But your #2 point is
valid. Needless to say I am a newbie to Spark. I can't figure out where
the 'executor' logs would be. How would I find them?

> I've following code in my program. I don't get any error, but it's not
> consuming the messages either. Shouldn't the following code print the line
> in the 'call' method? What am I missing?
>
> Please help. Thanks.
>
> JavaStreamingContext ssc = new JavaStreamingContext(sparkConf, new
when running the app, this will help
to define the problem.
Thanks
Jerry
*From:* Something Something [mailto:mailinglist...@gmail.com]
*Sent:* Wednesday, November 05, 2014 1:57 PM
*To:* user@spark.apache.org
*Subject:* Kafka Consumer in Spark Streaming

I've following code in my program. I don't get any error, but it's not
consuming the messages either. Shouldn't the following code print the line
in the 'call' method? What am I missing?
this code only expresses a transformation and so does not actually
cause any action. I think you intend to use foreachRDD.

On Wed, Nov 5, 2014 at 5:57 AM, Something Something
<mailinglist...@gmail.com> wrote:

> I've following code in my program. I don't get any error, but it's not
> consuming the
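[The laziness being described here — a transformation only *describes* work, and nothing runs until an output operation such as print() or foreachRDD() is invoked — is the same behavior plain java.util.stream has, where intermediate operations like map do nothing until a terminal operation runs. A minimal Spark-free sketch of that idea, with a counter to make the (non-)execution visible:]

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class LazyDemo {
    public static void main(String[] args) {
        AtomicInteger calls = new AtomicInteger();

        // 'map' is an intermediate (lazy) operation -- like a DStream
        // transformation, it only describes work to be done.
        Stream<String> mapped = Stream.of("a", "b", "c")
                .map(s -> { calls.incrementAndGet(); return s.toUpperCase(); });

        // Nothing has executed yet -- analogous to a streaming job with
        // no output operation registered.
        System.out.println("calls before terminal op: " + calls.get()); // 0

        // A terminal operation -- like foreachRDD/print in Spark Streaming --
        // forces the whole pipeline to run.
        List<String> out = mapped.collect(Collectors.toList());
        System.out.println("calls after terminal op: " + calls.get()); // 3
        System.out.println(out); // [A, B, C]
    }
}
```

[The analogy is only to the laziness; Spark additionally distributes the work, but the "no terminal/output operation means no execution" rule is the same.]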
Cc: user@spark.apache.org
Subject: Re: Kafka Consumer in Spark Streaming

Added foreach as follows. Still don't see any output on my console.
Would this go to the worker logs as Jerry indicated?

JavaPairReceiverInputDStream tweets = KafkaUtils.createStream(ssc,
mymachine:2181, 1, map
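[For reference, the overall shape such a program needs is roughly the following. This is a hedged sketch, not the poster's actual code: the topic name, group id, and batch interval are placeholders, and it assumes the Spark 1.x receiver-based `spark-streaming-kafka` API where foreachRDD accepts a VoidFunction. It cannot run without a Spark runtime and a ZooKeeper/Kafka setup at the given address.]

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

import scala.Tuple2;

public class KafkaConsumerSketch {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
                .setAppName("KafkaConsumer")
                .setMaster("local[2]"); // at least 2 threads: 1 receiver + 1 worker
        JavaStreamingContext ssc = new JavaStreamingContext(conf, new Duration(2000));

        // Topic -> number of consumer threads ("mytopic" is a placeholder).
        Map<String, Integer> topicMap = new HashMap<>();
        topicMap.put("mytopic", 1);

        JavaPairReceiverInputDStream<String, String> tweets =
                KafkaUtils.createStream(ssc, "mymachine:2181", "my-group", topicMap);

        // foreachRDD is an output operation: it is what actually triggers
        // computation on each batch.
        tweets.foreachRDD(rdd -> {
            for (Tuple2<String, String> record : rdd.collect()) {
                System.out.println(record._2());
            }
        });

        // Without start() the receiver never runs and nothing is ever printed;
        // output from tasks appears in executor/worker logs, not the driver
        // console, unless it is collected to the driver as above.
        ssc.start();
        ssc.awaitTermination();
    }
}
```

[Two things commonly bite here, per Jerry's point: forgetting ssc.start()/awaitTermination(), and looking for task-side println output on the driver console when it goes to the executor logs.]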