Hi,

For testing, you could also just use the Kafka 0.7.2 console consumer, pipe
its output to netcat (nc), and process that in Spark Streaming as in this example:

https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/NetworkWordCount.scala
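
Roughly, that looks like the sketch below. The topic name, port, and script
path are just placeholders for whatever your setup uses, and the console
consumer flags are from memory, so check them against your install:

    // On the Kafka 0.7.2 box, pipe the console consumer into a listening netcat:
    //   bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic mytopic | nc -lk 9999
    import org.apache.spark.SparkConf
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object KafkaViaNetcatWordCount {
      def main(args: Array[String]) {
        val conf = new SparkConf().setAppName("KafkaViaNetcatWordCount")
        val ssc = new StreamingContext(conf, Seconds(2))

        // Spark connects to the netcat listener and treats each line as a record
        val lines = ssc.socketTextStream("localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER)
        val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
        counts.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }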

That approach worked for me. Backporting the Spark Streaming Kafka receiver
to the older Kafka version seems tricky due to all the protocol changes.
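
If you do end up porting the receiver as TD suggests below, the custom
receiver API in 1.0.x gives you roughly this shape. I've left the actual
Kafka 0.7.2 consumer calls as comments, since I haven't checked them against
the old client:

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.receiver.Receiver

    // Skeleton only: the 0.7.2 consumer wiring still has to be filled in,
    // using the 0.8.1 KafkaInputDStream linked below as a guide.
    class Kafka07Receiver(zkConnect: String, groupId: String, topic: String)
      extends Receiver[String](StorageLevel.MEMORY_AND_DISK_SER_2) {

      def onStart() {
        // Receive data on a separate thread so onStart() returns immediately
        new Thread("Kafka 0.7.2 receiver") {
          override def run() { receive() }
        }.start()
      }

      def onStop() {
        // Shut down the consumer connector created in receive()
      }

      private def receive() {
        // TODO: build a Kafka 0.7.2 consumer connector from zkConnect/groupId,
        // open a message stream for `topic`, and call store(...) on each
        // decoded message.
      }
    }

    // Plugged in with: ssc.receiverStream(new Kafka07Receiver(zk, group, topic))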

Andre

On 07/26/2014 12:56 AM, Tathagata Das wrote:
> Spark Streaming is built as part of the whole Spark repository. Hence
> follow Spark's building instructions
> <http://spark.apache.org/docs/latest/building-with-maven.html> to build
> Spark Streaming along with Spark.
> Spark Streaming 0.8.1 was built with Kafka 0.7.2, so you can take a look at
> that version. If necessary, I recommend modifying the current Kafka receiver
> based on the 0.8.1 Kafka receiver
> <https://github.com/apache/spark/blob/v0.8.1-incubating/streaming/src/main/scala/org/apache/spark/streaming/dstream/KafkaInputDStream.scala>
> 
> TD
> 
> 
> On Fri, Jul 25, 2014 at 10:16 AM, maddenpj <madde...@gmail.com> wrote:
> 
>> Hi all,
>>
>> Currently we have Kafka 0.7.2 running in production and can't upgrade for
>> external reasons; however, Spark Streaming (1.0.1) was built with Kafka
>> 0.8.0. What is the best way to use Spark Streaming with older versions of
>> Kafka? Currently I'm investigating building Spark Streaming myself, but I
>> can't find any documentation specifically for building Spark Streaming.
>>
>>
>>
> 
