[ https://issues.apache.org/jira/browse/SPARK-2388?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14258597#comment-14258597 ]

Tathagata Das edited comment on SPARK-2388 at 12/25/14 12:26 AM:
-----------------------------------------------------------------

You can always create multiple Kafka streams with different topics assigned to 
them and then transform (map, filter, etc.) them differently. I am closing this 
issue.


was (Author: tdas):
You can always create multiple Kafka streams with different topics assigned to 
them and then transform (map, filter, etc.) them differently. 
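
For reference, a minimal sketch of that workaround, assuming Spark Streaming 1.x 
with the spark-streaming-kafka artifact; the ZooKeeper address, group id, and 
topic names are taken from the report quoted below, while the app name and batch 
interval are placeholders:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    val conf = new SparkConf().setAppName("PerTopicStreams")
    val ssc = new StreamingContext(conf, Seconds(10))

    // One receiver per topic instead of a single receiver with both topics merged.
    val retarget = KafkaUtils.createStream(ssc, "localhost:2181", "logs", Map("retarget" -> 2))
    val datapair = KafkaUtils.createStream(ssc, "localhost:2181", "logs", Map("datapair" -> 2))

    // Each stream is a (key, message) DStream and can be transformed independently.
    val retargetValues = retarget.map(_._2).filter(_.nonEmpty)
    val datapairCounts = datapair.map(_._2).count()

    retargetValues.print()
    datapairCounts.print()

    ssc.start()
    ssc.awaitTermination()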

> Streaming from multiple different Kafka topics is problematic
> -------------------------------------------------------------
>
>                 Key: SPARK-2388
>                 URL: https://issues.apache.org/jira/browse/SPARK-2388
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.0.0
>            Reporter: Sergey
>             Fix For: 1.0.1
>
>
> The default way of creating a stream from a Kafka source would be:
>     val stream = KafkaUtils.createStream(ssc, "localhost:2181", "logs", 
>       Map("retarget" -> 2, "datapair" -> 2))
> However, if the two topics - in this case "retarget" and "datapair" - are very 
> different, there is no way to set up different filter, mapping functions, etc., 
> as they are effectively merged into a single stream.
> However, the instance of KafkaInputDStream created by this call internally 
> invokes ConsumerConnector.createMessageStreams(), which returns a *map* of 
> KafkaStreams keyed by topic. It would be great if this map were exposed 
> somehow, so that the aforementioned call 
>     val streamS = KafkaUtils.createStreamS(...)
> returned a map of streams.
> Regards,
> Sergey Malov
> Collective Media
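
For context, a minimal sketch of the per-topic map that the report refers to, 
assuming the Kafka 0.8 Scala high-level consumer API (plain Kafka, no Spark; the 
ZooKeeper address, group id, and topic names again follow the example above):

    import java.util.Properties
    import kafka.consumer.{Consumer, ConsumerConfig, KafkaStream}

    val props = new Properties()
    props.put("zookeeper.connect", "localhost:2181")
    props.put("group.id", "logs")
    val connector = Consumer.create(new ConsumerConfig(props))

    // createMessageStreams returns a Map keyed by topic; the Int is the number
    // of KafkaStreams (consumer threads) requested per topic.
    val streamsByTopic: Map[String, List[KafkaStream[Array[Byte], Array[Byte]]]] =
      connector.createMessageStreams(Map("retarget" -> 2, "datapair" -> 2))

    // Each topic's streams can be handled independently.
    val retargetStreams = streamsByTopic("retarget")
    val datapairStreams = streamsByTopic("datapair")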


