Hello Ashok, 



We're consuming from more than 10 topics in some Spark Streaming applications.
Topic management is a concern (what is read from where, etc.), but I have seen
no issues from Spark itself.
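
For what it's worth, the topic list is just a Set[String] passed to
createDirectStream, so nothing changes in the code as you add topics. Here is a
minimal sketch along the lines of your snippet below (the topic names and broker
address are made up):

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    // Hypothetical topic names; the set can hold as many topics as you need.
    val topics = Set("logs", "metrics", "alerts", "audit", "clicks")
    val kafkaParams = Map[String, String]("metadata.broker.list" -> "localhost:9092")

    val sparkConf = new SparkConf().setAppName("MultiTopicExample").setMaster("local[*]")
    val ssc = new StreamingContext(sparkConf, Seconds(30))

    // One direct stream subscribed to all of the topics at once.
    val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)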




Regards,
Bryan Jeffrey

On Mon, Jun 19, 2017 at 3:24 PM -0400, "Ashok Kumar"
<ashok34...@yahoo.com.invalid> wrote:

Thank you.

In the following example:

    import kafka.serializer.StringDecoder
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    val topics = "test1,test2,test3"
    val brokers = "localhost:9092"
    val topicsSet = topics.split(",").toSet
    val sparkConf = new SparkConf().setAppName("KafkaDroneCalc").setMaster("local") // spark://localhost:7077
    val sc = new SparkContext(sparkConf)
    val ssc = new StreamingContext(sc, Seconds(30))
    val kafkaParams = Map[String, String]("metadata.broker.list" -> brokers)
    val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topicsSet)

Is it possible to have three topics, or many topics?

 

    On Monday, 19 June 2017, 20:10, Michael Armbrust <mich...@databricks.com> 
wrote:
  

I don't think that there is really a Spark-specific limit here. It would be a
function of the size of your Spark / Kafka clusters and the type of processing
you are trying to do.
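
For example, with the direct stream approach each Kafka partition maps to one
Spark partition, so per-batch parallelism is driven by the total number of
partitions across all subscribed topics rather than by the topic count. An
illustrative way to see that, assuming a "messages" DStream created with
createDirectStream as in the example earlier in this thread:

    // Prints the number of Kafka partitions feeding each batch,
    // summed across every subscribed topic.
    messages.foreachRDD { rdd =>
      println(s"partitions in this batch: ${rdd.getNumPartitions}")
    }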
On Mon, Jun 19, 2017 at 12:00 PM, Ashok Kumar <ashok34...@yahoo.com.invalid> 
wrote:
Hi Gurus,

Within one Spark Streaming process, how many topics can be handled? I have not
tried more than one topic.

Thanks


     



