You can use the statement below to subscribe to multiple topics:

    val dfStatus = spark.readStream
      .format("kafka")
      .option("subscribe", "utility-status, utility-critical")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("startingOffsets", "earliest")
      .load()
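If you then want to handle each topic with different logic, one option is to split the single stream on the `topic` column that the Kafka source adds to every row (alongside key, value, partition, offset, and timestamp). A minimal sketch, assuming the `dfStatus` stream above; the helper name `streamForTopic` and the usage names are illustrative, not a standard API:

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// Hypothetical helper: route one multi-topic Kafka stream into
// per-topic DataFrames by filtering on the source's "topic" column.
def streamForTopic(df: DataFrame, topic: String): DataFrame =
  df.filter(col("topic") === topic)

// Usage (illustrative):
// val statusDf   = streamForTopic(dfStatus, "utility-status")
// val criticalDf = streamForTopic(dfStatus, "utility-critical")
// Each DataFrame can then be parsed with its own schema / case class
// and written to its own sink.
```

Each filtered stream can carry its own deserialization and query, which gives you the per-topic separation you are after without opening a separate Kafka reader per topic.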
On Mon, Sep 17, 2018 at 3:28 AM sivaprakash <sivaprakashshanmu...@gmail.com> wrote:

> Hi
>
> I have integrated Spark Streaming with Kafka, in which I'm listening to 2 topics:
>
>   def main(args: Array[String]): Unit = {
>
>     val schema = StructType(
>       List(
>         StructField("gatewayId", StringType, true),
>         StructField("userId", StringType, true)
>       )
>     )
>
>     val spark = SparkSession
>       .builder
>       .master("local[4]")
>       .appName("DeviceAutomation")
>       .getOrCreate()
>
>     val dfStatus = spark.readStream
>       .format("kafka")
>       .option("subscribe", "utility-status, utility-critical")
>       .option("kafka.bootstrap.servers", "localhost:9092")
>       .option("startingOffsets", "earliest")
>       .load()
>   }
>
> Since I have a few more topics to listen to, each performing different operations, I would like to move each topic into a separate case class for better clarity. Is that possible?
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

--
Thanks,
Naresh
www.linkedin.com/in/naresh-dulam
http://hadoopandspark.blogspot.com/