Hi

I have integrated Spark Structured Streaming with Kafka, and I'm listening to 2 topics:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{StringType, StructField, StructType}

def main(args: Array[String]): Unit = {

    // Schema of the JSON payload carried in the Kafka message value
    val schema = StructType(
      List(
        StructField("gatewayId", StringType, true),
        StructField("userId", StringType, true)
      )
    )

    val spark = SparkSession
      .builder
      .master("local[4]")
      .appName("DeviceAutomation")
      .getOrCreate()

    // Single stream subscribed to both topics
    val dfStatus = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "utility-status,utility-critical")
      .option("startingOffsets", "earliest")
      .load()
}
      
Since I have a few more topics to listen to, and I need to perform different
operations on each of them, I would like to move the handling of each topic
into a separate case class for better clarity. Is that possible? A rough
sketch of what I have in mind is below.
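Something along these lines is what I am imagining. This is only a sketch: the
handler names (UtilityStatusHandler, UtilityCriticalHandler) and the process
method are placeholders I made up, and I am assuming the message values are
JSON matching the schema above.

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{col, from_json}
import org.apache.spark.sql.types.StructType

// Placeholder per-topic handlers; each one filters the shared Kafka
// stream down to its own topic and parses the JSON value
case class UtilityStatusHandler(schema: StructType) {
  def process(df: DataFrame): DataFrame =
    df.filter(col("topic") === "utility-status")
      .select(from_json(col("value").cast("string"), schema).as("data"))
      .select("data.*")
}

case class UtilityCriticalHandler(schema: StructType) {
  def process(df: DataFrame): DataFrame =
    df.filter(col("topic") === "utility-critical")
      .select(from_json(col("value").cast("string"), schema).as("data"))
      .select("data.*")
}

// Then inside main, after load():
val statusQuery = UtilityStatusHandler(schema).process(dfStatus)
  .writeStream.format("console").start()

val criticalQuery = UtilityCriticalHandler(schema).process(dfStatus)
  .writeStream.format("console").start()

spark.streams.awaitAnyTermination()

Would this be a reasonable approach, or should each topic get its own
readStream instead?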



