Hello,

I have tried multiple different settings in build.sbt, but nothing seems
to work.
Can anyone suggest the right syntax/way to include Kafka with Spark Streaming?

Error
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/spark/streaming/kafka/KafkaUtils$

build.sbt
libraryDependencies += "org.apache.hbase" % "hbase" % "0.92.1"
libraryDependencies += "org.apache.hadoop" % "hadoop-core" % "1.0.2"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.0.0"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming_2.10" % "1.5.2",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.5.2",
  "org.apache.spark" %% "spark-streaming" % "1.5.2" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.5.2" % "provided"
)
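From what I understand, the versions should all match and the Kafka
integration jar has to end up inside the application jar (Kafka classes
are not part of the Spark distribution, so "provided" would explain the
NoClassDefFoundError). Is something like the following the correct form?
This is just my guess, assuming Scala 2.10 and Spark 1.5.2 throughout:

```scala
// build.sbt -- one consistent version set (sketch, assuming Scala 2.10 / Spark 1.5.2)
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  // The cluster supplies spark-streaming at runtime, so "provided" is fine here
  "org.apache.spark" %% "spark-streaming" % "1.5.2" % "provided",
  // spark-streaming-kafka is NOT shipped with Spark, so it must be packaged
  // into the application jar (no "provided" scope)
  "org.apache.spark" %% "spark-streaming-kafka" % "1.5.2"
)
```

I gather one would then build a fat jar (e.g. with sbt-assembly) so that
the Kafka classes are on the classpath when spark-submit runs the job.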


Thanks,
Vinti
