Try removing the line libraryDependencies += "org.apache.kafka" %% "kafka" %
"1.6.0" % "compile".
I guess the internal dependencies are automatically pulled in when you add
spark-streaming-kafka_2.10.
Also try changing the version to 1.6.1 or lower, just to see if the links
are broken.
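For reference, a minimal build.sbt sketch of that idea (the version numbers here are illustrative, not confirmed from the original build): declare only the Spark artifacts and let sbt resolve the matching Kafka client transitively instead of declaring org.apache.kafka:kafka yourself:

```scala
// build.sbt (sketch, assumed versions): no explicit kafka dependency --
// spark-streaming-kafka pulls in a compatible org.apache.kafka:kafka itself.
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming"       % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.0"
)
```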
Regards,
Haroon Syed
Hi,
You can keep a custom properties file with Map-like entries, i.e. key/value
pairs such as "URL" -> "IPaddress:port/user/", and put the file on HDFS or
any location Spark can access. Read the file as an RDD, convert it to a Map,
and look up the values in your program.
You can also broadcast the Map in the program if you need it on the executors.
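A minimal sketch of that approach, assuming a properties file on HDFS with one key=value pair per line (the path, separator, and key names below are placeholders, not from the original thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object PropsExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("props-example"))

    // Read the properties file as an RDD of lines, parse "key=value"
    // pairs, and collect them into a Map on the driver.
    val props: Map[String, String] = sc
      .textFile("hdfs:///path/to/app.properties") // placeholder path
      .map(_.split("=", 2))
      .filter(_.length == 2)
      .map(parts => (parts(0).trim, parts(1).trim))
      .collect()
      .toMap

    // Broadcast the Map so every executor gets one read-only copy
    // instead of reshipping it with each task.
    val bProps = sc.broadcast(props)
    println(bProps.value.get("URL")) // placeholder key

    sc.stop()
  }
}
```

Broadcasting is only worthwhile if the map is referenced from many tasks; for a handful of values a plain driver-side Map captured in the closure works too.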
To add to Mich, I put the build.sbt under the MyProject root folder:
MyProject/build.sbt
and the assembly.sbt is placed in the folder called "project" under the
MyProject folder:
MyProject/project/assembly.sbt
Also, the first line in build.sbt imports the assembly keys, as
below:
import
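For context, a sketch of the layout described above, assuming the older sbt-assembly plugin (the plugin version is illustrative):

```scala
// MyProject/project/assembly.sbt -- registers the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// MyProject/build.sbt -- older sbt-assembly releases also needed the
// assembly keys imported at the very top of the file:
//   import AssemblyKeys._
//   assemblySettings
```

Note that newer sbt-assembly releases are auto plugins, so the explicit key import at the top of build.sbt is no longer required there.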