Hi all,
    My project uses the spark-streaming-kafka module. When I migrated Spark
from 1.6.0 to 2.0.0 and rebuilt the project, I ran into the error below:

[warn]     module not found: org.apache.spark#spark-streaming-kafka_2.11;2.0.0
[warn] ==== local: tried
[warn]   /home/linker/.ivy2/local/org.apache.spark/spark-streaming-kafka_2.11/2.0.0/ivys/ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-streaming-kafka_2.11/2.0.0/spark-streaming-kafka_2.11-2.0.0.pom
[warn] ==== Akka Repository: tried
[warn]   http://repo.akka.io/releases/org/apache/spark/spark-streaming-kafka_2.11/2.0.0/spark-streaming-kafka_2.11-2.0.0.pom
[warn] ==== sonatype-public: tried
[warn]   https://oss.sonatype.org/content/repositories/public/org/apache/spark/spark-streaming-kafka_2.11/2.0.0/spark-streaming-kafka_2.11-2.0.0.pom
[info] Resolving jline#jline;2.12.1 ...
[warn]     ::::::::::::::::::::::::::::::::::::::::::::::
[warn]     ::          UNRESOLVED DEPENDENCIES         ::
[warn]     ::::::::::::::::::::::::::::::::::::::::::::::
[warn]     :: org.apache.spark#spark-streaming-kafka_2.11;2.0.0: not found
[warn]     ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]     Note: Unresolved dependencies path:
[warn]         org.apache.spark:spark-streaming-kafka_2.11:2.0.0 (/home/linker/workspace/linkerwp/linkerStreaming/build.sbt#L12-23)
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-streaming-kafka_2.11;2.0.0: not found
    at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:313)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:191)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:168)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:156)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:156)
    at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:133)
    at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:57)
    at sbt.IvySbt$$anon$4.call(Ivy.scala:65)
    at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
    at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
    at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
    at xsbt.boot.Using$.withResource(Using.scala:10)
    at xsbt.boot.Using$.apply(Using.scala:9)
    at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
    at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
    at xsbt.boot.Locks$.apply0(Locks.scala:31)
    at xsbt.boot.Locks$.apply(Locks.scala:28)
    at sbt.IvySbt.withDefaultLogger(Ivy.scala:65)
    at sbt.IvySbt.withIvy(Ivy.scala:128)
    at sbt.IvySbt.withIvy(Ivy.scala:125)
    at sbt.IvySbt$Module.withModule(Ivy.scala:156)
    at sbt.IvyActions$.updateEither(IvyActions.scala:168)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1442)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1438)
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$90.apply(Defaults.scala:1473)
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$90.apply(Defaults.scala:1471)
    at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:37)
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1476)
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1470)
    at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:60)
    at sbt.Classpaths$.cachedUpdate(Defaults.scala:1493)
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1420)
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1372)
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:237)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-streaming-kafka_2.11;2.0.0: not found
[error] Total time: 8 s, completed Dec 13, 2016 3:16:37 PM


My dependencies in build.sbt are:

 libraryDependencies ++= Seq(
   // Spark dependencies
   "com.eaio.uuid" % "uuid" % "3.2",
   "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
   "org.apache.spark" %% "spark-sql" % "2.0.0" % "provided",
   "org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided",
   "org.apache.spark" %% "spark-streaming-kafka" % "2.0.0",
   "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0-M2",
   // Third-party libraries
   "com.github.scopt" %% "scopt" % "3.4.0"
 )
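
From the Spark 2.0 documentation it looks like the Kafka integration was
split into separate artifacts by Kafka broker version, so I suspect the
spark-streaming-kafka line needs to change to one of the following (just my
guess, not yet verified against our cluster):

 libraryDependencies ++= Seq(
   // Spark 2.x artifact for Kafka 0.8.2.1+ brokers
   "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
   // or, for Kafka 0.10.0+ brokers:
   // "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.0.0"
 )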


I want to know what's wrong. Does spark-streaming-kafka not support Spark
2.x?
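
If the renamed artifact is the right replacement, I gather the consumer API
moves to a new package as well. Here is a minimal sketch of what I believe
the 0-10 direct stream looks like (broker list, group id, and topic name are
placeholders I made up):

 import org.apache.kafka.common.serialization.StringDeserializer
 import org.apache.spark.SparkConf
 import org.apache.spark.streaming.{Seconds, StreamingContext}
 import org.apache.spark.streaming.kafka010.KafkaUtils
 import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
 import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

 val conf = new SparkConf().setAppName("Kafka010Sketch")
 val ssc = new StreamingContext(conf, Seconds(5))

 // Placeholder connection settings -- adjust for the real cluster.
 val kafkaParams = Map[String, Object](
   "bootstrap.servers" -> "localhost:9092",
   "key.deserializer" -> classOf[StringDeserializer],
   "value.deserializer" -> classOf[StringDeserializer],
   "group.id" -> "example-group",
   "auto.offset.reset" -> "latest",
   "enable.auto.commit" -> (false: java.lang.Boolean)
 )

 // Note the new package name: org.apache.spark.streaming.kafka010.
 val stream = KafkaUtils.createDirectStream[String, String](
   ssc, PreferConsistent, Subscribe[String, String](Seq("example-topic"), kafkaParams))

 stream.map(record => (record.key, record.value)).print()
 ssc.start()
 ssc.awaitTermination()

Is this the intended migration path?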

-- 
Thanks & Best Regards
卢文泉 | Adolph Lu
TEL:+86 15651006559
Linker Networks(http://www.linkernetworks.com/)
