I am developing an application that calls into the Spark MLlib code I am
working on (LDA). To do so, I link against a locally built Spark in the
application, running sbt assembly/publish-local in the Spark directory.
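For context, the Spark dependencies in my build.sbt (the lines 15-26 that the
resolution log points at below) look roughly like this sketch; the
organization, module names, and versions match the log, everything else is
approximate:

name := "nips-lda"

organization := "edu.berkeley.cs.amplab"

version := "0.1"

scalaVersion := "2.10.4"  // some 2.10.x release; the log only shows the _2.10 suffix

// Spark modules are resolved from the 1.3.0-SNAPSHOT artifacts published to ~/.ivy2/local
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.3.0-SNAPSHOT",
  "org.apache.spark" %% "spark-mllib" % "1.3.0-SNAPSHOT"
)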
When I run sbt assembly in my application I get the following error:
$ sbt assembly
[info] Loading global plugins from /Users/pedro/.sbt/0.13/plugins
[info] Loading project definition from /Users/pedro/Documents/Code/nips-lda/project
[info] Set current project to nips-lda (in build file:/Users/pedro/Documents/Code/nips-lda/)
[info] Updating {file:/Users/pedro/Documents/Code/nips-lda/}nips-lda...
[info] Resolving org.apache.spark#spark-network-common_2.10;1.3.0-SNAPSHOT ...
[warn] module not found: org.apache.spark#spark-network-common_2.10;1.3.0-SNAPSHOT
[warn] ==== local: tried
[warn]   /Users/pedro/.ivy2/local/org.apache.spark/spark-network-common_2.10/1.3.0-SNAPSHOT/ivys/ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-network-common_2.10/1.3.0-SNAPSHOT/spark-network-common_2.10-1.3.0-SNAPSHOT.pom
[warn] ==== Typesafe: tried
[warn]   http://repo.typesafe.com/typesafe/releases/org/apache/spark/spark-network-common_2.10/1.3.0-SNAPSHOT/spark-network-common_2.10-1.3.0-SNAPSHOT.pom
[warn] ==== Spray: tried
[warn]   http://repo.spray.cc/org/apache/spark/spark-network-common_2.10/1.3.0-SNAPSHOT/spark-network-common_2.10-1.3.0-SNAPSHOT.pom
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-network-common_2.10;1.3.0-SNAPSHOT: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] org.apache.spark:spark-network-common_2.10:1.3.0-SNAPSHOT
[warn] +- org.apache.spark:spark-network-shuffle_2.10:1.3.0-SNAPSHOT
[warn] +- org.apache.spark:spark-core_2.10:1.3.0-SNAPSHOT (/Users/pedro/Documents/Code/nips-lda/build.sbt#L15-26)
[warn] +- org.apache.spark:spark-catalyst_2.10:1.3.0-SNAPSHOT
[warn] +- org.apache.spark:spark-sql_2.10:1.3.0-SNAPSHOT
[warn] +- org.apache.spark:spark-mllib_2.10:1.3.0-SNAPSHOT (/Users/pedro/Documents/Code/nips-lda/build.sbt#L15-26)
[warn] +- edu.berkeley.cs.amplab:nips-lda_2.10:0.1
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.3.0-SNAPSHOT: not found
at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:243)
at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:158)
Inspecting the sbt publish-local logs and ~/.ivy2/local/org.apache.spark, it
looks like spark-network-common is not getting published, which causes this
error at build time. I rebased on upstream/master recently, and it looks like
this module was added recently. Inspecting the pom.xml files in Spark, the
submodules look to be in order, and the same goes for project/SparkBuild.scala.
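In case it helps, here is roughly how I am checking this and the workaround I
will try next; the "network-common" module id is an assumption on my part (sbt
projects in the Spark checkout lists the real ids):

$ ls ~/.ivy2/local/org.apache.spark/
# the other spark-*_2.10 modules are there, but spark-network-common_2.10 is missing

$ cd /path/to/spark
$ sbt projects                                 # list the sbt module ids
$ sbt "project network-common" publish-local   # try publishing just that module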
Any suggestions on how to fix this?
P.S. Is javadoc compilation broken at the moment?
--
Pedro