Are there any workarounds for this? Seems to be a dead end so far.
Hi all!
We've been using the sbt-pack sbt plugin
(https://github.com/xerial/sbt-pack) to build our standalone Spark
application for a while now. Until Spark 1.0.0, that worked nicely.
For those who don't know the sbt-pack plugin: it basically copies all the
dependency JARs from your build's classpath into a single directory
(target/pack/lib) and generates launch scripts for your main classes.
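For reference, a minimal setup looks roughly like this (the plugin
version, project name, and main class below are placeholders, so adjust
them to your project):

    // project/plugins.sbt -- add the sbt-pack plugin
    addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.6.1")  // version is illustrative

    // build.sbt -- a hypothetical standalone Spark app
    import xerial.sbt.Pack._

    name := "my-spark-app"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"

    // brings in the `pack` task: it copies every dependency jar into
    // target/pack/lib and writes launch scripts to target/pack/bin
    packSettings

    // map of launch-script name -> main class (hypothetical names)
    packMain := Map("my-spark-app" -> "com.example.Main")

Running `sbt pack` then produces a self-contained target/pack directory
you can ship as-is.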
One potential issue here is that Mesos is now publishing its jars
using classifiers. It might be that sbt-pack has trouble with
dependencies that are published with classifiers. I'm pretty sure
Mesos is the only dependency in Spark that uses classifiers, which is
why I mention it.
The relevant line in SparkBuild.scala:
https://github.com/apache/spark/blob/master/project/SparkBuild.scala#L350
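For context, the linked line declares the Mesos dependency with a
classifier, roughly along these lines (the exact version is from memory
and may differ between Spark releases):

    // A classifier selects a specific artifact variant of a module;
    // tools that resolve only the default artifact may miss it entirely.
    "org.apache.mesos" % "mesos" % "0.18.1" classifier "shaded-protobuf"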
On Sun, Jun 1, 2014 at 11:03 AM, Patrick Wendell pwend...@gmail.com wrote:
> One potential issue here is that Mesos is now publishing its jars
> using classifiers. It might be that sbt-pack has trouble with
> dependencies that are published with classifiers.
You're right, Patrick!
Just had a chat with the sbt-pack creator, and indeed dependencies with
classifiers are ignored to avoid problems with a dirty cache...
This should be fixed in the next version of the plugin.
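In the meantime, a possible workaround (my own guess, not something
confirmed in this thread) is to drop the classified Mesos artifact from
your app if you don't actually deploy on Mesos, so sbt-pack never has to
handle a classifier at all:

    // build.sbt -- exclude the transitive, classifier-published Mesos jar.
    // Only safe if the application never runs against a Mesos cluster.
    libraryDependencies +=
      "org.apache.spark" %% "spark-core" % "1.0.0" exclude("org.apache.mesos", "mesos")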
Cheers
Pierre
Message sent from a mobile device - excuse typos and abbreviations