(An accidental keyboard shortcut sent the previous message early. To finish the sentence: I am using spark-shell from the Spark 1.5.2 binary distribution.)
Also, publishing with the plugin's own task, "sbt spPublishLocal", instead of plain "publishLocal" has the same effect.

thanks,
--Jakob

On 10 November 2015 at 14:55, Jakob Odersky <joder...@gmail.com> wrote:

> Hi,
> I ran into in error trying to run spark-shell with an external package
> that I built and published locally
> using the spark-package sbt plugin (
> https://github.com/databricks/sbt-spark-package).
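>
> For context, the build definition is a fairly minimal sbt-spark-package
> setup, roughly along these lines (a sketch from memory, not the exact
> files; the plugin version and resolver are whatever the plugin's README
> suggests, so treat them as placeholders):
>
>     // project/plugins.sbt
>     resolvers += "Spark Packages repo" at
>       "https://dl.bintray.com/spark-packages/maven/"
>     addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.3")
>
>     // build.sbt
>     organization := "ch.jodersky"
>     name := "spark-paperui-server"
>     version := "0.1-SNAPSHOT"
>     scalaVersion := "2.10.6"
>     // keys below are defined by the sbt-spark-package plugin
>     spName := "jodersky/spark-paperui"
>     sparkVersion := "1.5.2"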
>
> To my understanding, Spark packages can be published simply as Maven
> artifacts, yet after running "publishLocal" in my package project
> (https://github.com/jodersky/spark-paperui), the following command
>
>    spark-shell --packages ch.jodersky:spark-paperui-server_2.10:0.1-SNAPSHOT
>
> gives an error:
>
>         ::::::::::::::::::::::::::::::::::::::::::::::
>
>         ::          UNRESOLVED DEPENDENCIES         ::
>
>         ::::::::::::::::::::::::::::::::::::::::::::::
>
>         :: ch.jodersky#spark-paperui-server_2.10;0.1: configuration not found in ch.jodersky#spark-paperui-server_2.10;0.1: 'default'. It was required from org.apache.spark#spark-submit-parent;1.0 default
>
>         ::::::::::::::::::::::::::::::::::::::::::::::
>
>
> :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
> Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: ch.jodersky#spark-paperui-server_2.10;0.1: configuration not found in ch.jodersky#spark-paperui-server_2.10;0.1: 'default'. It was required from org.apache.spark#spark-submit-parent;1.0 default]
>     at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1011)
>     at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:12
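>
> If I read the error right (my own interpretation, so take it with a
> grain of salt), spark-submit resolves the coordinate through Ivy and
> asks for the 'default' configuration, and the ivy.xml published for my
> artifact apparently does not declare one. For comparison, an ivy.xml
> from a plain sbt publishLocal normally contains a configurations
> section along these lines (a sketch of a typical file, not mine):
>
>     <configurations>
>       <conf name="compile" visibility="public"/>
>       <conf name="runtime" visibility="public" extends="compile"/>
>       <conf name="default" visibility="public" extends="runtime"/>
>     </configurations>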
>
> Do I need to declare some default configuration? If so, where and how
> should I do it? None of the other packages I looked at seem to include
> anything like that.
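>
> One workaround I plan to try (a guess on my part, untested): publish
> Maven-style, so the artifact carries a POM instead of an ivy.xml, since
> POM-based modules always expose a 'default' configuration to Ivy:
>
>     // build.sbt -- possible workaround, untested
>     publishMavenStyle := true
>
> and then rerun publishLocal, or alternatively publish straight into the
> local Maven repository with sbt's built-in publishM2 task, which, as far
> as I can tell, spark-submit also checks when resolving --packages.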
>
> Btw, I am using spark-shell from a
