As another, more general question: are Spark packages the go-to way of
extending Spark functionality? In my specific use case I would like to start
Spark (be it spark-shell or otherwise) and hook into the listener API.
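For concreteness, what I have in mind is along the lines of the following
(a minimal sketch against the 1.5 listener API; the class name is my own):

    import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}

    // Toy listener that prints job boundaries.
    class JobLoggingListener extends SparkListener {
      override def onJobStart(jobStart: SparkListenerJobStart): Unit =
        println(s"Job ${jobStart.jobId} started with ${jobStart.stageInfos.size} stages")
      override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
        println(s"Job ${jobEnd.jobId} finished: ${jobEnd.jobResult}")
    }

which could then be registered either programmatically with
sc.addSparkListener(new JobLoggingListener), or, if shipped in a package on
the classpath, via --conf spark.extraListeners=JobLoggingListener when
starting spark-shell.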
Since I wasn't able to find much documentation about Spark packages, I was
wondering whether they are still actively being developed.

thanks,
--Jakob

On 10 November 2015 at 14:58, Jakob Odersky <joder...@gmail.com> wrote:

> (an accidental keyboard shortcut sent the message)
> ... spark-shell from the Spark 1.5.2 binary distribution.
> Also, running "spPublishLocal" has the same effect.
>
> thanks,
> --Jakob
>
> On 10 November 2015 at 14:55, Jakob Odersky <joder...@gmail.com> wrote:
>
>> Hi,
>> I ran into an error trying to run spark-shell with an external package
>> that I built and published locally using the spark-package sbt plugin
>> (https://github.com/databricks/sbt-spark-package).
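>>
>> For reference, the build definition of the package looks roughly like
>> this (an illustrative sketch; the spName and sparkVersion values are my
>> best guesses at what the plugin expects):
>>
>>     // build.sbt, with sbt-spark-package enabled in project/plugins.sbt
>>     name := "spark-paperui-server"
>>     organization := "ch.jodersky"
>>     version := "0.1-SNAPSHOT"
>>     scalaVersion := "2.10.5"
>>     spName := "jodersky/spark-paperui"  // package name used by the plugin
>>     sparkVersion := "1.5.2"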
>>
>> To my understanding, Spark packages can be published simply as Maven
>> artifacts, yet after running "publishLocal" in my package project (
>> https://github.com/jodersky/spark-paperui), the following command
>>
>>    spark-shell --packages
>> ch.jodersky:spark-paperui-server_2.10:0.1-SNAPSHOT
>>
>> gives an error:
>>
>>         ::::::::::::::::::::::::::::::::::::::::::::::
>>
>>         ::          UNRESOLVED DEPENDENCIES         ::
>>
>>         ::::::::::::::::::::::::::::::::::::::::::::::
>>
>>         :: ch.jodersky#spark-paperui-server_2.10;0.1: configuration not
>> found in ch.jodersky#spark-paperui-server_2.10;0.1: 'default'. It was
>> required from org.apache.spark#spark-submit-parent;1.0 default
>>
>>         ::::::::::::::::::::::::::::::::::::::::::::::
>>
>>
>> :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>> Exception in thread "main" java.lang.RuntimeException: [unresolved
>> dependency: ch.jodersky#spark-paperui-server_2.10;0.1: configuration not
>> found in ch.jodersky#spark-paperui-server_2.10;0.1: 'default'. It was
>> required from org.apache.spark#spark-submit-parent;1.0 default]
>>     at
>> org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1011)
>>     at
>> org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
>>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:12
>>
>> Do I need to include some default configuration? If so, where and how
>> should I do it? None of the other packages I looked at had any such thing.
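>>
>> My current guess is that the locally published ivy.xml lacks a 'default'
>> configuration, which the resolver in spark-submit then fails to map. If
>> that is the cause, publishing a plain Maven POM instead might work around
>> it, e.g. (an untested guess on my part):
>>
>>     // build.sbt: publish a POM rather than an ivy.xml
>>     publishMavenStyle := true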
>>
>> Btw, I am using spark-shell from a
>>
>>
>
