Hi Jakob,
> As another, general question, are spark packages the go-to way of
extending spark functionality?

Definitely. There are ~150 Spark Packages out there on spark-packages.org.
I use a lot of them in my everyday Spark work.
The number of released packages has increased at a steady rate over the last
few months.

> Since I wasn't able to find much documentation about spark packages, I
was wondering if they are still actively being developed?

I would love to work on the documentation. Some exists on spark-packages.org,
but there could be a lot more. If you have any specific questions, feel
free to send them to me directly, and I'll incorporate them into a FAQ I'm
working on.

Regarding your initial problem: unfortunately `spPublishLocal` is currently
broken due to Ivy configuration mismatches between Spark and the
sbt-spark-package plugin. What you can do instead is publish to your local
Maven repository, which `--packages` also resolves against:
```
$ sbt +spark-paperui-server/publishM2
$ spark-shell --packages ch.jodersky:spark-paperui-server_2.10:0.1-SNAPSHOT
```
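
If the local Maven route still gives you trouble, a blunt fallback (just a sketch; the jar path below assumes sbt's default output layout for your project and Scala version, so adjust it to whatever `package` actually produces) is to skip dependency resolution and hand the jar to spark-shell directly:
```
# sketch: build a plain jar and pass it to spark-shell via --jars
# (the path is an assumed default sbt location, not verified against your build)
$ sbt +spark-paperui-server/package
$ spark-shell --jars spark-paperui-server/target/scala-2.10/spark-paperui-server_2.10-0.1-SNAPSHOT.jar
```
Note that `--jars` doesn't pull in transitive dependencies the way `--packages` does, so this only works if the package has no extra runtime dependencies (or you add them to the classpath yourself).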

Hopefully that works for you.

Best,
Burak


On Wed, Nov 11, 2015 at 10:53 AM, Jakob Odersky <joder...@gmail.com> wrote:

> As another, general question, are spark packages the go-to way of
> extending spark functionality? In my specific use-case I would like to
> start spark (be it spark-shell or other) and hook into the listener API.
> Since I wasn't able to find much documentation about spark packages, I was
> wondering if they are still actively being developed?
>
> thanks,
> --Jakob
>
> On 10 November 2015 at 14:58, Jakob Odersky <joder...@gmail.com> wrote:
>
>> (accidental keyboard-shortcut sent the message)
>> ... spark-shell from the spark 1.5.2 binary distribution.
>> Also, running "spPublishLocal" has the same effect.
>>
>> thanks,
>> --Jakob
>>
>> On 10 November 2015 at 14:55, Jakob Odersky <joder...@gmail.com> wrote:
>>
>>> Hi,
>>> I ran into in error trying to run spark-shell with an external package
>>> that I built and published locally
>>> using the spark-package sbt plugin (
>>> https://github.com/databricks/sbt-spark-package).
>>>
>>> To my understanding, spark packages can be published simply as maven
>>> artifacts, yet after running "publishLocal" in my package project (
>>> https://github.com/jodersky/spark-paperui), the following command
>>>
>>>    spark-shell --packages
>>> ch.jodersky:spark-paperui-server_2.10:0.1-SNAPSHOT
>>>
>>> gives an error:
>>>
>>>         ::::::::::::::::::::::::::::::::::::::::::::::
>>>
>>>         ::          UNRESOLVED DEPENDENCIES         ::
>>>
>>>         ::::::::::::::::::::::::::::::::::::::::::::::
>>>
>>>         :: ch.jodersky#spark-paperui-server_2.10;0.1: configuration not
>>> found in ch.jodersky#spark-paperui-server_2.10;0.1: 'default'. It was
>>> required from org.apache.spark#spark-submit-parent;1.0 default
>>>
>>>         ::::::::::::::::::::::::::::::::::::::::::::::
>>>
>>>
>>> :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>> Exception in thread "main" java.lang.RuntimeException: [unresolved
>>> dependency: ch.jodersky#spark-paperui-server_2.10;0.1: configuration not
>>> found in ch.jodersky#spark-paperui-server_2.10;0.1: 'default'. It was
>>> required from org.apache.spark#spark-submit-parent;1.0 default]
>>>     at
>>> org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1011)
>>>     at
>>> org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
>>>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
>>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:12
>>>
>>> Do I need to include some default configuration? If so where and how
>>> should I do it? All other packages I looked at had no such thing.
>>>
>>> Btw, I am using spark-shell from a
>>>
>>>
>>
>
