As another, general question: are spark packages the go-to way of extending
spark functionality? In my specific use case I would like to start spark
(be it spark-shell or another entry point) and hook into the listener API.
Since I wasn't able to find much documentation about spark packages, I was
wondering if
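For context, hooking into the listener API usually means registering a SparkListener with the SparkContext. A minimal sketch (the listener and job below are illustrative; the class and method names are from Spark's public scheduler API):

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}
import org.apache.spark.{SparkConf, SparkContext}

// A listener that logs job lifecycle events.
class LoggingListener extends SparkListener {
  override def onJobStart(jobStart: SparkListenerJobStart): Unit =
    println(s"Job ${jobStart.jobId} started with ${jobStart.stageInfos.size} stage(s)")
  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    println(s"Job ${jobEnd.jobId} ended: ${jobEnd.jobResult}")
}

val sc = new SparkContext(new SparkConf().setAppName("listener-demo").setMaster("local[*]"))
sc.addSparkListener(new LoggingListener)
sc.parallelize(1 to 10).count()  // triggers a job, which the listener observes
sc.stop()
```

In spark-shell the same registration works against the pre-built `sc`, since addSparkListener is available on any SparkContext.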
Hi Jakob,
> As another, general question, are spark packages the go-to way of
extending spark functionality?
Definitely. There are ~150 Spark Packages out there on spark-packages.org.
I use a lot of them in everyday Spark work.
The number of released packages has steadily increased over
(accidental keyboard-shortcut sent the message)
... spark-shell from the Spark 1.5.2 binary distribution.
Also, running "spPublishLocal" has the same effect.
thanks,
--Jakob
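For reference, the workflow being described is roughly the following (the package coordinates are placeholders, not the actual package in question; spPublishLocal is the sbt-spark-package plugin's local-publish task):

```shell
# Publish the package to the local repository via the sbt-spark-package plugin
sbt spPublishLocal

# Launch spark-shell with the locally published artifact on the classpath
# (groupId:artifactId:version below is an illustrative coordinate)
./bin/spark-shell --packages com.example:my-spark-package:0.1.0-SNAPSHOT
```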
On 10 November 2015 at 14:55, Jakob Odersky wrote:
> Hi,
> I ran into an error trying to run
Hi,
I ran into an error trying to run spark-shell with an external package that
I built and published locally
using the spark-package sbt plugin (
https://github.com/databricks/sbt-spark-package).
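For context, a package built with that plugin is declared in the sbt build roughly as follows (all values here are placeholders; `spName`, `sparkVersion`, and `sparkComponents` are settings provided by sbt-spark-package):

```scala
// build.sbt -- assumes the sbt-spark-package plugin is enabled in project/plugins.sbt
name := "my-spark-package"
version := "0.1.0-SNAPSHOT"
scalaVersion := "2.10.5"

// settings from the sbt-spark-package plugin
spName := "myorg/my-spark-package"   // the org/repo name on spark-packages.org
sparkVersion := "1.5.2"              // Spark version to build against
sparkComponents += "core"            // pulls in spark-core as a provided dependency
```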
To my understanding, spark packages can be published simply as maven
artifacts, yet after running