On Fri, Jul 15, 2016 at 10:48 AM, Jacek Laskowski <ja...@japila.pl> wrote:

> +1000
>
> Thanks Ismael for bringing this up! I meant to send it earlier too,
> since I've been struggling with an sbt-based Scala project for a Spark
> package myself this week and haven't yet found out how to do local
> publishing.
>
> If such a guide existed for Maven, I could easily adapt it for sbt too :-)
>
> Ping me, Ismael, if you don't hear back from the group, so I feel
> invited to dig into the plugin's sources.
>
> Best,
> Jacek
>
> On 15 Jul 2016 2:29 p.m., "Ismaël Mejía" <ieme...@gmail.com> wrote:
>
> Hello, I would like to know if there is an easy way to package a new
> spark-package with Maven. I just found this repo, but I am not an sbt
> user.
>
> https://github.com/databricks/sbt-spark-package
>
> One more question: is there a formal specification or documentation of
> what you need to include in a spark-package (any special file,
> manifest, etc.)? I have not found any doc on the website.
>
> Thanks,
> Ismael
>
>
>

I was under the impression that spark-packages is more of a place for
one to list/advertise their extensions, but when you run spark-submit
with --packages, it uses Maven to resolve your package, and as long as
resolution succeeds, it will use it (e.g. you can run mvn clean install
for your local packages and use --packages with a Spark server running
on that same machine).
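
For illustration, something like the following should work (the
coordinates below are placeholders; use whatever groupId, artifactId,
and version your own pom.xml declares):

    # install your package into the local ~/.m2 repository
    mvn clean install

    # reference it by its Maven coordinates (groupId:artifactId:version);
    # spark-submit resolves these via Ivy, checking the local repository too
    spark-submit \
      --packages com.example:my-spark-extension_2.11:0.1.0 \
      --class com.example.Main my-app.jar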

From sbt, I think you can just use publishTo and define a local
repository, something like:

    publishTo := Some("Local Maven Repository" at
      "file://" + Path.userHome.absolutePath + "/.m2/repository")



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/
