Re: spark-packages with maven

2016-07-19 Thread Jakob Odersky
Luciano,
AFAIK the spark-package command-line tool also makes it easy to upload
packages to the spark-packages website. You are of course free to include
any Maven coordinate in the --packages parameter.
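
For example, any resolvable coordinate should work; the one below is just a
placeholder:

    spark-shell --packages com.example:my-spark-package_2.11:0.1.0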

--jakob


Re: spark-packages with maven

2016-07-15 Thread Ismaël Mejía
Thanks for the info Burak, I will check the repo you mention. Do you know
concretely what the 'magic' is that spark-packages need, or whether there
is any document with info about it?


Re: spark-packages with maven

2016-07-15 Thread Luciano Resende
I was under the impression that spark-packages was more like a place for
one to list/advertise their extensions, but when you do spark-submit with
--packages, it will use Maven to resolve your package, and as long as
resolution succeeds, it will use it (e.g. you can do mvn clean install for
your local packages and use --packages with a Spark server running on that
same machine).
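
For example (the coordinate below is a placeholder for whatever your pom
declares):

    # install the artifact into the local ~/.m2 repository
    mvn clean install
    # then resolve it by coordinate; the local Maven repository is among
    # the places spark-submit looks when resolving --packages
    spark-submit \
      --packages com.example:my-extension_2.11:0.1.0 \
      your-app.jar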

From sbt, I think you can just use publishTo and define a local
repository, something like

// publish artifacts to the local ~/.m2 repository so that --packages
// can resolve them
publishTo := Some("Local Maven Repository" at
  "file://" + Path.userHome.absolutePath + "/.m2/repository")



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


Re: spark-packages with maven

2016-07-15 Thread Burak Yavuz
Hi Ismael and Jacek,

If you use Maven for building your applications, you may use the
spark-package command-line tool (
https://github.com/databricks/spark-package-cmd-tool) to perform packaging.
It requires you to build your jar using Maven first, and then does all the
extra magic that Spark Packages requires.

Please contact me directly if you have any issues.

Best,
Burak


Re: spark-packages with maven

2016-07-15 Thread Jacek Laskowski
+1000

Thanks Ismael for bringing this up! I meant to have sent it earlier too,
since I've been struggling with an sbt-based Scala project for a Spark
package myself this week and haven't yet found out how to do local
publishing.

If such a guide existed for Maven, I could easily use it for sbt too :-)

Ping me, Ismael, if you don't hear back from the group, so I feel invited
to dig into the plugin's sources.

Best,
Jacek


spark-packages with maven

2016-07-15 Thread Ismaël Mejía
Hello, I would like to know if there is an easy way to package a new
spark-package with Maven. I just found this repo, but I am not an sbt user:

https://github.com/databricks/sbt-spark-package

One more question: is there a formal specification or documentation of what
you need to include in a spark-package (any special file, manifest, etc.)?
I have not found any doc on the website.

Thanks,
Ismael