Re: Publishing of the Spectral LDA model on Spark Packages

2016-12-08 Thread François Garillot
This is very cool! Thanks a lot for making this more accessible! Best, -- FG On Wed, Dec 7, 2016 at 11:46 PM Jencir Lee <jenc...@uci.edu> wrote: > Hello, > > We just published the Spectral LDA model on Spark Packages. It’s an > alternative approach to the LDA modelli

Publishing of the Spectral LDA model on Spark Packages

2016-12-07 Thread Jencir Lee
Hello, We just published the Spectral LDA model on Spark Packages. It’s an alternative approach to LDA modelling based on tensor decompositions. We first build the 2nd- and 3rd-moment tensors from empirical word counts, then orthogonalise them and perform decomposition on the 3rd-moment tensor
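The moment-building step described above can be sketched in plain Python. This is a naive single-machine illustration with made-up names, not the package's actual Spark implementation, and it uses a plain averaged outer product whereas the real spectral method uses bias-corrected moment estimators:

```python
# Sketch: empirical second-moment matrix M2 from per-document word counts.
# Simplified illustration only; the Spectral LDA package builds these
# moments distributedly and applies bias corrections before decomposition.
from collections import Counter

def second_moment(docs, vocab):
    """Average outer product of per-document word-frequency vectors."""
    v = len(vocab)
    m2 = [[0.0] * v for _ in range(v)]
    for doc in docs:
        counts = Counter(doc)
        total = sum(counts.values())
        freq = [counts.get(w, 0) / total for w in vocab]
        for i in range(v):
            for j in range(v):
                m2[i][j] += freq[i] * freq[j] / len(docs)
    return m2

docs = [["apple", "apple", "pear"], ["pear", "pear", "apple"]]
m2 = second_moment(docs, ["apple", "pear"])
```

The resulting matrix is symmetric and its entries sum to 1; the 3rd-moment tensor is built analogously from triple products of frequencies, and the orthogonalisation (whitening by M2) reduces it to a small tensor whose decomposition yields the topic-word distributions.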

Re: spark-packages with maven

2016-07-15 Thread Ismaël Mejía
Thanks for the info Burak, I will check the repo you mention. Do you know concretely what the 'magic' is that spark-packages needs, or whether there is any document with info about it? On Fri, Jul 15, 2016 at 10:12 PM, Luciano Resende <luckbr1...@gmail.com> wrote: > > On Fri, Jul 15, 2016

Re: spark-packages with maven

2016-07-15 Thread Luciano Resende
> One more question, is there a formal specification or documentation of > what > you need to include in a spark-package (any special file, manifest, etc)? > I > have not found any doc on the website. > > Thanks, > Ismael > > > I was under the impression that sp

Re: spark-packages with maven

2016-07-15 Thread Burak Yavuz
Hi Ismael and Jacek, If you use Maven for building your applications, you may use the spark-package command line tool ( https://github.com/databricks/spark-package-cmd-tool) to perform packaging. It requires you to build your jar using Maven first, and then does all the extra magic that Spark

Re: spark-packages with maven

2016-07-15 Thread Jacek Laskowski
+1000 Thanks Ismael for bringing this up! I meant to have sent it earlier too, since I've been struggling with an sbt-based Scala project for a Spark package myself this week and haven't yet found out how to do local publishing. If such a guide existed for Maven I could use it for sbt easily too

spark-packages with maven

2016-07-15 Thread Ismaël Mejía
Hello, I would like to know if there is an easy way to package a new spark-package with maven. I just found this repo, but I am not an sbt user: https://github.com/databricks/sbt-spark-package One more question: is there a formal specification or documentation of what you need to include in a

Modify text in spark-packages

2016-02-23 Thread Sergio Ramírez
Hello, I have had some problems modifying the description of some of my packages on spark-packages.org. I haven't been able to change anything. I've written to the e-mail address in charge of managing this page, but I got no answer. Any clue? Thanks

Re: Insight into Spark Packages

2015-10-16 Thread Jakob Odersky
spark-packages.org/api/submit-release. Hope this helps with your last question. On 16 October 2015 at 08:43, jeff saremi <jeffsar...@hotmail.com> wrote: > I'm looking for any form of documentation on Spark Packages > Specifically, what happens when one issues a command like the following: >

Contributing Receiver based Low Level Kafka Consumer from Spark-Packages to Apache Spark Project

2015-10-14 Thread Dibyendu Bhattacharya
This Kafka consumer has been around for a while in spark-packages ( http://spark-packages.org/package/dibbhatt/kafka-spark-consumer ) and I see many people have started using it. I am now thinking of contributing it back to the Apache Spark core project so that it can get better support, visibility and adoption. Few

Jcenter / bintray support for spark packages?

2015-06-10 Thread Hector Yee
Hi Spark devs, Is it possible to add jcenter or bintray support for Spark packages? I'm trying to add our artifact which is on jcenter https://bintray.com/airbnb/aerosolve but I noticed in Spark packages it only accepts Maven coordinates. -- Yee Yang Li Hector google.com/+HectorYee

Re: Jcenter / bintray support for spark packages?

2015-06-10 Thread Patrick Wendell
for Spark packages? I'm trying to add our artifact which is on jcenter https://bintray.com/airbnb/aerosolve but I noticed in Spark packages it only accepts Maven coordinates. -- Yee Yang Li Hector google.com/+HectorYee

Spark Packages: using sbt-spark-package tool with R

2015-06-04 Thread Chris Freeman
Hey everyone, I’m looking to develop a package for use with SparkR. This package would include custom R and Scala code and I was wondering if anyone had any insight into how I might be able to use the sbt-spark-package tool to publish something that needs to include an R package as well as a

Re: spark packages

2015-05-24 Thread Debasish Das
Yup, netlib LGPL right now is activated through a profile... if we can reuse the same idea then csparse can also be added to Spark with an LGPL flag. But again, as Sean said, it's tricky. Better to keep it on Spark Packages for users to try. On May 24, 2015 1:36 AM, Sean Owen so...@cloudera.com wrote

Re: spark packages

2015-05-24 Thread Sean Owen
Wendell pwend...@gmail.com wrote: Yes - spark packages can include non ASF licenses. On Sat, May 23, 2015 at 6:16 PM, Debasish Das debasish.da...@gmail.com wrote: Hi, Is it possible to add GPL/LGPL code on spark packages or it must be licensed under Apache as well ? I want to expose

spark packages

2015-05-23 Thread Debasish Das
Hi, Is it possible to add GPL/LGPL code on Spark Packages, or must it be licensed under Apache as well? I want to expose Professor Tim Davis's LGPL library for sparse algebra and the ECOS GPL library through the package. Thanks. Deb

Re: spark packages

2015-05-23 Thread Patrick Wendell
Yes - spark packages can include non ASF licenses. On Sat, May 23, 2015 at 6:16 PM, Debasish Das debasish.da...@gmail.com wrote: Hi, Is it possible to add GPL/LGPL code on spark packages or it must be licensed under Apache as well ? I want to expose Professor Tim Davis's LGPL library

Re: spark packages

2015-05-23 Thread DB Tsai
I thought LGPL is okay but GPL is not okay for an Apache project. On Saturday, May 23, 2015, Patrick Wendell pwend...@gmail.com wrote: Yes - spark packages can include non ASF licenses. On Sat, May 23, 2015 at 6:16 PM, Debasish Das debasish.da...@gmail.com wrote: Hi

Re: spark packages

2015-05-23 Thread Reynold Xin
That's the nice thing about Spark Packages. It is just a package index for libraries and applications built on top of Spark and not part of the Spark codebase, so it is not restricted to ASF-compatible licenses. On Sat, May 23, 2015 at 10:12 PM, DB Tsai dbt...@dbtsai.com wrote: I

Announcing Spark Packages

2014-12-22 Thread Xiangrui Meng
Dear Spark users and developers, I’m happy to announce Spark Packages (http://spark-packages.org), a community package index to track the growing number of open source packages and libraries that work with Apache Spark. Spark Packages makes it easy for users to find, discuss, rate, and install

Re: Announcing Spark Packages

2014-12-22 Thread Andrew Ash
Packages (http://spark-packages.org), a community package index to track the growing number of open source packages and libraries that work with Apache Spark. Spark Packages makes it easy for users to find, discuss, rate, and install packages for any version of Spark, and makes it easy for developers

Re: Announcing Spark Packages

2014-12-22 Thread Patrick Wendell
when the page is back up? Thanks! Andrew On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng men...@gmail.com wrote: Dear Spark users and developers, I'm happy to announce Spark Packages (http://spark-packages.org), a community package index to track the growing number of open source packages

Re: Announcing Spark Packages

2014-12-22 Thread Hitesh Shah
and developers, I’m happy to announce Spark Packages (http://spark-packages.org), a community package index to track the growing number of open source packages and libraries that work with Apache Spark. Spark Packages makes it easy for users to find, discuss, rate, and install packages for any

Re: Announcing Spark Packages

2014-12-22 Thread Patrick Wendell
of the VP, Apache Brand Management or designee. The title on the packages website is "A community index of packages for Apache Spark." Furthermore, the footnote of the website reads "Spark Packages is a community site hosting modules that are not part of Apache Spark." I think there's nothing

Re: Announcing Spark Packages

2014-12-22 Thread Nicholas Chammas
for Apache Spark. Furthermore, the footnote of the website reads Spark Packages is a community site hosting modules that are not part of Apache Spark. I think there's nothing on there that would confuse a relevant consumer about the source of software. It's pretty clear that the Spark Packages