This is very cool! Thanks a lot for making this more accessible!
Best,
--
FG
On Wed, Dec 7, 2016 at 11:46 PM Jencir Lee <jenc...@uci.edu> wrote:
> Hello,
>
> We just published the Spectral LDA model on Spark Packages. It’s an
> alternative approach to LDA modelling based on tensor decompositions. …
Hello,
We just published the Spectral LDA model on Spark Packages. It’s an alternative
approach to LDA modelling based on tensor decompositions. We first build the
2nd- and 3rd-moment tensors from empirical word counts, then orthogonalise them
and perform a decomposition on the 3rd-moment tensor.
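(For context, the moment construction behind this family of spectral methods, following Anandkumar et al.'s method-of-moments derivation for LDA, looks roughly as follows; here \alpha_0 is the Dirichlet concentration parameter, x_1, x_2 are two words sampled from the same document, and the exact scalings used by the package may differ:

    M_1 = E[x_1]
    M_2 = E[x_1 \otimes x_2] - \frac{\alpha_0}{\alpha_0 + 1} M_1 \otimes M_1
    T = M_3(W, W, W) = \sum_{i=1}^{k} \lambda_i (W^\top \mu_i)^{\otimes 3}

where M_3 is the analogously centred 3rd moment, W is a whitening matrix satisfying W^\top M_2 W = I_k (the "orthogonalisation" step), and a tensor power method recovers the topic vectors \mu_i and weights \lambda_i, up to scaling, from the decomposition of T.)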
Thanks for the info, Burak. I will check the repo you mention. Do you know
concretely what the 'magic' is that a spark-package needs, or whether there is
any document with info about it?
On Fri, Jul 15, 2016 at 10:12 PM, Luciano Resende <luckbr1...@gmail.com>
wrote:
>
> On Fri, Jul 15, 2016 …
> One more question: is there a formal specification or documentation of
> what you need to include in a spark-package (any special file, manifest,
> etc.)? I have not found any doc on the website.
>
> Thanks,
> Ismael
I was under the impression that sp…
Hi Ismael and Jacek,
If you use Maven for building your applications, you may use the
spark-package command line tool (
https://github.com/databricks/spark-package-cmd-tool) to perform packaging.
It requires you to build your jar using Maven first, and then does all the
extra magic that Spark …
+1000
Thanks Ismael for bringing this up! I meant to have sent it earlier too,
since I've been struggling with an sbt-based Scala project for a Spark
package myself this week and haven't yet found out how to do local
publishing.
If such a guide existed for Maven, I could easily adapt it for sbt too.
Hello, I would like to know if there is an easy way to package a new
spark-package with Maven. I just found this repo, but I am not an sbt user:
https://github.com/databricks/sbt-spark-package
One more question: is there a formal specification or documentation of what
you need to include in a spark-package (any special file, manifest, etc.)?
I have not found any doc on the website.
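(For the sbt route, my understanding from the sbt-spark-package README is that a minimal setup looks roughly like the sketch below; the plugin version, resolver URL, and setting/task names are from memory, so treat them as assumptions and check the repo:

    // project/plugins.sbt -- pull in the sbt-spark-package plugin
    resolvers += "Spark Packages repo" at "https://dl.bintray.com/spark-packages/maven/"
    addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")

    // build.sbt -- name the package and declare the Spark modules it builds against
    spName := "your-org/your-package"        // hypothetical coordinates
    sparkVersion := "1.6.2"                  // example version
    sparkComponents ++= Seq("core", "sql")   // adds spark-core/spark-sql as provided deps

With that in place, `sbt spPublishLocal` should publish to the local ivy repository, and `sbt spDist` should build the release zip expected by spark-packages.org.)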
Hello,
I have some problems modifying the description of some of my packages
on spark-packages.com; I haven't been able to change anything.
I've written to the e-mail address in charge of managing this page,
but I got no answer.
Any clue?
Thanks
…spark-packages.org/api/submit-release.
Hope this helps with your last question.
On 16 October 2015 at 08:43, jeff saremi <jeffsar...@hotmail.com> wrote:
> I'm looking for any form of documentation on Spark Packages.
> Specifically, what happens when one issues a command like the following:
> …
This Kafka consumer has been around for a while on spark-packages (
http://spark-packages.org/package/dibbhatt/kafka-spark-consumer ) and I see
many people have started using it. I am now thinking of contributing it back
to the Apache Spark core project so that it can get better support, visibility,
and adoption.
Few …
Hi Spark devs,
Is it possible to add jcenter or bintray support for Spark packages?
I'm trying to add our artifact, which is on jcenter:
https://bintray.com/airbnb/aerosolve
but I noticed that Spark Packages only accepts Maven coordinates.
--
Yee Yang Li Hector
google.com/+HectorYee
Hey everyone,
I’m looking to develop a package for use with SparkR. This package would
include custom R and Scala code, and I was wondering if anyone had any insight
into how I might be able to use the sbt-spark-package tool to publish something
that needs to include an R package as well as a …
Yup, netlib LGPL is activated through a profile right now... if we can reuse
the same idea, then CSparse could also be added to Spark behind an LGPL flag.
But again, as Sean said, it's tricky. Better to keep it on Spark Packages for
users to try.
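(To make the "flag" idea concrete, a minimal sbt sketch of gating an LGPL artifact behind an opt-in property might look like the following; the property name is made up for illustration, and Spark itself gates netlib through the -Pnetlib-lgpl Maven profile rather than sbt:

    // build.sbt -- only pull in the LGPL netlib-java natives when -Dlgpl=true is set
    libraryDependencies ++= {
      if (sys.props.get("lgpl").contains("true"))
        Seq("com.github.fommil.netlib" % "all" % "1.1.2" pomOnly())  // LGPL natives
      else
        Seq.empty
    }

Default builds then stay Apache-only, and users opt in to the LGPL bits explicitly.)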
On May 24, 2015 1:36 AM, Sean Owen <so...@cloudera.com> wrote:
…Patrick Wendell <pwend...@gmail.com> wrote:
Yes - Spark packages can include non-ASF licenses.
On Sat, May 23, 2015 at 6:16 PM, Debasish Das <debasish.da...@gmail.com> wrote:
Hi,
Is it possible to add GPL/LGPL code to Spark packages, or must it be
licensed under Apache as well?
I want to expose …
Hi,
Is it possible to add GPL/LGPL code to Spark packages, or must it be
licensed under Apache as well?
I want to expose Professor Tim Davis's LGPL library for sparse algebra and
the ECOS GPL library through the package.
Thanks.
Deb
Yes - Spark packages can include non-ASF licenses.
On Sat, May 23, 2015 at 6:16 PM, Debasish Das <debasish.da...@gmail.com> wrote:
Hi,
Is it possible to add GPL/LGPL code to Spark packages, or must it be licensed
under Apache as well?
I want to expose Professor Tim Davis's LGPL library …
I thought LGPL is okay but GPL is not okay for an Apache project.
On Saturday, May 23, 2015, Patrick Wendell <pwend...@gmail.com> wrote:
Yes - Spark packages can include non-ASF licenses.
On Sat, May 23, 2015 at 6:16 PM, Debasish Das <debasish.da...@gmail.com> wrote:
Hi
That's the nice thing about Spark packages: it is just a package index for
libraries and applications built on top of Spark, not part of the Spark
codebase, so it is not restricted to ASF-compatible licenses.
On Sat, May 23, 2015 at 10:12 PM, DB Tsai <dbt...@dbtsai.com> wrote:
I …
Dear Spark users and developers,
I’m happy to announce Spark Packages (http://spark-packages.org), a
community package index to track the growing number of open source
packages and libraries that work with Apache Spark. Spark Packages
makes it easy for users to find, discuss, rate, and install packages
for any version of Spark, and makes it easy for developers …
…when the page is back up?
Thanks!
Andrew
On Mon, Dec 22, 2014 at 12:37 PM, Xiangrui Meng <men...@gmail.com> wrote:
Dear Spark users and developers,
I'm happy to announce Spark Packages (http://spark-packages.org), a
community package index to track the growing number of open source
packages …
…of the VP, Apache Brand Management, or designee.
The title on the packages website is "A community index of packages for
Apache Spark." Furthermore, the footnote of the website reads "Spark
Packages is a community site hosting modules that are not part of Apache
Spark."
I think there's nothing on there that would confuse a relevant consumer
about the source of the software. It's pretty clear that the Spark Packages …