On Wed, Sep 24, 2014 at 9:15 PM, Saikat Kanjilal <sxk1...@hotmail.com>
wrote:

> Shannon/Dmitry, quick question: I want to calculate the Scala equivalent of
> the Frobenius norm per this API spec in Python (
> http://docs.scipy.org/doc/numpy/reference/generated/numpy.linalg.norm.html).
> I dug into the mahout-math-scala project and found the following API to
> calculate the norm:
>
> def norm = sqrt(m.aggregate(Functions.PLUS, Functions.SQUARE))
> I believe the above is also calculating the Frobenius norm; however, I am
> curious why we are calling a Java API from Scala. The type of m above is a
> Java interface called Matrix, so I'm guessing the implementation of aggregate
> is happening in mahout-math-scala somewhere; is that assumption correct?
>

We are calling Colt (i.e. Java) for pretty much everything. As far as the Scala
bindings are concerned, they are but a DSL wrapper around Colt (unlike the
distributed algebra, which is much more).

Aggregate is Colt's thing. Colt (a.k.a. mahout-math) establishes a java-side
concept of different function types, which are unfortunately not compatible
with Scala function literals.
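
For illustration only, a minimal sketch (not the actual mahout-math-scala
source) of how such a norm delegates to the java-side Matrix.aggregate with
the function objects from org.apache.mahout.math.function.Functions; the
frobeniusNorm name and the toy matrix are just for the example:

  import org.apache.mahout.math.{DenseMatrix, Matrix}
  import org.apache.mahout.math.function.Functions
  import scala.math.sqrt

  // Frobenius norm: square every element, sum them, then take the square root.
  def frobeniusNorm(m: Matrix): Double =
    sqrt(m.aggregate(Functions.PLUS, Functions.SQUARE))

  val a: Matrix = new DenseMatrix(Array(Array(1.0, 2.0), Array(3.0, 4.0)))
  frobeniusNorm(a)   // sqrt(1 + 4 + 9 + 16) = sqrt(30)

Functions.PLUS and Functions.SQUARE are those java-side function objects; a
plain Scala closure cannot be passed to aggregate without wrapping it in one
of those interfaces.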

> Thanks in advance.
> > From: sxk1...@hotmail.com
> > To: dev@mahout.apache.org
> > Subject: RE: Mahout-1539-computation of gaussian kernel between 2 arrays of shapes
> > Date: Thu, 18 Sep 2014 12:51:36 -0700
> >
> > Ok great, I'll use the cartesian Spark API call. What I'd still like some
> > thoughts on is where the code that calls cartesian should live in our
> > directory structure.
> > > Date: Thu, 18 Sep 2014 15:33:59 -0400
> > > From: squ...@gatech.edu
> > > To: dev@mahout.apache.org
> > > Subject: Re: Mahout-1539-computation of gaussian kernel between 2 arrays of shapes
> > >
> > > Saikat,
> > >
> > > Spark has the cartesian() method that will align all pairs of points;
> > > that's the nontrivial part of determining an RBF kernel. After that it's
> > > a simple matter of applying the equation that's given on the
> > > scikit-learn doc page.
> > >
> > > However, like you said, it'll also have to be implemented using the
> > > Mahout DSL. I can envision that users would like to compute pairwise
> > > metrics for a lot more than just RBF kernels (pairwise Euclidean
> > > distance, etc.), so my guess is that a DSL implementation of cartesian()
> > > is what you're looking for. You can build the other methods on top of that.
> > >
> > > Correct me if I'm wrong.
> > >
> > > Shannon
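
Roughly what Shannon describes could look like the following with Spark's
cartesian(); this is a sketch only, not an agreed API: rows are assumed to be
keyed as (id, Vector) with Mahout vectors that serialize across the cluster,
and gamma is the usual RBF width parameter.

  import org.apache.spark.rdd.RDD
  import org.apache.mahout.math.Vector

  // Pairwise RBF kernel: K(x, y) = exp(-gamma * ||x - y||^2)
  def rbfKernel(rows: RDD[(Int, Vector)], gamma: Double): RDD[((Int, Int), Double)] =
    rows.cartesian(rows).map { case ((i, x), (j, y)) =>
      ((i, j), math.exp(-gamma * x.getDistanceSquared(y)))
    }

Other pairwise metrics, e.g. plain Euclidean distance, would just swap the
function inside the map.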
> > >
> > > On 9/18/14, 3:28 PM, Saikat Kanjilal wrote:
> > > >
> > > > http://scikit-learn.org/stable/modules/generated/sklearn.metrics.pairwise.rbf_kernel.html
> > > > I need to implement the above in the Scala world and expose a DSL API to
> > > > call the computation when computing the affinity matrix.
> > > >
> > > >> From: ted.dunn...@gmail.com
> > > >> Date: Thu, 18 Sep 2014 10:04:34 -0700
> > > >> Subject: Re: Mahout-1539-computation of gaussian kernel between 2 arrays of shapes
> > > >> To: dev@mahout.apache.org
> > > >>
> > > >> There are a number of non-traditional linear algebra operations like this
> > > >> that are important to implement.
> > > >>
> > > >> Can you describe what you intend to do so that we can discuss the shape of
> > > >> the API and computation?
> > > >>
> > > >>
> > > >>
> > > >> On Wed, Sep 17, 2014 at 9:28 PM, Saikat Kanjilal <sxk1...@hotmail.com> wrote:
> > > >>
> > > >>> Dmitry et al, as part of the above JIRA I need to calculate the Gaussian
> > > >>> kernel between 2 shapes. I looked through mahout-math-scala and didn't see
> > > >>> anything to do this; any objections to me adding some code under
> > > >>> scalabindings to do this?
> > > >>> Thanks in advance.
> > > >
> > >
> >
>
>
