> ad 2: ua(+RC): unary aggregate sum, i.e., full
> aggregate over rows and columns.
>
> ad 3: lix: matrix or frame left indexing (for patterns like X[a:b,c:d] =
> ...)
>
> ad 4: u(cast_as_scalar): unary cast from matrix or frame to scalar of
> value type double.
>
> Regards,
> Matthias
>
> On 2/17/2
Hello,
I generated a HOP plan using -explain, but I can't find the meaning of the
following operators:
1. t(-*)
2. ua(+RC)
3. lix
4. u(cast_as_scalar)
Thank you in advance,
Nantia
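[For illustration, here are rough numpy analogues of the four operators asked about above. This is only an analogy to show the semantics, not how SystemML executes them internally, and the reading of `-*` as a fused minus-multiply is an assumption:]

```python
import numpy as np

X = np.arange(16.0).reshape(4, 4)
Y = np.ones((4, 4))

# t(-*): transpose applied to a fused minus-multiply, i.e. t(X - s*Y) (assumption)
R = (X - 2.0 * Y).T

# ua(+RC): full unary aggregate, summing over both rows (R) and columns (C)
total = X.sum()  # a single scalar

# lix: left indexing, i.e. assigning into a sub-block X[a:b, c:d]
X[0:2, 1:3] = 0.0

# u(cast_as_scalar): casting a 1x1 matrix to a scalar of value type double
s = X[3:4, 3:4].item()
```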
> workshop:
>
> http://boss.dima.tu-berlin.de/media/BOSS16-Tutorial-mboehm.pdf
> (slides 10-15)
>
> If you have questions beyond that, please just ask. This is also very
> helpful for us for future improvements of our documentation.
>
>
> Regards,
> Matthias
>
Hello,
are there any resources explaining the HOP and LOP DAGs generated by
-explain?
Thanks a lot,
Nantia
> It would help if you send the error message, as I am guessing the setup.
>
> Thanks,
>
> Niketan
>
> > On Nov 8, 2016, at 2:49 AM, Nantia Makrynioti wrote:
> >
> > Hello,
> >
> > I am trying to run the PyDML script below using the Spark ML Context.
> >
Hello,
I am trying to run the PyDML script below using the Spark ML Context.
import systemml as sml
import numpy as np
sml.setSparkContext(sc)
m1 = sml.matrix(np.ones((3,3)) + 2)
m2 = sml.matrix(np.ones((3,3)) + 3)
m2 = m1 * (m2 + m1)
m4 = 1.0 - m2
m4.sum(axis=1).toNumPyArray()
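[For reference, a plain-numpy sketch of the same computation, assuming the systemml matrix API follows numpy's elementwise and axis semantics here; this shows the values the script should produce:]

```python
import numpy as np

# Plain-numpy version of the script above, to show the expected values
m1 = np.ones((3, 3)) + 2          # all 3s
m2 = np.ones((3, 3)) + 3          # all 4s
m2 = m1 * (m2 + m1)               # elementwise: 3 * (4 + 3) = 21
m4 = 1.0 - m2                     # all -20s
row_sums = m4.sum(axis=1)         # three row sums of -60 each
```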
I start Spark Shell and