Hi
You can access your Java packages from PySpark using the following:
obj = sc._jvm.yourPackage.className()
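For example, a minimal sketch (the package name com.example and the jar name are hypothetical; it assumes the jar was put on the classpath, e.g. by starting with pyspark --jars mystuff.jar, and that the call happens on the driver, where the Py4J gateway lives):

obj = sc._jvm.com.example.MyStuff(5)  # calls the Scala constructor through Py4J
print(obj.myMapFunction(3))           # prints 8; runs in the driver JVM

Note this only works in driver-side code; sc._jvm is not available inside functions shipped to executors, such as the lambda in a map.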
Kind Regards
Harsh Takkar
On Wed, Jul 18, 2018 at 4:00 AM Mohit Jaggi wrote:
Thanks 0xF0F0F0 and Ashutosh for the pointers.
Holden,
I am trying to look into sparklingml... what am I looking for? Also, which
chapter/page of your book should I look at?
Mohit.
On Sun, Jul 15, 2018 at 3:02 AM Holden Karau wrote:
If you want to see some examples, https://github.com/sparklingpandas/sparklingml
is a library that shows a way to do it, and High Performance Spark also talks
about it.
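The gist of the pattern shown there (a rough sketch, not sparklingml's actual API; the class name com.example.MyMapFunction is made up for illustration) is to expose the Scala logic as a Java UDF and apply it through Spark SQL, since a JVM object obtained via sc._jvm cannot be called from inside a Python lambda on the executors:

from pyspark.sql.types import IntegerType

# Assumes com.example.MyMapFunction implements
# org.apache.spark.sql.api.java.UDF1[Int, Int] and that its jar
# was passed with --jars; both names are hypothetical.
spark.udf.registerJavaFunction("myMapFunction", "com.example.MyMapFunction", IntegerType())

df = spark.createDataFrame([(1,), (2,)], ["x"])
df.selectExpr("myMapFunction(x) AS y").show()  # executes entirely in the JVM

That keeps the per-row work in the JVM instead of round-tripping every value through Py4J.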
On Sun, Jul 15, 2018, 11:57 AM <0xf0f...@protonmail.com.invalid> wrote:
> Check
>
Trying again…anyone know how to make this work?
On Jul 9, 2018, at 3:45 PM, Mohit Jaggi wrote:
Folks,
I am writing some Scala/Java code and want it to be usable from pyspark.
For example:
class MyStuff(addend: Int) {
  def myMapFunction(x: Int) = x + addend
}
I want to call it from pyspark as:
df = ...
mystuff = sc._jvm.MyStuff(5)
df['x'].map(lambda x: mystuff.myMapFunction(x))