Re: Pyspark access to scala/java libraries

2018-07-18 Thread HARSH TAKKAR
Hi

You can access your Java packages from PySpark using the following:

obj = sc._jvm.yourPackage.className()
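
For instance, a minimal driver-side sketch (the package name and jar path
are placeholders, and sc._jvm is py4j's internal gateway into the driver
JVM, not a public API):

# Assumes the jar containing com.example.MyStuff was supplied at launch,
# e.g.:  pyspark --jars /path/to/mystuff.jar
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# This runs on the driver only; the py4j proxy it returns cannot be
# shipped to executors inside a task closure.
obj = sc._jvm.com.example.MyStuff(5)
print(obj.myMapFunction(3))  # -> 8, computed in the JVM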


Kind Regards
Harsh Takkar


Re: Pyspark access to scala/java libraries

2018-07-17 Thread Mohit Jaggi
Thanks 0xF0F0F0 and Ashutosh for the pointers.

Holden,
I am trying to look into sparklingml...what am I looking for? Also which
chapter/page of your book should I look at?

Mohit.


Re: Pyspark access to scala/java libraries

2018-07-15 Thread Holden Karau
If you want to see some examples, a library that shows one way to do it is
https://github.com/sparklingpandas/sparklingml and High Performance Spark
also talks about it.
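
The crux, which the Stack Overflow link below also covers: a py4j proxy
like sc._jvm.MyStuff(5) lives only on the driver, and Spark pickles task
closures, so a per-row lambda cannot carry a JVM handle to the executors.
The usual pattern is to expose the Scala logic through Spark's Java UDF
interface and register it by class name, so rows are processed entirely
inside the JVM. A minimal sketch, with hypothetical class and package
names:

# Scala side (hypothetical), packaged into a jar on the classpath:
#   package com.example
#   import org.apache.spark.sql.api.java.UDF1
#   class MyMapFunction extends UDF1[Int, Int] {
#     override def call(x: Int): Int = x + 5
#   }
from pyspark.sql import SparkSession
from pyspark.sql.types import IntegerType

spark = SparkSession.builder.getOrCreate()

# Register the JVM implementation under a SQL function name
# (spark.udf.registerJavaFunction is Spark 2.3+).
spark.udf.registerJavaFunction("my_map", "com.example.MyMapFunction",
                               IntegerType())

df = spark.range(3).selectExpr("id", "my_map(CAST(id AS INT)) AS mapped")
df.show()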

On Sun, Jul 15, 2018, 11:57 AM <0xf0f...@protonmail.com.invalid> wrote:

> Check
> https://stackoverflow.com/questions/31684842/calling-java-scala-function-from-a-task


Re: Pyspark access to scala/java libraries

2018-07-15 Thread Mohit Jaggi
Trying again…anyone know how to make this work?


Pyspark access to scala/java libraries

2018-07-09 Thread Mohit Jaggi
Folks,
I am writing some Scala/Java code and want it to be usable from pyspark.

For example:
class MyStuff(addend: Int) {
  def myMapFunction(x: Int) = x + addend
}

I want to call it from pyspark as:

df = ...
mystuff = sc._jvm.MyStuff(5)
df['x'].map(lambda x: mystuff.myMapFunction(x))

How can I do this?

Mohit.
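
A minimal sketch of one pattern that fits this use case: do the per-row
work entirely in Scala and cross the py4j boundary once, on the driver,
by handing over the underlying Java DataFrame. The Scala method here is
hypothetical, and _jdf and sql_ctx are internal PySpark attributes:

# Scala side (hypothetical):
#   package com.example
#   object MyStuff {
#     def addToColumn(df: DataFrame, col: String, addend: Int): DataFrame =
#       df.withColumn(col, df(col) + addend)
#   }
from pyspark.sql import DataFrame, SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext
df = spark.range(5).withColumnRenamed("id", "x")

# Pass the Java-side DataFrame across the gateway, then re-wrap the
# returned Java DataFrame for Python.
result_jdf = sc._jvm.com.example.MyStuff.addToColumn(df._jdf, "x", 5)
result = DataFrame(result_jdf, df.sql_ctx)
result.show()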


