Folks,
I am writing some Scala/Java code and want it to be usable from PySpark.

For example:
class MyStuff(addend: Int) {
  def myMapFunction(x: Int): Int = x + addend
}

I want to call it from PySpark like this:

df = ...
mystuff = sc._jvm.MyStuff(5)
df['x'].map(lambda x: mystuff.myMapFunction(x))
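To be clear about the semantics I am after, here is a plain-Python stand-in for the Scala class (no Spark or JVM involved; `MyStuff` here just mirrors the Scala class above):

```python
class MyStuff:
    """Pure-Python equivalent of the Scala MyStuff class."""

    def __init__(self, addend):
        self.addend = addend

    def my_map_function(self, x):
        # Same behavior as myMapFunction in the Scala version
        return x + self.addend


mystuff = MyStuff(5)
print([mystuff.my_map_function(x) for x in [1, 2, 3]])  # [6, 7, 8]
```

In other words, I want each element of the column mapped through the JVM object's method, exactly as this stand-in does in Python.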

How can I do this?

Mohit.



---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
