Hi,

I have a Spark project in Scala and I would like to call some Python
functions from within the program.
Both parts are quite big, so re-coding everything in one language is not
really an option.

The workflow would be:
- Creating an RDD with Scala code
- Mapping a Python function over this RDD
- Using the result directly in Scala
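For what it's worth, the pattern I have in mind is the line-based streaming that RDD.pipe seems to use: write each element as a line to a Python process's stdin and read the transformed lines back. A minimal plain-Scala sketch of that protocol (no Spark involved; assumes python3 is on the PATH, and the doubling one-liner is just a stand-in for a real Python function):

```scala
import java.io.ByteArrayInputStream
import scala.sys.process._

object PipeSketch {
  // Streams integers through an external Python one-liner that doubles them,
  // mimicking the line protocol that RDD.pipe uses per partition:
  // one element per input line, one result per output line.
  def doubleViaPython(xs: Seq[Int]): List[Int] = {
    val input = xs.mkString("", "\n", "\n")
    val cmd = Seq("python3", "-c",
      "import sys\nfor line in sys.stdin: print(int(line) * 2)")
    // Feed the lines to the child process and capture its stdout.
    val out = (cmd #< new ByteArrayInputStream(input.getBytes("UTF-8"))).!!
    out.trim.split("\n").map(_.toInt).toList
  }
}
```

With Spark itself I imagine the equivalent would be something like rdd.map(_.toString).pipe(Seq("python3", "script.py")).map(_.toInt), at the cost of serializing every element to text and parsing it back.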

I've read about PySpark internals, but that didn't help much.
Is it possible to do so, preferably in an efficient manner?

Cheers,
Didier



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Calling-Python-code-from-Scala-tp26798.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
