Using Spark in mixed Java/Scala project

2016-01-27 Thread jeremycod
in the same application, and that having a JavaSparkContext instance in addition to the Scala SparkContext instance would not work, but I'm wondering if there is some workaround for this. Thanks, Zoran

Re: Using Spark in mixed Java/Scala project

2016-01-27 Thread Jakob Odersky
JavaSparkContext has a wrapper constructor for the "Scala" SparkContext. In this case all you need to do is declare a SparkContext that is accessible from both the Java and Scala sides of your project and wrap that context with a JavaSparkContext. Search for Java source compatibility with Scala for more details.
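
A minimal sketch of the approach Jakob describes, assuming the context is created on the Scala side; the object name `SharedContext` and the app name are illustrative, not from the thread:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.api.java.JavaSparkContext

    object SharedContext {
      // The single Scala SparkContext for the whole application.
      private val conf = new SparkConf().setAppName("mixed-java-scala-app")
      val sc: SparkContext = new SparkContext(conf)

      // Java-friendly view of the same context. JavaSparkContext(sc) only
      // wraps the existing context, so no second SparkContext is created.
      val jsc: JavaSparkContext = new JavaSparkContext(sc)
    }

The Java side of the project can then use the wrapped `jsc` wherever it needs a JavaSparkContext, while Scala code keeps using `sc`, so the application never holds two live contexts.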

Re: Using Spark in mixed Java/Scala project

2016-01-27 Thread Zoran Jeremic
Hi Jakob, Thanks a lot for your help. I'll try this. Zoran