in
the same application, and that having a JavaSparkContext instance in addition
to the Scala SparkContext instance would not work, but I'm wondering if there is
some workaround for this.
Thanks,
Zoran
JavaSparkContext has a wrapper constructor for the "scala"
SparkContext. In this case all you need to do is declare a
SparkContext that is accessible from both the Java and Scala sides of
your project and wrap that context with a JavaSparkContext.
Search for Java source compatibility with Scala for more information.
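
For illustration, a minimal sketch of what this could look like (the
object name, app name, and field names are placeholders, not something
from the thread):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.api.java.JavaSparkContext

  // One Scala SparkContext shared by the whole application.
  object SharedSparkContext {
    val conf = new SparkConf().setAppName("mixed-java-scala-app")
    val sc: SparkContext = new SparkContext(conf)

    // Thin Java-friendly wrapper around the same context;
    // no second SparkContext is created.
    val jsc: JavaSparkContext = new JavaSparkContext(sc)
  }

Scala code keeps using sc directly, while Java code can go through the
wrapper (e.g. via the static forwarder SharedSparkContext.jsc(), or by
calling JavaSparkContext.fromSparkContext on the shared context).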
Hi Jakob,
Thanks a lot for your help. I'll try this.
Zoran