Using Spark in mixed Java/Scala project

2016-01-27 Thread jeremycod
Hi,

I have a mixed Java/Scala project. I have already been using Spark from the
Scala code in local mode. Now some new team members need to develop
functionality that uses Spark from the Java code, and they are not
familiar with Scala. I know it's not possible to have two Spark contexts in
the same application, and that having a JavaSparkContext instance in addition
to the Scala SparkContext instance would not work, but I'm wondering if there
is some workaround for this.

Thanks,
Zoran








Re: Using Spark in mixed Java/Scala project

2016-01-27 Thread Jakob Odersky
JavaSparkContext has a wrapper constructor for the "scala"
SparkContext. In this case all you need to do is declare a
SparkContext that is accessible both from the Java and Scala sides of
your project and wrap the context with a JavaSparkContext.

Search for "Java source compatibility with Scala" for more information on
how to interface Java with Scala (the other way around is trivial).
Essentially, as long as you declare your SparkContext either in Java
or as a val/var/def in a plain Scala class, you are good.
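
For example, here is a minimal sketch (the class and member names are just
made up for illustration) of declaring the context once on the Scala side
and wrapping it so the Java side of the project can use it:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.api.java.JavaSparkContext

    class SparkContextHolder {
      // A plain val in a plain Scala class compiles to an ordinary
      // getter, so Java code can simply call holder.sc() / holder.jsc().
      val sc: SparkContext = new SparkContext(
        new SparkConf().setAppName("mixed-java-scala").setMaster("local[*]"))

      // Wraps the same underlying context for the Java API;
      // no second SparkContext is created.
      val jsc: JavaSparkContext = new JavaSparkContext(sc)
    }

The Java code then works against jsc (the JavaSparkContext) while the Scala
code keeps using sc, and both share the single underlying context.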




Re: Using Spark in mixed Java/Scala project

2016-01-27 Thread Zoran Jeremic
Hi Jakob,

Thanks a lot for your help. I'll try this.

Zoran
