Hi, 

I am working on a legacy project that uses Spark Java code.

I have a function that takes an SQLContext as an argument; however, I need a
JavaSparkContext inside that function.

It seems that sqlContext.sparkContext() returns a Scala SparkContext.

I did not find any API for converting a Scala SparkContext to a Java one,
except:

    new JavaSparkContext(sqlContext.sparkContext())

I think this will create a new SparkContext, so there will be multiple
SparkContexts at runtime.
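
To make this concrete, here is a minimal sketch of what I am doing (the
function name buildRdd and its body are just placeholders, not the real
code):

    import java.util.Arrays;

    import org.apache.spark.SparkContext;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.SQLContext;

    public class LegacyJob {

        // The function receives an SQLContext but needs a JavaSparkContext
        // internally, e.g. to call parallelize().
        static void buildRdd(SQLContext sqlContext) {
            // sparkContext() returns the underlying Scala SparkContext
            SparkContext sc = sqlContext.sparkContext();

            // The only conversion I found: wrap it in a JavaSparkContext
            JavaSparkContext jsc = new JavaSparkContext(sc);

            // use jsc as usual
            jsc.parallelize(Arrays.asList(1, 2, 3));
        }
    }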

According to some posts, having multiple SparkContexts has some limitations,
but I have not run into any problems myself.

Questions:

What is the best way to convert a Scala SparkContext to a Java one?
What problems can multiple SparkContexts cause?

Thank you. =)

Hao



