It seems that JavaSparkContext is just a wrapper around the Scala SparkContext.

Inside JavaSparkContext, the Scala one does all the work.

So if I pass the existing Scala SparkContext to initialize a JavaSparkContext, I am
still manipulating the same SparkContext.
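
For example, here is a minimal, self-contained sketch of what I mean (the app
name and local[*] master are placeholders, not from our actual code):

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;

public class WrapContextExample {
    public static void main(String[] args) {
        // One real SparkContext for the whole application.
        SparkContext sc = new SparkContext(
                new SparkConf().setAppName("wrap-example").setMaster("local[*]"));

        // JavaSparkContext simply delegates to the Scala SparkContext it is
        // given, so this wraps the existing context instead of starting a new one.
        JavaSparkContext jsc = new JavaSparkContext(sc);

        // Both refer to the very same underlying SparkContext.
        System.out.println(jsc.sc() == sc);  // prints true

        sc.stop();
    }
}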

Sorry for spamming.

Hao

On Mon, Jun 29, 2015 at 11:15 AM, Hao Ren <inv...@gmail.com> wrote:

> Hi,
>
> I am working on a legacy project that uses the Spark Java API.
>
> I have a function that takes a sqlContext as an argument; however, I need a
> JavaSparkContext inside that function.
>
> It seems that sqlContext.sparkContext() returns a Scala SparkContext.
>
> I did not find any API for casting a Scala SparkContext to a Java one
> except:
>
> new JavaSparkContext(sqlContext.sparkContext())
>
> I think it will create a new SparkContext, so there will be multiple
> SparkContexts at run time.
>
> According to some posts, there are limitations on this, but I have not
> encountered any problem.
>
> Question:
>
> What is the best way to cast a Scala SparkContext to a Java one?
> What problems will multiple SparkContexts cause?
>
> Thank you. =)
>
> Hao
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-and-JavaSparkContext-tp23525.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>


-- 
Hao Ren

Data Engineer @ leboncoin

Paris, France
