Hi Sean,

Thanks a lot for replying, and apologies for the late response (I somehow
missed this mail earlier). I am under the impression that passing the
py4j.java_gateway.JavaGateway object lets PySpark access the Spark context
created on the Java side.
My use case is exactly what you mentioned in your last email: I want to
access the same Spark session from both Java and PySpark. So how can we
share the Spark context, and in turn the Spark session, across Java and
PySpark?
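
For concreteness, this is roughly what I have in mind on the Python side.
It is only a minimal sketch: the gateway port and the entry-point methods
getJavaSparkContext()/getSparkSession() are assumptions about how the Java
application exposes its objects over py4j.

    from py4j.java_gateway import JavaGateway, GatewayParameters
    from pyspark import SparkConf, SparkContext
    from pyspark.sql import SparkSession

    # Connect to the py4j GatewayServer started by the Java application
    # (25333 is py4j's default port; adjust to whatever the Java side uses).
    gateway = JavaGateway(
        gateway_parameters=GatewayParameters(port=25333, auto_convert=True))

    # The Java app is assumed to expose its contexts via the py4j entry
    # point; getJavaSparkContext()/getSparkSession() are hypothetical names.
    jsc = gateway.entry_point.getJavaSparkContext()   # JavaSparkContext
    jspark = gateway.entry_point.getSparkSession()    # Java SparkSession

    # Wrap the existing JVM objects rather than creating a new context.
    conf = SparkConf(_jconf=jsc.getConf())
    sc = SparkContext(gateway=gateway, jsc=jsc, conf=conf)
    spark = SparkSession(sc, jsparkSession=jspark)

The intent is to reuse the JVM-side context rather than start a second one.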

Regards,
Aditya

On Fri, 26 Mar 2021 at 6:49 PM, Sean Owen <sro...@gmail.com> wrote:

> The problem is that both of these are not sharing a SparkContext as far as
> I can see, so there is no way to share the object across them, let alone
> languages.
>
> You can of course write the data from Java, read it from Python.
>
> In some hosted Spark products, you can access the same session from two
> languages and register the DataFrame as a temp view in Java, then access it
> in Pyspark.
>
>
> On Fri, Mar 26, 2021 at 8:14 AM Aditya Singh <aditya.singh9...@gmail.com>
> wrote:
>
>> Hi All,
>>
>> I am a newbie to Spark and am trying to pass a Java DataFrame to PySpark.
>> The following link has details about what I am trying to do:
>>
>>
>> https://stackoverflow.com/questions/66797382/creating-pysparks-spark-context-py4j-java-gateway-object
>>
>> Can someone please help me with this?
>>
>> Thanks,
>>
>
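
For reference, a minimal sketch of the hand-off described above, assuming
both sides do end up sharing one session; the path and the view name are
placeholders:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Option 1: the Java job writes the data out and PySpark reads it back.
    df = spark.read.parquet("/tmp/shared/java_output")

    # Option 2: if both languages share the same session, the Java side can
    # call createOrReplaceTempView("shared_view") and PySpark queries it.
    df2 = spark.sql("SELECT * FROM shared_view")
    df2.show()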
