Is there any way my application can connect to multiple Spark clusters?
Or is communication between Spark clusters possible?

Regards,
Jasbir

From: Mich Talebzadeh [mailto:mich.talebza...@gmail.com]
Sent: Monday, January 30, 2017 1:33 PM
To: vincent gromakowski <vincent.gromakow...@gmail.com>
Cc: Rohit Verma <rohit.ve...@rokittech.com>; user@spark.apache.org; Sing, 
Jasbir <jasbir.s...@accenture.com>; Mark Hamstra <m...@clearstorydata.com>
Subject: Re: Having multiple spark context

In general, in a single JVM (which is basically what running in local mode
means), you have only one SparkContext. However, you can stop the current
SparkContext with

sc.stop()
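The one-active-context-per-JVM rule Mich describes can be sketched with a minimal Python mock (this is an illustrative stand-in, not the real PySpark API; `FakeContext`, its `master` argument, and the master URLs are assumptions for the sketch): a second context cannot be created until the first is stopped.

```python
# Mock of SparkContext's one-active-context-per-process rule.
# FakeContext is an illustrative stand-in, NOT the real pyspark API.
class FakeContext:
    _active = None  # at most one live context per process (analogous to one per JVM)

    def __init__(self, master):
        if FakeContext._active is not None:
            raise ValueError("Only one context may be active per process")
        self.master = master
        FakeContext._active = self

    def stop(self):
        # Clears the active slot, analogous to sc.stop()
        FakeContext._active = None

sc = FakeContext("local[*]")
try:
    FakeContext("spark://host:7077")  # second context while one is live: rejected
except ValueError as e:
    print(e)
sc.stop()                               # stop the current context first...
sc2 = FakeContext("spark://host:7077")  # ...then a new one can be created
print(sc2.master)
```

The real `SparkContext` enforces the same invariant; the mock only makes the lifecycle visible without needing a Spark installation.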

HTH


Dr Mich Talebzadeh



LinkedIn
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



Disclaimer: Use it at your own risk. Any and all responsibility for any loss, 
damage or destruction of data or any other property which may arise from 
relying on this email's technical content is explicitly disclaimed. The author 
will in no case be liable for any monetary damages arising from such loss, 
damage or destruction.



On 30 January 2017 at 07:54, vincent gromakowski
<vincent.gromakow...@gmail.com> wrote:

A clustering library is necessary to manage multiple JVMs; Akka Cluster, for instance.

On Jan 30, 2017, at 8:01 AM, "Rohit Verma"
<rohit.ve...@rokittech.com> wrote:
Hi,

If I am right, you need to launch the other context from another JVM. If you
try to launch another context from the same JVM, it will return the existing
context.
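The behavior Rohit describes matches a get-or-create pattern; here is a minimal Python sketch of it (a mock, not the real pyspark API; `FakeContext` and `get_or_create` are illustrative names): asking for a context in the same process hands back the existing one rather than creating a second.

```python
# Mock of get-or-create semantics: within one process, a request for a
# context returns the already-active one. FakeContext is NOT the real
# pyspark API, just an illustration of the behavior described above.
class FakeContext:
    _active = None

    def __init__(self, master):
        self.master = master

    @classmethod
    def get_or_create(cls, master):
        # If a context already exists in this process, return it;
        # the requested master is ignored in that case.
        if cls._active is None:
            cls._active = cls(master)
        return cls._active

a = FakeContext.get_or_create("local[2]")
b = FakeContext.get_or_create("spark://other:7077")  # existing context returned
print(a is b)      # the "second" context is the same object
print(b.master)    # still the master of the first context
```

This is why the second context in the same JVM appears to "not get created": the call silently resolves to the context that already exists.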

Rohit
On Jan 30, 2017, at 12:24 PM, Mark Hamstra
<m...@clearstorydata.com> wrote:

More than one Spark Context in a single Application is not supported.

On Sun, Jan 29, 2017 at 9:08 PM,
<jasbir.s...@accenture.com> wrote:
Hi,

I have a requirement in which my application creates one SparkContext in
distributed mode and another SparkContext in local mode.
When I do this, my complete application works on only one SparkContext (the
one created in distributed mode); the second SparkContext is never created.

Can you please help me with how to create two SparkContexts?

Regards,
Jasbir singh

________________________________

This message is for the designated recipient only and may contain privileged, 
proprietary, or otherwise confidential information. If you have received it in 
error, please notify the sender immediately and delete the original. Any other 
use of the e-mail by you is prohibited. Where allowed by local law, electronic 
communications with Accenture and its affiliates, including e-mail and instant 
messaging (including content), may be scanned by our systems for the purposes 
of information security and assessment of internal compliance with Accenture 
policy.
______________________________________________________________________________________

www.accenture.com


