Re: Having multiple spark context

2017-01-30 Thread Rohit Verma
There are two ways:

1. There is experimental support for this; see
https://issues.apache.org/jira/browse/SPARK-2243. I am afraid you might need
to build Spark from source.
2. Use middleware: deploy the two apps separately, and have them communicate
with your application over messaging/REST.
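For the second route, one way to drive each cluster from its own JVM is Spark's `SparkLauncher` (in `org.apache.spark.launcher`). This is only a sketch: the jar path, main-class names, and master URLs below are placeholders, not real values.

```scala
import org.apache.spark.launcher.SparkLauncher

// Launch one application per cluster, each in its own child JVM,
// so each gets its own SparkContext.
val clusterApp = new SparkLauncher()
  .setAppResource("/path/to/your-app.jar")   // placeholder path
  .setMainClass("com.example.ClusterJob")    // placeholder class
  .setMaster("spark://cluster-a:7077")       // placeholder master URL
  .launch()

val localApp = new SparkLauncher()
  .setAppResource("/path/to/your-app.jar")
  .setMainClass("com.example.LocalJob")
  .setMaster("local[*]")
  .launch()

// Wait for both child JVMs to finish.
clusterApp.waitFor()
localApp.waitFor()
```

The parent process then coordinates the two jobs itself, e.g. over the messaging/REST layer mentioned above.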

Regards
Rohit

On Jan 30, 2017, at 2:07 PM, jasbir.s...@accenture.com wrote:

Is there any way in which my application can connect to multiple Spark clusters?
Or is communication between Spark clusters possible?

Regards,
Jasbir




RE: Having multiple spark context

2017-01-30 Thread jasbir.sing
Is there any way in which my application can connect to multiple Spark clusters?
Or is communication between Spark clusters possible?

Regards,
Jasbir






Re: Having multiple spark context

2017-01-30 Thread Mich Talebzadeh
In general, in a single JVM (which is essentially what local mode runs in), you
have only one SparkContext. However, you can stop the current SparkContext with

sc.stop()

HTH
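Concretely, this means that within one JVM you can target different masters only sequentially: stop the current context before creating the next. A minimal sketch, where the standalone cluster URL is a placeholder:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// First context, against a (hypothetical) standalone cluster.
val clusterSc = new SparkContext(
  new SparkConf().setAppName("cluster-phase").setMaster("spark://cluster-a:7077"))
// ... run the distributed work ...
clusterSc.stop()  // must stop before creating another context

// Second context, local mode, created only after the first is stopped.
val localSc = new SparkContext(
  new SparkConf().setAppName("local-phase").setMaster("local[*]"))
// ... run the local work ...
localSc.stop()
```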

Dr Mich Talebzadeh



LinkedIn
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.





Re: Having multiple spark context

2017-01-29 Thread vincent gromakowski
A clustering library is necessary to manage multiple JVMs: Akka Cluster, for
instance.



Re: Having multiple spark context

2017-01-29 Thread Rohit Verma
Hi,

If I am right, you need to launch the other context from another JVM. If you
try to launch another context from the same JVM, it will return the existing
context.
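This matches the behaviour of `SparkContext.getOrCreate`: a second call in the same JVM hands back the context that already exists instead of building a new one. A small sketch:

```scala
import org.apache.spark.{SparkConf, SparkContext}

val first = SparkContext.getOrCreate(
  new SparkConf().setAppName("first").setMaster("local[2]"))

// This conf is effectively ignored: the existing context is returned.
val second = SparkContext.getOrCreate(
  new SparkConf().setAppName("second").setMaster("local[4]"))

assert(first eq second)  // same instance, not a new context
first.stop()
```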

Rohit




Re: Having multiple spark context

2017-01-29 Thread Mark Hamstra
More than one Spark Context in a single Application is not supported.
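For completeness: Spark 1.x/2.x had a configuration flag, `spark.driver.allowMultipleContexts`, that downgraded the multiple-contexts error to a warning. It only disabled the safety check; running several contexts in one JVM remained unsupported and could fail unpredictably, and the flag was removed in Spark 3.0. A hedged sketch of what it looked like:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Unsupported escape hatch (Spark 1.x/2.x only): suppresses the
// "multiple SparkContexts" exception but gives no correctness guarantees.
val conf = new SparkConf()
  .setAppName("second-context")
  .setMaster("local[*]")
  .set("spark.driver.allowMultipleContexts", "true")

val sc2 = new SparkContext(conf)  // no longer throws, but still unsupported
```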



Having multiple spark context

2017-01-29 Thread jasbir.sing
Hi,

I have a requirement in which my application creates one Spark context in
distributed mode and another Spark context in local mode.
When I create them, my complete application works with only one SparkContext
(the one created in distributed mode); the second Spark context does not get
created.

Can you please help me with how to create two Spark contexts?

Regards,
Jasbir singh



This message is for the designated recipient only and may contain privileged, 
proprietary, or otherwise confidential information. If you have received it in 
error, please notify the sender immediately and delete the original. Any other 
use of the e-mail by you is prohibited. Where allowed by local law, electronic 
communications with Accenture and its affiliates, including e-mail and instant 
messaging (including content), may be scanned by our systems for the purposes 
of information security and assessment of internal compliance with Accenture 
policy.
__

www.accenture.com