To be clear, I don't think there is ever a compelling reason to create more
than one SparkContext in a single application. The context is thread-safe
and can launch many jobs in parallel from multiple threads. Even if there
weren't global state that made doing so unsafe, creating more than one
context puts up an artificial barrier that prevents sharing RDDs between
the two.
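
For example, here is a minimal sketch of that pattern (untested; the app
name, master URL, and the RDD work are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    object ParallelJobs {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("parallel-jobs").setMaster("local[4]"))

        // One RDD, visible to every thread in this application.
        val shared = sc.parallelize(1 to 1000000).cache()

        // Submit two actions concurrently from separate threads; both jobs
        // are scheduled by the same, thread-safe SparkContext.
        val t1 = new Thread(new Runnable {
          def run(): Unit = println("sum = " + shared.sum())
        })
        val t2 = new Thread(new Runnable {
          def run(): Unit = println("even count = " + shared.filter(_ % 2 == 0).count())
        })
        t1.start(); t2.start()
        t1.join(); t2.join()

        sc.stop()
      }
    }

With spark.scheduler.mode set to FAIR, concurrent jobs like these can also
share the cluster's resources more evenly instead of running FIFO.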

On Fri, Dec 4, 2015 at 10:47 AM, prateek arora <prateek.arora...@gmail.com>
wrote:

> Thanks ...
>
> Is there any way my second application can run in parallel and wait to
> fetch data from HBase or some other data storage system?
>
> Regards
> Prateek
>
> On Fri, Dec 4, 2015 at 10:24 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> How about using a NoSQL data store such as HBase? :-)
>>
>> On Fri, Dec 4, 2015 at 10:17 AM, prateek arora <
>> prateek.arora...@gmail.com> wrote:
>>
>>> Hi Ted,
>>> Thanks for the information.
>>> Is there any way for two different Spark applications to share their data?
>>>
>>> Regards
>>> Prateek
>>>
>>> On Fri, Dec 4, 2015 at 9:54 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>>
>>>> See Josh's response in this thread:
>>>>
>>>>
>>>> http://search-hadoop.com/m/q3RTt1z1hUw4TiG1&subj=Re+Question+about+yarn+cluster+mode+and+spark+driver+allowMultipleContexts
>>>>
>>>> Cheers
>>>>
>>>> On Fri, Dec 4, 2015 at 9:46 AM, prateek arora <
>>>> prateek.arora...@gmail.com> wrote:
>>>>
>>>>> Hi
>>>>>
>>>>> I want to create multiple SparkContexts in my application.
>>>>> Many of the articles I have read suggest that "usage of multiple
>>>>> contexts is discouraged, since SPARK-2243 is still not resolved."
>>>>> I want to know whether Spark 1.5.0 supports creating multiple contexts
>>>>> without error, and if it does, do we need to set the
>>>>> "spark.driver.allowMultipleContexts" configuration parameter?
>>>>>
>>>>> Regards
>>>>> Prateek
>>>>>
>>>>
>>>
>>
>
