Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-11 Thread Michael Armbrust
The way that we do this is to have a single context with a server in front that multiplexes jobs that use that shared context. Even if you aren't sharing data, this is going to give you the best fine-grained sharing of the resources that the context is managing.
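A rough sketch of that pattern, assuming a long-running driver that owns the one SparkContext and submits each request from its own thread; the FAIR scheduler setting, pool names, and input paths are illustrative, not something prescribed in this thread:

import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

object SharedContextServer {
  def main(args: Array[String]): Unit = {
    // One long-lived context owned by the server process (master supplied by spark-submit).
    val conf = new SparkConf()
      .setAppName("shared-context-server")
      .set("spark.scheduler.mode", "FAIR") // fine-grained sharing between concurrent jobs
    val sc = new SparkContext(conf)

    // Each incoming request is multiplexed onto the shared context from its own thread.
    def handleRequest(requestId: String, inputPath: String): Future[Long] = Future {
      sc.setLocalProperty("spark.scheduler.pool", requestId) // per-request scheduler pool
      sc.textFile(inputPath).count()
    }

    val a = handleRequest("tenant-a", "hdfs:///data/a") // hypothetical inputs
    val b = handleRequest("tenant-b", "hdfs:///data/b")
    println(Await.result(a, Duration.Inf) + Await.result(b, Duration.Inf))
    sc.stop()
  }
}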

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-11 Thread Mike Wright
Thanks for the insight! ___ *Mike Wright* Principal Architect, Software Engineering S&P Capital IQ and SNL 434-951-7816 *p* 434-244-4466 *f* 540-470-0119 *m* mwri...@snl.com

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-11 Thread Mike Wright
Somewhat related - What's the correct implementation when you have a single cluster supporting multiple jobs that are unrelated and NOT sharing data? I was directed to look into supporting "multiple contexts" via the job server, and it was explained to me that multiple contexts per JVM are not really supported.

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-04 Thread Ted Yu
See Josh's response in this thread: http://search-hadoop.com/m/q3RTt1z1hUw4TiG1=Re+Question+about+yarn+cluster+mode+and+spark+driver+allowMultipleContexts Cheers. On Fri, Dec 4, 2015 at 9:46 AM, prateek arora wrote: > Hi, I want to create multiple sparkContext in
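For reference, the escape hatch discussed in that linked thread is the spark.driver.allowMultipleContexts setting. A minimal sketch of what flipping it looks like in Spark 1.x; it exists mainly for tests, and, as the rest of this thread explains, it is not a recommended pattern:

import org.apache.spark.{SparkConf, SparkContext}

object TwoContexts {
  def main(args: Array[String]): Unit = {
    // local[2] is just for a self-contained illustration.
    val first = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("first-context"))

    // Without this flag, constructing a second context in the same JVM fails fast.
    // Setting it only suppresses the check; it does not make the pattern safe or supported.
    val second = new SparkContext(
      new SparkConf()
        .setMaster("local[2]")
        .setAppName("second-context")
        .set("spark.driver.allowMultipleContexts", "true"))

    second.stop()
    first.stop()
  }
}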

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-04 Thread Michael Armbrust
On Fri, Dec 4, 2015 at 11:24 AM, Anfernee Xu wrote: > If multiple users are looking at the same data set, then it's a good choice to share the SparkContext. But my use cases are different; users are looking at different data (I use a custom Hadoop InputFormat to load

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-04 Thread prateek arora
Hi Ted, Thanks for the information. Is there any way that two different Spark applications can share their data? Regards, Prateek

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-04 Thread prateek arora
Thanks ... Is there any way my second application can run in parallel and wait to fetch data from HBase or any other data storage system? Regards, Prateek

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-04 Thread Ted Yu
How about using a NoSQL data store such as HBase? :-)
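A minimal sketch of that idea, assuming the producer application has already written rows to an HBase table (the table name here is hypothetical, and the HBase quorum comes from hbase-site.xml on the classpath). The consumer runs with its own SparkContext and shares data through HBase rather than through a shared context:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object ConsumerApp {
  def main(args: Array[String]): Unit = {
    // Second application with its own context; the data, not the context, is shared.
    val sc = new SparkContext(new SparkConf().setAppName("consumer-app"))

    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "shared_events") // hypothetical table written by the other app

    val rows = sc.newAPIHadoopRDD(
      hbaseConf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])

    println(s"rows visible to this application: ${rows.count()}")
    sc.stop()
  }
}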

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-04 Thread Michael Armbrust
To be clear, I don't think there is ever a compelling reason to create more than one SparkContext in a single application. The context is thread-safe and can launch many jobs in parallel from multiple threads. Even if there weren't global state that made it unsafe to do so, creating more than one
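A small illustration of that point, with made-up datasets: both jobs are submitted from separate threads and run concurrently against the one context, rather than each getting its own.

import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

object OneContextManyJobs {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("one-context-many-jobs"))

    // SparkContext is thread-safe: these two jobs are launched from different
    // threads and scheduled in parallel on the same context.
    val jobA = Future { sc.parallelize(1 to 1000000).sum() }
    val jobB = Future { sc.parallelize(1 to 1000000).map(_ * 2).count() }

    println(Await.result(jobA, Duration.Inf))
    println(Await.result(jobB, Duration.Inf))
    sc.stop()
  }
}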

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-04 Thread Mark Hamstra
Where it could start to make some sense is if you wanted a single application to be able to work with more than one Spark cluster -- but that's a pretty weird or unusual thing to do, and I'm pretty sure it wouldn't work correctly at present.

Re: is Multiple Spark Contexts is supported in spark 1.5.0 ?

2015-12-04 Thread Anfernee Xu
If multiple users are looking at the same data set, then it's a good choice to share the SparkContext. But my use cases are different: users are looking at different data (I use a custom Hadoop InputFormat to load data from my data source based on the user input), and the data might not have any overlap.
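A rough sketch of serving such per-user loads from one shared context; TextInputFormat stands in for the custom InputFormat described above, and the per-user paths are hypothetical:

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object PerUserLoads {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("per-user-loads"))

    // Each user's request resolves to its own input, so the resulting jobs
    // share the context and its resources but not the data.
    def loadForUser(userId: String) =
      sc.newAPIHadoopFile(
        s"hdfs:///data/users/$userId", // hypothetical per-user location
        classOf[TextInputFormat],
        classOf[LongWritable],
        classOf[Text])

    println(loadForUser("alice").count())
    println(loadForUser("bob").count())
    sc.stop()
  }
}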