A SparkContext is thread-safe, so you can just have different threads that
create their own RDDs, run actions on them concurrently, and so on.
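
For illustration, a minimal sketch of that pattern: each Future below runs on
its own thread and submits an independent job against the one shared
SparkContext, and the driver collects all the results at the end. The object
name and the toy count jobs are made up for the example.

import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

object ConcurrentJobsSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("concurrent-jobs"))

    // Three independent jobs, each submitted from its own thread.
    val jobs: Seq[Future[Long]] = (1 to 3).map { i =>
      Future {
        sc.parallelize(1 to 1000000).map(_ * i).filter(_ % 7 == 0).count()
      }
    }

    // Reap all the results once every job has finished.
    val results = Await.result(Future.sequence(jobs), Duration.Inf)
    println(results)

    sc.stop()
  }
}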

- Patrick

On Mon, Dec 22, 2014 at 4:15 PM, Alessandro Baretta
<alexbare...@gmail.com> wrote:
> Andrew,
>
> Thanks, yes, this is what I wanted: basically just to start multiple jobs
> concurrently in threads.
>
> Alex
>
> On Mon, Dec 22, 2014 at 4:04 PM, Andrew Ash <and...@andrewash.com> wrote:
>>
>> Hi Alex,
>>
>> SparkContext.submitJob() is marked as experimental -- most client programs
>> shouldn't be using it.  What are you looking to do?
>>
>> For multiplexing jobs, one thing you can do is have multiple threads in
>> your client JVM each submit jobs on your shared SparkContext.  This is
>> described here in the docs:
>> http://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application
>>
>> Andrew
>>
>> On Mon, Dec 22, 2014 at 1:32 PM, Alessandro Baretta <alexbare...@gmail.com> wrote:
>>
>>> Fellow Sparkers,
>>>
>>> I'm rather puzzled by the submitJob API. I can't quite figure out how it is
>>> supposed to be used. Is there any more documentation about it?
>>>
>>> Also, is there any simpler way to multiplex jobs on the cluster, such as
>>> starting multiple computations in as many threads in the driver and reaping
>>> all the results when they are available?
>>>
>>> Thanks,
>>>
>>> Alex
>>>
>>
>>
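
The "scheduling within an application" page Andrew links to above also covers
the fair scheduler, which lets jobs submitted concurrently from different
threads share the cluster instead of queueing up FIFO. A rough sketch of
enabling it and tagging one thread's jobs with a pool (the pool name here is
made up for illustration):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("fair-scheduling-sketch")
  .set("spark.scheduler.mode", "FAIR")  // default is FIFO

val sc = new SparkContext(conf)

// Jobs submitted from this thread go into the "adhoc" pool.
sc.setLocalProperty("spark.scheduler.pool", "adhoc")
val n = sc.parallelize(1 to 1000).count()

// Clear the property so later jobs from this thread use the default pool.
sc.setLocalProperty("spark.scheduler.pool", null)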

