There are some “hidden” APIs that could potentially address your problem (but 
with a bit of added complexity).

By using the actor receiver, you can tell the supervisor of the actor receiver 
to create another actor receiver for you; the ActorRef of the newly created 
actor will be sent to the caller of the API (in most cases, that's one of the 
existing actor receivers).

The limitation might be that all the receivers end up on the same machine...
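
To make that concrete, here is a rough sketch of how it could look with the 
Spark 1.x actor receiver helpers. ActorHelper and the supervisor's handling of 
Props messages are internal, "hidden" details rather than a stable API, and 
SpawnAnother is a made-up control message, so treat this as illustrative only:

    import akka.actor.{Actor, ActorRef, Props}
    import org.apache.spark.streaming.receiver.ActorHelper

    // Hypothetical control message asking the supervisor to start a sibling receiver.
    case class SpawnAnother(props: Props)

    class DynamicReceiver extends Actor with ActorHelper {
      def receive = {
        case data: String =>
          // Push received data into Spark Streaming.
          store(data)

        case SpawnAnother(props) =>
          // The parent of an actor receiver is its supervisor; sending it a
          // Props message asks it to create another receiver actor, and it
          // replies to the sender with the new actor's ActorRef.
          context.parent ! props

        case newReceiver: ActorRef =>
          // The supervisor's reply: the freshly created receiver.
          logInfo("supervisor started a new receiver at " + newReceiver.path)
      }
    }

The first receiver would be started the usual way, e.g.

    val stream = ssc.actorStream[String](Props[DynamicReceiver], "dynamic-receiver")

This is essentially the capability the PR below tries to expose properly.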


Here is a PR that tries to expose these APIs to the user: 
https://github.com/apache/spark/pull/3984

Best,  

--  
Nan Zhu
http://codingcat.me


On Monday, March 2, 2015 at 10:19 AM, Tamas Jambor wrote:

> Sorry, I meant that once the stream is started, it's not possible to create 
> new streams in the existing streaming context, and it's not possible to 
> create a new streaming context if another one is already running.
> So the only feasible option seemed to be to create a new SparkContext for 
> each stream (we tried using spark-jobserver for that).
>  
>  
> On Mon, Mar 2, 2015 at 3:07 PM, Sean Owen <so...@cloudera.com> wrote:
> > You can make a new StreamingContext on an existing SparkContext, I believe?
> >  
> > On Mon, Mar 2, 2015 at 3:01 PM, Tamas Jambor <jambo...@gmail.com> wrote:
> > > Thanks for the reply.
> > >
> > > Actually, our main problem is not really about SparkContext; the problem
> > > is that Spark does not allow creating streaming contexts dynamically,
> > > and once a stream is shut down, a new one cannot be created in the same
> > > SparkContext. So we cannot create a service that would create and manage
> > > multiple streams - the same way that is possible with batch jobs.
> > >
> > > On Mon, Mar 2, 2015 at 2:52 PM, Sean Owen <so...@cloudera.com> wrote:
> > >>
> > >> I think everything there is to know about it is on JIRA; I don't think
> > >> that's being worked on.
> > >>
> > >> On Mon, Mar 2, 2015 at 2:50 PM, Tamas Jambor <jambo...@gmail.com> wrote:
> > >> > I have seen there is a card (SPARK-2243) to enable that. Is that
> > >> > still going ahead?
> > >> >
> > >> > On Mon, Mar 2, 2015 at 2:46 PM, Sean Owen <so...@cloudera.com> wrote:
> > >> >>
> > >> >> It is still not something you're supposed to do; in fact there is a
> > >> >> setting (disabled by default) that throws an exception if you try to
> > >> >> make multiple contexts.
> > >> >>
> > >> >> On Mon, Mar 2, 2015 at 2:43 PM, jamborta <jambo...@gmail.com> wrote:
> > >> >> > Hi all,
> > >> >> >
> > >> >> > What is the current status and direction on enabling multiple
> > >> >> > SparkContexts and StreamingContexts? I have seen a few issues open
> > >> >> > on JIRA, which seem to have been there for quite a while.
> > >> >> >
> > >> >> > thanks,
> > >> >> >
> > >> >
> > >> >
> > >
> > >
>  
