Thanks for the reply.

Actually, our main problem is not really about SparkContext. The problem is
that Spark does not allow creating streaming contexts dynamically, and once
a stream is shut down, a new one cannot be created in the same
SparkContext. So we cannot build a service that creates and manages
multiple streams, the way that is possible with batch jobs.
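
For reference, here is a minimal sketch of the pattern we are trying to
run (against the Spark 1.x streaming API; the app name, socket sources,
ports and batch interval are only placeholders):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val sc = new SparkContext(
      new SparkConf().setAppName("stream-service").setMaster("local[2]"))

    // First stream: create it, run it, then stop it while keeping the
    // underlying SparkContext alive.
    val ssc1 = new StreamingContext(sc, Seconds(1))
    ssc1.socketTextStream("localhost", 9999).print()
    ssc1.start()
    Thread.sleep(5000)
    ssc1.stop(stopSparkContext = false)

    // Second stream on the same SparkContext: this is the step that fails
    // for us once the first streaming context has been shut down.
    val ssc2 = new StreamingContext(sc, Seconds(1))
    ssc2.socketTextStream("localhost", 9998).print()
    ssc2.start()
    ssc2.awaitTermination()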

On Mon, Mar 2, 2015 at 2:52 PM, Sean Owen <so...@cloudera.com> wrote:

> I think everything there is to know about it is on JIRA; I don't think
> that's being worked on.
>
> On Mon, Mar 2, 2015 at 2:50 PM, Tamas Jambor <jambo...@gmail.com> wrote:
> > I have seen there is a card (SPARK-2243) to enable that. Is that still
> > going ahead?
> >
> > On Mon, Mar 2, 2015 at 2:46 PM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> It is still not something you're supposed to do; in fact there is a
> >> setting (disabled by default) that throws an exception if you try to
> >> make multiple contexts.
> >>
> >> On Mon, Mar 2, 2015 at 2:43 PM, jamborta <jambo...@gmail.com> wrote:
> >> > hi all,
> >> >
> >> > what is the current status and direction on enabling multiple
> >> > sparkcontexts and streamingcontext? I have seen a few issues open
> >> > on JIRA, which seem to be there for quite a while.
> >> >
> >> > thanks,
