Re: dynamically change receiver for a spark stream

2015-01-21 Thread Tamas Jambor
Hi Gerard,

thanks, that makes sense. I'll try that out.

Tamas



Re: dynamically change receiver for a spark stream

2015-01-21 Thread Gerard Maas
Hi Tamas,

I meant not changing the receivers, but starting/stopping the Streaming
jobs. So you would have a 'small' Streaming job for a subset of streams
that you'd configure -> start -> stop on demand.
I haven't tried it myself yet, but I think it should also be possible to
create a Streaming job from the Spark Job Server (
https://github.com/spark-jobserver/spark-jobserver). Then you would have a
REST interface that even gives you the possibility of passing a
configuration.

-kr, Gerard.
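
As a rough sketch of that configure -> start -> stop pattern (assuming Spark 1.x; `StreamSubsetConfig` and `buildDStreams` are illustrative names, not part of any Spark API):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical config for one 'small' Streaming job covering a subset of streams.
case class StreamSubsetConfig(appName: String, batchSeconds: Int)

def startSubsetJob(conf: StreamSubsetConfig)
                  (buildDStreams: StreamingContext => Unit): StreamingContext = {
  val sparkConf = new SparkConf().setAppName(conf.appName)
  val ssc = new StreamingContext(sparkConf, Seconds(conf.batchSeconds))
  buildDStreams(ssc) // wire up receivers/outputs for this subset only
  ssc.start()
  ssc                // keep the handle so the job can be stopped on demand
}

// Later, on demand:
// val ssc = startSubsetJob(StreamSubsetConfig("subset-A", 10)) { ssc => /* ... */ }
// ssc.stop(stopSparkContext = true, stopGracefully = true)
```

An orchestrator (Marathon, the Job Server's REST API, etc.) would then just launch and kill these small jobs as demand changes.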



Re: dynamically change receiver for a spark stream

2015-01-21 Thread Tamas Jambor
we were thinking along the same lines, that is, to fix the number of streams
and change the input and output channels dynamically.

But we could not make it work (it seems the receiver does not allow any
change to its configuration after it has started).

thanks,



Re: dynamically change receiver for a spark stream

2015-01-21 Thread Gerard Maas
One possible workaround would be to orchestrate launching/stopping Streaming
jobs on demand, as long as the number of jobs/streams stays within the
bounds of the resources (cores) you have available.
E.g. if you're using Mesos, Marathon offers a REST interface to manage job
lifecycle. You would still need to solve the dynamic configuration through
some alternative channel.



Re: dynamically change receiver for a spark stream

2015-01-21 Thread Tamas Jambor
thanks for the replies.

Is this something we can get around? I tried hacking into the code without
much success.



RE: dynamically change receiver for a spark stream

2015-01-20 Thread Shao, Saisai
Hi,

I don't think current Spark Streaming supports this feature; the whole DStream
lineage is fixed after the context is started.

Stopping an individual stream is also not supported; currently you need to stop
the whole streaming context to achieve what you want.

Thanks
Saisai
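
Since the DStream graph cannot change after start(), the closest workaround is to tear down the StreamingContext and build a new one with the new configuration, keeping the SparkContext alive. A minimal sketch, assuming Spark 1.x (`buildGraph` is a placeholder for whatever wires up your receivers and outputs):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

def restartStreaming(sc: SparkContext, batchSeconds: Int,
                     old: Option[StreamingContext])
                    (buildGraph: StreamingContext => Unit): StreamingContext = {
  // Stop only the streaming layer; keep the shared SparkContext running.
  old.foreach(_.stop(stopSparkContext = false, stopGracefully = true))
  val ssc = new StreamingContext(sc, Seconds(batchSeconds))
  buildGraph(ssc) // the DStream lineage must be fully defined before start()
  ssc.start()
  ssc
}
```

Graceful stop lets in-flight batches finish before the new graph takes over.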

-Original Message-
From: jamborta [mailto:jambo...@gmail.com] 
Sent: Wednesday, January 21, 2015 3:09 AM
To: user@spark.apache.org
Subject: dynamically change receiver for a spark stream

Hi all,

we have been trying to set up a stream using a custom receiver that would pick 
up data from SQL databases. We'd like to keep that streaming context running and 
dynamically change the streams on demand, adding and removing streams as needed. 
Alternatively, if a stream is fixed, is it possible to stop a stream, change the 
config and start it again?

thanks,



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/dynamically-change-receiver-for-a-spark-stream-tp21268.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: dynamically change receiver for a spark stream

2015-01-20 Thread Akhil Das
Can you not do it with RDDs?

Thanks
Best Regards
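
For context, the RDD-based alternative would be to skip receivers entirely and poll the databases on your own schedule with JdbcRDD, so the set of sources can change between iterations. A sketch under that assumption (the JDBC URL, table name, and bounds are placeholders):

```scala
import java.sql.{DriverManager, ResultSet}
import org.apache.spark.SparkContext
import org.apache.spark.rdd.JdbcRDD

// Poll one table once; run this on a timer and change the list of
// tables/queries between runs instead of reconfiguring a receiver.
def pollOnce(sc: SparkContext, url: String, table: String): Array[(Long, String)] = {
  val rdd = new JdbcRDD(
    sc,
    () => DriverManager.getConnection(url),
    // JdbcRDD requires exactly two '?' placeholders, bound per partition.
    s"SELECT id, payload FROM $table WHERE id >= ? AND id <= ?",
    lowerBound = 1L, upperBound = 1000000L, numPartitions = 4,
    mapRow = (rs: ResultSet) => (rs.getLong("id"), rs.getString("payload")))
  rdd.collect()
}
```

This trades Streaming's scheduling and checkpointing for the flexibility of rebuilding the pipeline on every poll.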
