OK, pause/continue does throw up some challenges.

The implication is to pause gracefully and resume from the same point.
First, have a look at this SPIP of mine:

[SPARK-42485] SPIP: Shutting down spark structured streaming when the
streaming process completed current process
<https://issues.apache.org/jira/browse/SPARK-42485>

Then we can assume a graceful pause/restart.

As a suggestion, to implement conditional pausing and resuming, you can
introduce a flag or control signal within your DStream processing logic.
When the condition for pausing is met, the stop() method is called to
temporarily halt message processing. Conversely, when the condition for
resuming is met, the start() method is invoked to re-enable message
consumption.

Let us have a go at it:

is_paused = False

def process_stream(message):
    global is_paused
    if not is_paused:
        # Perform the processing logic here
        print(message)
        # Check for the pausing condition
        if should_pause(message):
            is_paused = True
            stream.stop()
    # Check for the resuming condition
    if should_resume() and is_paused:
        is_paused = False
        stream.start()

# should_pause()/should_resume() are user-defined predicates, and
# DStream(source) stands in for however your stream is created.
stream = DStream(source)
stream.foreach(process_stream)
stream.start()
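Purely to illustrate the flag logic above (a toy stand-in, not the Spark DStream API; the class and predicate names are my own), the pause/resume behaviour can be modelled in plain Python:

```python
class PausableStream:
    """Toy model of a consumer that can be paused and resumed via a
    flag. should_pause/should_resume are user-supplied predicates;
    none of this is Spark API."""

    def __init__(self, should_pause, should_resume):
        self.paused = False
        self.should_pause = should_pause
        self.should_resume = should_resume
        self.processed = []

    def on_message(self, message):
        # Check resume first, so a resume signal takes effect immediately.
        if self.paused and self.should_resume(message):
            self.paused = False
        if not self.paused:
            self.processed.append(message)  # the actual processing work
            if self.should_pause(message):
                self.paused = True          # halt further processing
```

Feeding it ["a", "stop", "b", "go", "c"] with pause-on-"stop" and resume-on-"go" predicates processes everything except "b", which arrives while paused.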

HTH

Mich Talebzadeh,
Distinguished Technologist, Solutions Architect & Engineer
London
United Kingdom


   view my Linkedin profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>


 https://en.everybodywiki.com/Mich_Talebzadeh



*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Fri, 1 Dec 2023 at 12:56, Saurabh Agrawal (180813)
<saurabh.agraw...@amdocs.com.invalid> wrote:

> Hi Spark Team,
>
> I am using Spark version 3.4.0 in my application, which is used to
> consume messages from Kafka topics.
>
> I have below queries:
>
> 1. Does DStream support pausing/resuming message consumption at
> runtime on a particular condition? If yes, please provide details.
>
> 2. I tried to revoke a partition from the consumer at runtime, which caused this error:
>
>
>
> throw new IllegalStateException(s"Previously tracked partitions " +
>   s"${revokedPartitions.mkString("[", ",", "]")} been revoked by Kafka because of consumer " +
>   s"rebalance. This is mostly due to another stream with same group id joined, " +
>   s"please check if there're different streaming application misconfigure to use same " +
>   s"group id. Fundamentally different stream should use different group id")
>
>
>
>
>
> 3. Does Spark support Blue/Green deployment? I need to implement a
> Blue/Green deployment scenario with Spark. I am facing a problem, as I
> need to run both the Blue and the Green deployment with the same
> consumer-group-id. As I read, Spark does not support two deployments
> with the same consumer group id, so this implementation is failing.
> Please guide how this can be implemented with Spark.
>
> 4. Does Spark support Active-Active deployment?
>
>
>
> It would be great if you could reply to the above queries, please.
>
>
>
> --
>
>
> * Regards,*
>
> *Saurabh Agrawal*
>
>
> Software Development Specialist, IPaaS R&D
>
>
>
>
>
> *This email and the information contained herein is proprietary and
> confidential and subject to the Amdocs Email Terms of Service, which you
> may review at* *https://www.amdocs.com/about/email-terms-of-service*
> <https://www.amdocs.com/about/email-terms-of-service>
>
