Hello!

I am wondering about using checkpoints with Beam running on Google
Cloud Dataflow.

The docs indicate that checkpoints are not supported by Google Cloud
Dataflow:  
https://beam.apache.org/documentation/runners/capability-matrix/additional-common-features-not-yet-part-of-the-beam-model/

Is there a recommended approach to checkpointing on Google Cloud
Dataflow when using streaming sources such as Kinesis and Kafka, so
that a pipeline can resume from where it left off if it is stopped or
crashes for some reason?

Thanks!
Will Baker
