I would not suggest handling this inside beam.io.WriteToPubSub. Instead, you
could add a transform to your pipeline that checks the message size. If it
exceeds 10 MB, you could route the message to another sink, or process the
message to reduce its size before publishing.
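A minimal sketch of that size check, assuming messages arrive as bytes. The
constant, helper name, and the Beam wiring in the comments are illustrative
placeholders, not a tested pipeline:

```python
# Pub/Sub rejects messages larger than 10 MB (the documented limit is
# 10,000,000 bytes for the message payload).
MAX_PUBSUB_BYTES = 10_000_000


def fits_in_pubsub(message: bytes) -> bool:
    """Return True if the encoded message is small enough to publish."""
    return len(message) <= MAX_PUBSUB_BYTES


# Hypothetical Beam wiring (names are placeholders):
#
#   small, large = (
#       messages
#       | beam.Partition(
#           lambda msg, _: 0 if fits_in_pubsub(msg) else 1, 2)
#   )
#   small | beam.io.WriteToPubSub(topic='projects/.../topics/your-topic')
#   large | ...  # e.g. write the payload to GCS and publish a pointer
```

One common pattern for the oversized branch is to write the large payload to
object storage and publish only a small reference message to Pub/Sub.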
On Fri, May 24, 2024 at 3:46 AM Nimrod Shory wrote:
> Hello group,
> I am pretty new to Dataflow and Beam.
> I have deployed a Dataflow streaming job using Beam with Python.
> The final step of my pipeline is publishing a message to Pub/Sub.
> In certain cases the message can become too big for Pub/Sub (larger than
> the allowed 10 MB) and in that case of failure