Something to consider: if you're running on Dataflow, the entire Pubsub
read step becomes a no-op [1]; the underlying streaming implementation
itself handles reading from Pub/Sub (either Windmill or Streaming
Engine).

[1]
https://github.com/apache/beam/blob/master/runners/google-cloud-dataflow-java/src/main/java/org/apache/beam/runners/dataflow/DataflowRunner.java#L373

On Wed, Jan 2, 2019 at 12:11 PM Jeff Klukas <jklu...@mozilla.com> wrote:

> I see that the Beam codebase includes a PubsubGrpcClient, but there
> doesn't appear to be any way to configure PubsubIO to use that client over
> the PubsubJsonClient.
>
> There's even a PubsubIO.Read#withClientFactory, but it's marked as for
> testing only.
>
> Is gRPC support something that's still in development? Or am I missing
> something about how to configure this?
>
> I'm particularly interested in using gRPC due to the message size
> inflation of base64 encoding required for JSON transport. My payloads are
> all below the 10 MB Pubsub limit, but I need to support some near the top
> end of that range that are currently causing errors due to base64 inflation.
>
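For context on the size concern in the quoted question: base64 encodes every
3 bytes of binary payload as 4 ASCII characters, roughly 33% inflation, so a
payload much above ~7.5 MB can already exceed the 10 MB limit once it is
JSON-encoded for transport. A quick illustration (plain Java, no Beam
dependency; the 9 MB figure is just an example size, not from the thread):

```java
import java.util.Base64;

public class Base64Inflation {
    public static void main(String[] args) {
        // Hypothetical 9 MB binary payload, under the 10 MB Pub/Sub limit.
        byte[] payload = new byte[9_000_000];

        // Base64 emits 4 output chars for every 3 input bytes.
        int encodedLength = Base64.getEncoder().encode(payload).length;

        System.out.println(encodedLength); // 12,000,000 chars: over the limit
    }
}
```

So a message that fits comfortably under the raw limit can still be rejected
after JSON/base64 encoding, which is the failure mode described above.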
