Users have run pipelines with hundreds of PCollections.

To my knowledge, the default quota is 50 concurrent Dataflow jobs; it can
be increased by contacting Google Cloud Support.

Also, this seems like a Dataflow-specific question, so feel free to ask
on Stack Overflow with the google-cloud-dataflow tag, or email
[email protected], including the error message and a description of
your pipeline's shape.

On Mon, Oct 30, 2017 at 2:12 AM, Chaim Turkel <[email protected]> wrote:

> Hi,
>   I have a pipeline that has more than 20 collections. It seems that
> Dataflow cannot deploy this pipeline.
> I see that from the code I can create more than one pipeline.
>
> Anyone know what the limit is?
> Also, if I split it, is there a recommended way to do so (the
> collections have different amounts of data)?
>
> chaim
>
