Do you know the set of PubSub topics when you launch your pipeline?
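If so, you may not need one job per topic: ElasticsearchIO's write transform can route each document individually via withIndexFn. A minimal sketch, assuming a "<topic>-index" naming convention and that each JSON payload carries a "sourceTopic" field (both assumptions for illustration, not part of the template):

```java
// Sketch of per-document index routing. The "<topic>-index" naming
// convention and the "sourceTopic" payload field are assumptions.
public class IndexRouter {
    // Map a full Pub/Sub topic path (or a bare topic name) to an index name.
    public static String indexFor(String topicPath) {
        int slash = topicPath.lastIndexOf('/');
        String topic = slash >= 0 ? topicPath.substring(slash + 1) : topicPath;
        return topic.toLowerCase() + "-index";
    }
}

// Wiring into the pipeline (Beam Java SDK), assuming each document
// carries a "sourceTopic" field:
//
//   pipeline
//       .apply(PubsubIO.readStrings().fromSubscription(subscription))
//       .apply(ElasticsearchIO.write()
//           .withConnectionConfiguration(connectionConfiguration)
//           .withIndexFn(doc ->
//               IndexRouter.indexFor(doc.get("sourceTopic").asText())));
//
// For multiple subscriptions, apply one read per subscription and
// Flatten the resulting PCollections before the single write step.
```

The routing function itself is the only per-topic logic, so adding a topic later means adding a subscription read, not a new Dataflow job.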

On Mon, Dec 28, 2020 at 5:14 AM Arif Alili <[email protected]> wrote:

> Hi all,
>
> I am writing to Elasticsearch using Beam (Google Dataflow). The pipeline
> is ingesting data from PubSub subscription and writing them to
> Elasticsearch index (using this Dataflow Template
> <https://github.com/GoogleCloudPlatform/DataflowTemplates/tree/master/v2/pubsub-to-elasticsearch>).
> This works fine.
>
> What I am trying to do now is listen to multiple PubSub topics and write
> to multiple Elasticsearch indices from one Dataflow job. I want to avoid
> creating multiple Dataflow jobs because the number of
> pubsub-to-elasticsearch topics can grow quickly in the near future.
>
> I am using Beam's ElasticsearchIO to write data from a single PubSub topic
> to Elasticsearch; what I need is to configure ElasticsearchIO to write to
> multiple indices.
>
> Is anyone familiar with similar architecture? What's the best approach for
> this scenario?
>
> Best,
> --
> *Arif Alili*
>
> Pilotenstraat 43 bg
> 1059 CH Amsterdam
> 020 - 6 71 71 71
> [email protected]
> www.propellor.eu
>
