Can you explain what you mean by "worker"? While every runner has workers,
of course, workers are not part of the Beam programming model.
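If what you actually need is per-key processing rather than control over
worker placement, a stateful ParDo gives you that without an explicit
windowing step: state is scoped per key and per window, and if you never
apply Window.into(...) that is simply the global window. Here is a minimal
sketch using the Java SDK (the class and state names are illustrative, not
anything from your pipeline):

import org.apache.beam.sdk.coders.VarLongCoder;
import org.apache.beam.sdk.state.StateSpec;
import org.apache.beam.sdk.state.StateSpecs;
import org.apache.beam.sdk.state.ValueState;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;

// Illustrative stateful DoFn: counts elements per key. State is scoped
// to (key, window); with no windowing applied that is the global window,
// so no explicit windowing step is required.
class CountPerKeyFn extends DoFn<KV<String, String>, KV<String, Long>> {

  @StateId("count")
  private final StateSpec<ValueState<Long>> countSpec =
      StateSpecs.value(VarLongCoder.of());

  @ProcessElement
  public void processElement(
      @Element KV<String, String> element,
      @StateId("count") ValueState<Long> countState,
      OutputReceiver<KV<String, Long>> out) {
    Long current = countState.read();  // null on the first element for a key
    long updated = (current == null ? 0L : current) + 1;
    countState.write(updated);
    out.output(KV.of(element.getKey(), updated));
  }
}

The runner still processes a given key serially in one place at a time, but
which worker that is remains a runner scheduling detail rather than
something the model exposes.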

On Thu, May 23, 2019 at 8:13 PM pasquale.bon...@gmail.com <
pasquale.bon...@gmail.com> wrote:

> Hi all,
> I would like to know if Apache Beam has a functionality similar to
> fieldsGrouping in Storm that allows to send records to a specific
> task/worker based on a key.
> I know that we can achieve that with a Combine/GroupByKey operation, but
> that implies adding windowing to our pipeline, which we don't want.
> I have also tried using a stateful transformation.
> I think that in that case we would also need windowing, but I see that a
> job with a stateful ParDo operation can be submitted on Google Dataflow
> without windowing. I don't know whether this is due to a lack of support
> for stateful processing on Dataflow, or whether I can effectively achieve
> my goal with this solution.
>
>
> Thanks in advance for your help
>
>
