Hi all,
I would like to know if Apache Beam has functionality similar to 
fieldsGrouping in Storm, which allows sending records to a specific task/worker 
based on a key.
I know that we can achieve that with a Combine/GroupByKey operation, but that 
requires adding windowing to our pipeline, which we don't want.
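
For reference, this is roughly what I mean by the GroupByKey approach (just a 
sketch, assuming a keyed PCollection<KV<String, String>> and an arbitrary 
one-minute fixed window):

import org.apache.beam.sdk.transforms.GroupByKey;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

// On an unbounded input, GroupByKey needs a non-global windowing (or triggers)
// before it can emit the grouped values -- this is the windowing we want to avoid.
static PCollection<KV<String, Iterable<String>>> groupPerKey(
    PCollection<KV<String, String>> keyed) {
  return keyed
      .apply(Window.<KV<String, String>>into(
          FixedWindows.of(Duration.standardMinutes(1))))
      .apply(GroupByKey.<String, String>create());
}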
I have also tried using a stateful transformation. 
I think that in that case windowing would also be required, but I see that a job 
with a stateful ParDo operation cannot be submitted to Google Dataflow even with 
windowing. I don't know whether this is due to a lack of support for stateful 
processing on Dataflow, and whether I can effectively achieve my goal with this solution.
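
This is roughly what I tried with the stateful ParDo (again just a sketch; the 
per-key counter is only an example of keyed state):

import org.apache.beam.sdk.coders.VarLongCoder;
import org.apache.beam.sdk.state.StateSpec;
import org.apache.beam.sdk.state.StateSpecs;
import org.apache.beam.sdk.state.ValueState;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;

// State is scoped per key (and window), so all elements with the same key are
// processed against the same state cell, similar to what fieldsGrouping gives us.
class CountPerKeyFn extends DoFn<KV<String, String>, KV<String, Long>> {

  @StateId("count")
  private final StateSpec<ValueState<Long>> countSpec =
      StateSpecs.value(VarLongCoder.of());

  @ProcessElement
  public void processElement(ProcessContext c,
                             @StateId("count") ValueState<Long> count) {
    Long current = count.read();                // null for the first element of a key
    long updated = (current == null ? 0L : current) + 1;
    count.write(updated);
    c.output(KV.of(c.element().getKey(), updated));
  }
}

I apply it with ParDo.of(new CountPerKeyFn()) on the keyed PCollection.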


Thanks in advance for your help
