[jira] [Commented] (BEAM-6743) Triggers not working for bounded data

2020-06-01 Thread Beam JIRA Bot (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-6743?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17122909#comment-17122909
 ] 

Beam JIRA Bot commented on BEAM-6743:
-

This issue is P2 but has been unassigned without any comment for 60 days so it 
has been labeled "stale-P2". If this issue is still affecting you, we care! 
Please comment and remove the label. Otherwise, in 14 days the issue will be 
moved to P3.

Please see https://beam.apache.org/contribute/jira-priorities/ for a detailed 
explanation of what these priorities mean.


> Triggers not working for bounded data
> -
>
> Key: BEAM-6743
> URL: https://issues.apache.org/jira/browse/BEAM-6743
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
> Environment: Apache Beam 2.9.0 Java
> Google Cloud Dataflow Runner
>Reporter: Aditya Guru
>Priority: P2
>  Labels: stale-P2
>
> pCollection
>  .apply(Window.into(FixedWindows.of(Duration.millis(100)))
>  .triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(1000)))
>  .discardingFiredPanes().withAllowedLateness(Duration.ZERO))
>  .apply(TextIO.write().withWindowedWrites().withNumShards(1).to("gs-path"));
> Here pCollection is a *bounded* PCollection. I'm trying to break it into 
> files of roughly 1,000 records each, but all I get is two files: one with 
> 1,000 records and the other with the rest of the data.
> If instead I do:
> pCollection
>  .apply(Window.into(new GlobalWindows())
>  .triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(1000)))
>  .discardingFiredPanes().withAllowedLateness(Duration.ZERO))
>  .apply(TextIO.write().withWindowedWrites().withNumShards(1).to("gs-path"));
> I get just one file. 
> Both of the above cases should have conceptually divided the records into 
> chunks of 1,000 to be written to a file.
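A note on the reported behavior: for bounded input, `AfterPane.elementCountAtLeast(n)` is only a lower bound on when a pane may fire, and a runner is permitted to buffer all input and fire a single pane, which is consistent with what is described above (Beam's `GroupIntoBatches` transform is the usual suggestion for deterministic batching of keyed data). The outcome the reporter expects, records divided into consecutive groups of at most 1,000, can be stated runner-independently. Below is a minimal plain-Java sketch of that batching logic; the `batch` helper is hypothetical and not part of the Beam API:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {
    // Split records into consecutive batches of at most `size` elements,
    // mirroring what the reporter expects: one output file per batch.
    static <T> List<List<T>> batch(List<T> records, int size) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += size) {
            // subList is a view, so copy each slice into its own list.
            batches.add(new ArrayList<>(
                records.subList(i, Math.min(i + size, records.size()))));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 2500; i++) records.add(i);
        List<List<Integer>> batches = batch(records, 1000);
        // 2,500 records -> 3 batches of sizes 1000, 1000, 500
        System.out.println(batches.size());
        System.out.println(batches.get(2).size());
    }
}
```

Under this expectation, the first pipeline above would produce three files for 2,500 records rather than the two observed.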



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-6743) Triggers not working for bounded data

2019-02-25 Thread Aditya Guru (JIRA)


[ 
https://issues.apache.org/jira/browse/BEAM-6743?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16776936#comment-16776936
 ] 

Aditya Guru commented on BEAM-6743:
---

Both of the above cases seem like a bug to me. If not, then my understanding of 
windows and triggers is at fault. Can someone provide some clarification?

> Triggers not working for bounded data
> -
>
> Key: BEAM-6743
> URL: https://issues.apache.org/jira/browse/BEAM-6743
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
> Environment: Apache Beam 2.9.0 Java
> Google Cloud Dataflow Runner
>Reporter: Aditya Guru
>Priority: Major
>
> pCollection
>  .apply(Window.into(FixedWindows.of(Duration.millis(100)))
>  .triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(1000)))
>  .discardingFiredPanes().withAllowedLateness(Duration.ZERO))
>  .apply(TextIO.write().withWindowedWrites().withNumShards(1).to("gs-path"));
> Here pCollection is a *bounded* PCollection. I'm trying to break it into 
> files of roughly 1,000 records each, but all I get is two files: one with 
> 1,000 records and the other with the rest of the data.
> If instead I do:
> pCollection
>  .apply(Window.into(new GlobalWindows())
>  .triggering(Repeatedly.forever(AfterPane.elementCountAtLeast(1000)))
>  .discardingFiredPanes().withAllowedLateness(Duration.ZERO))
>  .apply(TextIO.write().withWindowedWrites().withNumShards(1).to("gs-path"));
> I get just one file. 
> Both of the above cases should have conceptually divided the records into 
> chunks of 1,000 to be written to a file.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)