[jira] [Commented] (BEAM-8989) Backwards incompatible change in ParDo.getSideInputs (caught by failure when running Apache Nemo quickstart)

2019-12-18 Thread Kenneth Knowles (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8989?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999734#comment-16999734
 ] 

Kenneth Knowles commented on BEAM-8989:
---

Downgrading to P1/Critical per 
https://beam.apache.org/contribute/jira-priorities/

> Backwards incompatible change in ParDo.getSideInputs (caught by failure when 
> running Apache Nemo quickstart)
> 
>
> Key: BEAM-8989
> URL: https://issues.apache.org/jira/browse/BEAM-8989
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.16.0, 2.17.0
>Reporter: Luke Cwik
>Assignee: Mikhail Gryzykhin
>Priority: Critical
> Fix For: 2.18.0
>
>
> [PR/9275|https://github.com/apache/beam/pull/9275] changed the return type of 
> *ParDo.getSideInputs* from a *List* to a *Map* of *PCollectionView*s, which is a 
> backwards-incompatible change and was erroneously released as part of Beam 2.16.0.
> Running the Apache Nemo Quickstart fails with:
>  
> {code:java}
> Exception in thread "main" java.lang.RuntimeException: Translator private static void org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(org.apache.nemo.compiler.frontend.beam.PipelineTranslationContext,org.apache.beam.sdk.runners.TransformHierarchy$Node,org.apache.beam.sdk.transforms.ParDo$MultiOutput) have failed to translate org.apache.beam.examples.WordCount$ExtractWordsFn@600b9d27
>   at org.apache.nemo.compiler.frontend.beam.PipelineTranslator.translatePrimitive(PipelineTranslator.java:113)
>   at org.apache.nemo.compiler.frontend.beam.PipelineVisitor.visitPrimitiveTransform(PipelineVisitor.java:46)
>   at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:665)
>   at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
>   at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
>   at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
>   at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:317)
>   at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:251)
>   at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:460)
>   at org.apache.nemo.compiler.frontend.beam.NemoRunner.run(NemoRunner.java:80)
>   at org.apache.nemo.compiler.frontend.beam.NemoRunner.run(NemoRunner.java:31)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:315)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
>   at org.apache.beam.examples.WordCount.runWordCount(WordCount.java:185)
>   at org.apache.beam.examples.WordCount.main(WordCount.java:192)
> Caused by: java.lang.reflect.InvocationTargetException
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at org.apache.nemo.compiler.frontend.beam.PipelineTranslator.translatePrimitive(PipelineTranslator.java:109)
>   ... 14 more
> Caused by: java.lang.NoSuchMethodError: org.apache.beam.sdk.transforms.ParDo$MultiOutput.getSideInputs()Ljava/util/List;
>   at org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(PipelineTranslator.java:236)
>   ... 19 more
> {code}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8989) Backwards incompatible change in ParDo.getSideInputs (caught by failure when running Apache Nemo quickstart)

2019-12-18 Thread Kenneth Knowles (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-8989:
--
Fix Version/s: 2.18.0

> Backwards incompatible change in ParDo.getSideInputs (caught by failure when 
> running Apache Nemo quickstart)
> 
>
> Key: BEAM-8989
> URL: https://issues.apache.org/jira/browse/BEAM-8989
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.16.0, 2.17.0, 2.18.0
>Reporter: Luke Cwik
>Assignee: Mikhail Gryzykhin
>Priority: Critical
> Fix For: 2.18.0
>
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (BEAM-8989) Backwards incompatible change in ParDo.getSideInputs (caught by failure when running Apache Nemo quickstart)

2019-12-18 Thread Kenneth Knowles (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-8989:
-

Assignee: Udi Meiri  (was: Mikhail Gryzykhin)

> Backwards incompatible change in ParDo.getSideInputs (caught by failure when 
> running Apache Nemo quickstart)
> 
>
> Key: BEAM-8989
> URL: https://issues.apache.org/jira/browse/BEAM-8989
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.16.0, 2.17.0
>Reporter: Luke Cwik
>Assignee: Udi Meiri
>Priority: Critical
> Fix For: 2.18.0
>
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8989) Backwards incompatible change in ParDo.getSideInputs (caught by failure when running Apache Nemo quickstart)

2019-12-18 Thread Kenneth Knowles (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-8989:
--
Affects Version/s: (was: 2.18.0)

> Backwards incompatible change in ParDo.getSideInputs (caught by failure when 
> running Apache Nemo quickstart)
> 
>
> Key: BEAM-8989
> URL: https://issues.apache.org/jira/browse/BEAM-8989
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.16.0, 2.17.0
>Reporter: Luke Cwik
>Assignee: Mikhail Gryzykhin
>Priority: Critical
> Fix For: 2.18.0
>
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8989) Backwards incompatible change in ParDo.getSideInputs (caught by failure when running Apache Nemo quickstart)

2019-12-18 Thread Kenneth Knowles (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-8989:
--
Status: Open  (was: Triage Needed)

> Backwards incompatible change in ParDo.getSideInputs (caught by failure when 
> running Apache Nemo quickstart)
> 
>
> Key: BEAM-8989
> URL: https://issues.apache.org/jira/browse/BEAM-8989
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.16.0, 2.17.0, 2.18.0
>Reporter: Luke Cwik
>Assignee: Mikhail Gryzykhin
>Priority: Critical
> Fix For: 2.18.0
>
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8989) Backwards incompatible change in ParDo.getSideInputs (caught by failure when running Apache Nemo quickstart)

2019-12-18 Thread Kenneth Knowles (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-8989:
--
Priority: Critical  (was: Blocker)

> Backwards incompatible change in ParDo.getSideInputs (caught by failure when 
> running Apache Nemo quickstart)
> 
>
> Key: BEAM-8989
> URL: https://issues.apache.org/jira/browse/BEAM-8989
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.16.0, 2.17.0, 2.18.0
>Reporter: Luke Cwik
>Assignee: Mikhail Gryzykhin
>Priority: Critical
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8944) Python SDK harness performance degradation with UnboundedThreadPoolExecutor

2019-12-18 Thread Udi Meiri (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8944?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Udi Meiri updated BEAM-8944:

Fix Version/s: 2.18.0

> Python SDK harness performance degradation with UnboundedThreadPoolExecutor
> ---
>
> Key: BEAM-8944
> URL: https://issues.apache.org/jira/browse/BEAM-8944
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-harness
>Affects Versions: 2.18.0
>Reporter: Yichi Zhang
>Assignee: Yichi Zhang
>Priority: Blocker
> Fix For: 2.18.0
>
> Attachments: profiling.png, profiling_one_thread.png, 
> profiling_twelve_threads.png
>
>  Time Spent: 2h 40m
>  Remaining Estimate: 0h
>
> We are seeing a performance degradation in the Python streaming word count load 
> tests.
>  
> After some investigation, it appears to be caused by swapping the original 
> ThreadPoolExecutor for the UnboundedThreadPoolExecutor in the SDK worker. The 
> suspicion is that Python performance is worse with more threads on CPU-bound tasks.
>  
> A simple test comparing the performance of the different thread pool executors:
>  
> {code:python}
> def test_performance(self):
>   def run_perf(executor):
>     total_number = 100
>     q = queue.Queue()
> 
>     def task(number):
>       hash(number)
>       q.put(number + 200)
>       return number
> 
>     t = time.time()
>     count = 0
>     for i in range(200):
>       q.put(i)
> 
>     while count < total_number:
>       executor.submit(task, q.get(block=True))
>       count += 1
>     print('%s uses %s' % (executor, time.time() - t))
> 
>   with UnboundedThreadPoolExecutor() as executor:
>     run_perf(executor)
>   with futures.ThreadPoolExecutor(max_workers=1) as executor:
>     run_perf(executor)
>   with futures.ThreadPoolExecutor(max_workers=12) as executor:
>     run_perf(executor)
> {code}
> Results (in the order the executors are exercised above):
> UnboundedThreadPoolExecutor (object at 0x7fab400dbe50) uses 268.160675049
> ThreadPoolExecutor(max_workers=1) uses 79.904583931
> ThreadPoolExecutor(max_workers=12) uses 191.179054976
> Profiling:
> UnboundedThreadPoolExecutor:
>  !profiling.png! 
> 1 Thread ThreadPoolExecutor:
>  !profiling_one_thread.png! 
> 12 Threads ThreadPoolExecutor:
>  !profiling_twelve_threads.png! 
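As a self-contained sketch of the suspected cause (GIL contention on CPU-bound, pure-Python work), the following standalone snippet times the same workload under thread pools of different sizes. It uses only the standard library and hypothetical helper names; it is not part of the Beam SDK or of the test above.

{code:python}
# Hypothetical illustration (not from the issue): time the same CPU-bound,
# pure-Python work under thread pools of different sizes. Because each task
# holds the GIL while it runs, extra threads mostly add contention overhead.
import time
from concurrent import futures


def cpu_task(n):
  total = 0
  for i in range(n):
    total += hash(i)
  return total


def time_pool(max_workers, tasks=64, n=200000):
  with futures.ThreadPoolExecutor(max_workers=max_workers) as executor:
    start = time.time()
    list(executor.map(cpu_task, [n] * tasks))
    return time.time() - start


if __name__ == '__main__':
  for workers in (1, 12):
    print('max_workers=%d took %.2fs' % (workers, time_pool(workers)))
{code}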



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Closed] (BEAM-8991) RuntimeError in log_handler_test

2019-12-18 Thread Valentyn Tymofieiev (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8991?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Valentyn Tymofieiev closed BEAM-8991.
-
Fix Version/s: Not applicable
   Resolution: Duplicate

> RuntimeError in log_handler_test
> 
>
> Key: BEAM-8991
> URL: https://issues.apache.org/jira/browse/BEAM-8991
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Ning Kang
>Priority: Major
> Fix For: Not applicable
>
>

[jira] [Updated] (BEAM-8991) RuntimeError in log_handler_test

2019-12-18 Thread Ning Kang (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8991?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ning Kang updated BEAM-8991:

Summary: RuntimeError in log_handler_test  (was: RuntimeError in 
fn_api_runner_test.py)

> RuntimeError in log_handler_test
> 
>
> Key: BEAM-8991
> URL: https://issues.apache.org/jira/browse/BEAM-8991
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Ning Kang
>Priority: Major
>

[jira] [Updated] (BEAM-8991) RuntimeError in fn_api_runner_test.py

2019-12-18 Thread Ning Kang (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8991?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ning Kang updated BEAM-8991:

Description: 
Now is:
{code:java}
apache_beam.runners.worker.log_handler_test.FnApiLogRecordHandlerTest.test_exc_info
 (from py27-gcp-pytest) - Failing for the past 1 build (Since #1290), took 78 ms.
Error Message: IndexError: list index out of range
Stacktrace:
self = <apache_beam.runners.worker.log_handler_test.FnApiLogRecordHandlerTest testMethod=test_exc_info>

def test_exc_info(self):
  try:
raise ValueError('some message')
  except ValueError:
_LOGGER.error('some error', exc_info=True)

  self.fn_log_handler.close()

> log_entry = 
> self.test_logging_service.log_records_received[0].log_entries[0]
E IndexError: list index out of range

apache_beam/runners/worker/log_handler_test.py:110: IndexError
Standard Error:
ERROR:apache_beam.runners.worker.log_handler_test:some error
Traceback (most recent call last):
  File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Phrase/src/sdks/python/test-suites/tox/py2/build/srcs/sdks/python/apache_beam/runners/worker/log_handler_test.py",
 line 104, in test_exc_info
raise ValueError('some message')
ValueError: some message
{code}
 Marking it as a duplicate of BEAM-8974.

Was:
{code:java}
19:28:06 > Task :sdks:python:test-suites:tox:py35:testPy35Cython
.Exception in thread Thread-1715:
19:28:06 Traceback (most recent call last):
19:28:06   File "apache_beam/runners/common.py", line 879, in 
apache_beam.runners.common.DoFnRunner.process
19:28:06 return self.do_fn_invoker.invoke_process(windowed_value)
19:28:06   File "apache_beam/runners/common.py", line 495, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
19:28:06 windowed_value, self.process_method(windowed_value.value))
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/transforms/core.py",
 line 1434, in 
19:28:06 wrapper = lambda x: [fn(x)]
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner_test.py",
 line 620, in raise_error
19:28:06 raise RuntimeError('x')
19:28:06 RuntimeError: x
19:28:06 
19:28:06 During handling of the above exception, another exception occurred:
19:28:06 
19:28:06 Traceback (most recent call last):
19:28:06   File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
19:28:06 self.run()
19:28:06   File "/usr/lib/python3.5/threading.py", line 862, in run
19:28:06 self._target(*self._args, **self._kwargs)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/local_job_service.py",
 line 270, in _run_job
19:28:06 self._pipeline_proto)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 461, in run_via_runner_api
19:28:06 return self.run_stages(stage_context, stages)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 553, in run_stages
19:28:06 stage_results.process_bundle.monitoring_infos)
19:28:06   File "/usr/lib/python3.5/contextlib.py", line 77, in __exit__
19:28:06 self.gen.throw(type, value, traceback)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 500, in maybe_profile
19:28:06 yield
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 550, in run_stages
19:28:06 stage_context.safe_coders)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 870, in _run_stage
19:28:06 result, splits = bundle_manager.process_bundle(data_input, 
data_output)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 2052, in process_bundle
19:28:06 part, expected_outputs), part_inputs):
19:28:06   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 556, in 
result_iterator
19:28:06 yield future.result()
19:28:06   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 405, in 
result
19:28:06 return

[jira] [Updated] (BEAM-8478) Create WatermarkEstimator for tracking watermark

2019-12-18 Thread Mikhail Gryzykhin (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8478?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mikhail Gryzykhin updated BEAM-8478:

Summary: Create WatermarkEstimator for tracking watermark  (was: Create 
WatermarkEstomator for tracking watermark)

> Create WatermarkEstimator for tracking watermark
> 
>
> Key: BEAM-8478
> URL: https://issues.apache.org/jira/browse/BEAM-8478
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core, sdk-py-harness
>Reporter: Boyuan Zhang
>Assignee: Boyuan Zhang
>Priority: Major
> Fix For: 2.17.0
>
>
> Instead of tracking the watermark in the RestrictionTracker, use a 
> WatermarkEstimator.
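A rough sketch of the idea, with hypothetical names only (this is not the actual Beam SDK interface): the watermark lives in its own estimator object, so the RestrictionTracker deals only with claiming and splitting the restriction.

{code:python}
# Hypothetical sketch only -- not the real apache_beam API. It just shows the
# separation of concerns the issue describes: watermark tracking moves out of
# the RestrictionTracker and into a dedicated estimator.
class WatermarkEstimatorSketch(object):
  def __init__(self, initial_watermark=None):
    self._watermark = initial_watermark

  def observe_timestamp(self, timestamp):
    # Advance the watermark monotonically as output timestamps are observed.
    if self._watermark is None or timestamp > self._watermark:
      self._watermark = timestamp

  def current_watermark(self):
    # The runner reads the watermark here instead of from the RestrictionTracker.
    return self._watermark


estimator = WatermarkEstimatorSketch()
estimator.observe_timestamp(1576713600.0)  # e.g. an event time as epoch seconds
print(estimator.current_watermark())
{code}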



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work logged] (BEAM-5495) PipelineResources algorithm is not working in most environments

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-5495?focusedWorklogId=361403&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361403
 ]

ASF GitHub Bot logged work on BEAM-5495:


Author: ASF GitHub Bot
Created on: 18/Dec/19 23:03
Start Date: 18/Dec/19 23:03
Worklog Time Spent: 10m 
  Work Description: TheNeuralBit commented on pull request #10420: Revert 
"Merge pull request #10268: [BEAM-5495] Adapt PipelineResources to be compatible with Java 11"
URL: https://github.com/apache/beam/pull/10420
 
 
   This reverts commit e4a3594c69210706c00a0e13f11c2e21000a3cea, reversing
   changes made to 15a3eb06e5fbfebf1592a3440fd2c84359e4a4be.
   

[jira] [Updated] (BEAM-8995) apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 PC with: TypeError: the JSON object must be str, not 'bytes'

2019-12-18 Thread Valentyn Tymofieiev (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Valentyn Tymofieiev updated BEAM-8995:
--
Fix Version/s: 2.19.0

> apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 PC with: TypeError: 
> the JSON object must be str, not 'bytes'
> --
>
> Key: BEAM-8995
> URL: https://issues.apache.org/jira/browse/BEAM-8995
> Project: Beam
>  Issue Type: Bug
>  Components: io-py-gcp
>Reporter: Brian Hulette
>Assignee: Kamil Wasilewski
>Priority: Major
> Fix For: 2.19.0
>
>
> Sample failure: https://builds.apache.org/job/beam_PostCommit_Python35/1254/
> (also includes two other failures on the same tests due to BEAM-8988)
> Triggered by https://github.com/apache/beam/pull/9772
> {code}
> Dataflow pipeline failed. State: FAILED, Error:
> Traceback (most recent call last):
>   File 
> "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 
> 649, in do_work
> work_executor.execute()
>   File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", 
> line 176, in execute
> op.start()
>   File "dataflow_worker/native_operations.py", line 38, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File "dataflow_worker/native_operations.py", line 39, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File "dataflow_worker/native_operations.py", line 44, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File "dataflow_worker/native_operations.py", line 48, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File 
> "/usr/local/lib/python3.5/site-packages/apache_beam/io/concat_source.py", 
> line 86, in read
> range_tracker.sub_range_tracker(source_ix)):
>   File "/usr/local/lib/python3.5/site-packages/apache_beam/io/textio.py", 
> line 206, in read_records
> yield self._coder.decode(record)
>   File 
> "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 
> 557, in decode
> value = json.loads(value)
>   File "/usr/local/lib/python3.5/json/__init__.py", line 312, in loads
> s.__class__.__name__))
> TypeError: the JSON object must be str, not 'bytes'
> {code}
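For context, this is a Python-version behavior difference rather than anything Beam-specific: json.loads() accepts only str on Python 3.5, while Python 2.7 and 3.6+ also accept the bytes form. Below is a minimal standalone sketch of the failure mode and the usual workaround (decoding before parsing); the fix actually applied in Beam may well take a different form.

{code:python}
# Standalone repro sketch (not Beam code). On Python 3.5, json.loads(bytes)
# raises TypeError; decoding the bytes to str first works on every version.
import json
import sys

record = b'{"f_int": 1, "f_string": "beam"}'  # bytes, as read from the export

try:
  row = json.loads(record)                   # OK on Python 2.7 and >= 3.6
except TypeError:
  row = json.loads(record.decode('utf-8'))   # explicit decode works everywhere

print(sys.version_info[:2], row)
{code}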



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8995) apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 PC with: TypeError: the JSON object must be str, not 'bytes'

2019-12-18 Thread Valentyn Tymofieiev (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Valentyn Tymofieiev updated BEAM-8995:
--
Status: Open  (was: Triage Needed)

> apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 PC with: TypeError: 
> the JSON object must be str, not 'bytes'
> --
>
> Key: BEAM-8995
> URL: https://issues.apache.org/jira/browse/BEAM-8995
> Project: Beam
>  Issue Type: Bug
>  Components: io-py-gcp
>Reporter: Brian Hulette
>Assignee: Kamil Wasilewski
>Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8973) Python PreCommit occasionally timesout

2019-12-18 Thread Valentyn Tymofieiev (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8973?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999562#comment-16999562
 ] 

Valentyn Tymofieiev commented on BEAM-8973:
---

This could be due to Parquet IO tests running out of GCE throughput quota, as 
[~udim] was investigating.

> Python PreCommit occasionally timesout 
> ---
>
> Key: BEAM-8973
> URL: https://issues.apache.org/jira/browse/BEAM-8973
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core, test-failures
>Reporter: Valentyn Tymofieiev
>Priority: Major
>
> Sample time outs in Cron jobs (~1 out of 10 jobs):
> [https://builds.apache.org/job/beam_PreCommit_Python_Cron/2157/]
> [https://builds.apache.org/job/beam_PreCommit_Python_Cron/2146/]
> In jobs triggered on PRs the error also happened more frequently, example: 
> [https://builds.apache.org/job/beam_PreCommit_Python_Commit/10373/]
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8991) RuntimeError in fn_api_runner_test.py

2019-12-18 Thread Valentyn Tymofieiev (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8991?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999559#comment-16999559
 ] 

Valentyn Tymofieiev commented on BEAM-8991:
---

I think this is a red herring - the test intentionally raises an exception and 
verifies that the stage is included in the stack trace: 

[https://github.com/apache/beam/blob/14411f41938e6987418516610aa2f92b54e24af1/sdks/python/apache_beam/runners/portability/fn_api_runner_test.py#L616]

 

It would be nice to not pollute the logs with these stacktraces since they are 
intentional.
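One generic way to keep such intentional failures out of the console (a sketch only, using standalone stand-in names, and not necessarily how the FnApiRunner tests should be fixed) is to raise the expected error where its log output is captured by the test itself:

{code:python}
# Generic sketch (Python 3 unittest; not the project's actual fix): raise the
# intentional error where assertLogs captures its log output instead of
# letting it reach the console.
import logging
import unittest


def raise_error(value):
  raise RuntimeError('x')


class IntentionalErrorTest(unittest.TestCase):

  def test_error_raised_without_polluting_logs(self):
    # assertLogs temporarily swaps in a capturing handler, so the expected
    # traceback is asserted on rather than written to stderr.
    with self.assertLogs(level=logging.ERROR) as captured:
      with self.assertRaises(RuntimeError):
        try:
          raise_error(1)
        except RuntimeError:
          logging.exception('expected failure for test purposes')
          raise
    self.assertIn('expected failure', captured.output[0])


if __name__ == '__main__':
  unittest.main()
{code}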

> RuntimeError in fn_api_runner_test.py
> -
>
> Key: BEAM-8991
> URL: https://issues.apache.org/jira/browse/BEAM-8991
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Ning Kang
>Priority: Major
>

[jira] [Comment Edited] (BEAM-8695) Beam Dependency Update Request: com.google.http-client:google-http-client

2019-12-18 Thread Tomo Suzuki (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999527#comment-16999527
 ] 

Tomo Suzuki edited comment on BEAM-8695 at 12/18/19 10:15 PM:
--

https://builds.apache.org/job/beam_PreCommit_Java_Commit/9288/#showFailuresLink 
{noformat}
org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapperTest$ParameterizedUnboundedSourceWrapperTest.testWatermarkEmission[numTasks
 = 2; numSplits=2]
org.apache.beam.runners.dataflow.worker.fn.control.ElementCountMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.ElementCountMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.MSecMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMSecMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.MSecMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMSecMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.MeanByteCountMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.MeanByteCountMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.UserDistributionMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidUserMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.UserDistributionMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidUserMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.UserMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidUserMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.UserMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidUserMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixUnknownCoders
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForInstructionOutputNodeWithGrpcNodeSuccessor
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForLengthPrefixCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForSideInputInfos
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixParDoInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixInstructionOutputCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixWriteInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixAndReplaceUnknownCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixAndReplaceForRunnerNetwork
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForInstructionOutputNodeWithGrpcNodePredecessor
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixReadInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixUnknownCoders
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForInstructionOutputNodeWithGrpcNodeSuccessor
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForLengthPrefixCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForSideInputInfos
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixParDoInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixInstructionOutputCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixWriteInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixAndReplaceUnknownCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixAndReplaceForRunnerNetwork
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForInstructionOutputNodeWithGrpcNodePredecessor
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixReadInstructionCoder
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIORea

[jira] [Resolved] (BEAM-7970) Regenerate Go SDK proto files in correct version

2019-12-18 Thread Daniel Oliveira (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-7970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Oliveira resolved BEAM-7970.
---
Resolution: Fixed

I think it's really fixed this time. The Go protos have been regenerated with a 
newer version of the proto compiler and golang/protobuf.

> Regenerate Go SDK proto files in correct version
> 
>
> Key: BEAM-7970
> URL: https://issues.apache.org/jira/browse/BEAM-7970
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Reporter: Daniel Oliveira
>Assignee: Daniel Oliveira
>Priority: Major
> Fix For: Not applicable
>
>  Time Spent: 3h 10m
>  Remaining Estimate: 0h
>
> Generated proto files in the Go SDK currently include this bit:
> {{// This is a compile-time assertion to ensure that this generated file}}
> {{// is compatible with the proto package it is being compiled against.}}
> {{// A compilation error at this line likely means your copy of the}}
> {{// proto package needs to be updated.}}
> {{const _ = proto.ProtoPackageIsVersion2 // please upgrade the proto package}}
>  
> This indicates that the protos are being generated as proto v2 for whatever 
> reason. Most likely, as mentioned in this post by someone with a similar 
> issue, this is because the proto generation binary needs to be rebuilt before 
> generating the files again: 
> [https://github.com/golang/protobuf/issues/449#issuecomment-340884839]
> This hasn't caused any errors so far, but might eventually cause errors if we 
> hit version differences between the v2 and v3 protos.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8695) Beam Dependency Update Request: com.google.http-client:google-http-client

2019-12-18 Thread Tomo Suzuki (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999527#comment-16999527
 ] 

Tomo Suzuki commented on BEAM-8695:
---



https://builds.apache.org/job/beam_PreCommit_Java_Commit/9288/#showFailuresLink 
{noformat}
org.apache.beam.runners.flink.translation.wrappers.streaming.io.UnboundedSourceWrapperTest$ParameterizedUnboundedSourceWrapperTest.testWatermarkEmission[numTasks
 = 2; numSplits=2]
org.apache.beam.runners.dataflow.worker.fn.control.ElementCountMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.ElementCountMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.MSecMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMSecMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.MSecMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMSecMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.MeanByteCountMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.MeanByteCountMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.UserDistributionMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidUserMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.UserDistributionMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidUserMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.UserMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidUserMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.fn.control.UserMonitoringInfoToCounterUpdateTransformerTest.testTransformReturnsValidCounterUpdateWhenValidUserMonitoringInfoReceived
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixUnknownCoders
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForInstructionOutputNodeWithGrpcNodeSuccessor
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForLengthPrefixCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForSideInputInfos
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixParDoInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixInstructionOutputCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixWriteInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixAndReplaceUnknownCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixAndReplaceForRunnerNetwork
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForInstructionOutputNodeWithGrpcNodePredecessor
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixReadInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixUnknownCoders
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForInstructionOutputNodeWithGrpcNodeSuccessor
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForLengthPrefixCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForSideInputInfos
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixParDoInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixInstructionOutputCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixWriteInstructionCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixAndReplaceUnknownCoder
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixAndReplaceForRunnerNetwork
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixForInstructionOutputNodeWithGrpcNodePredecessor
org.apache.beam.runners.dataflow.worker.graph.LengthPrefixUnknownCodersTest.testLengthPrefixReadInstructionCoder
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest.testEstimatedSizeWithoutStreamingBuffer
org.a

[jira] [Assigned] (BEAM-8995) apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 PC with: TypeError: the JSON object must be str, not 'bytes'

2019-12-18 Thread Valentyn Tymofieiev (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Valentyn Tymofieiev reassigned BEAM-8995:
-

Assignee: Kamil Wasilewski

> apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 PC with: TypeError: 
> the JSON object must be str, not 'bytes'
> --
>
> Key: BEAM-8995
> URL: https://issues.apache.org/jira/browse/BEAM-8995
> Project: Beam
>  Issue Type: Bug
>  Components: io-py-gcp
>Reporter: Brian Hulette
>Assignee: Kamil Wasilewski
>Priority: Major
>
> Sample failure: https://builds.apache.org/job/beam_PostCommit_Python35/1254/
> (also includes two other failures on the same tests due to BEAM-8988)
> Triggered by https://github.com/apache/beam/pull/9772
> {code}
> Dataflow pipeline failed. State: FAILED, Error:
> Traceback (most recent call last):
>   File 
> "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 
> 649, in do_work
> work_executor.execute()
>   File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", 
> line 176, in execute
> op.start()
>   File "dataflow_worker/native_operations.py", line 38, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File "dataflow_worker/native_operations.py", line 39, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File "dataflow_worker/native_operations.py", line 44, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File "dataflow_worker/native_operations.py", line 48, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File 
> "/usr/local/lib/python3.5/site-packages/apache_beam/io/concat_source.py", 
> line 86, in read
> range_tracker.sub_range_tracker(source_ix)):
>   File "/usr/local/lib/python3.5/site-packages/apache_beam/io/textio.py", 
> line 206, in read_records
> yield self._coder.decode(record)
>   File 
> "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 
> 557, in decode
> value = json.loads(value)
>   File "/usr/local/lib/python3.5/json/__init__.py", line 312, in loads
> s.__class__.__name__))
> TypeError: the JSON object must be str, not 'bytes'
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8995) apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 PC with: TypeError: the JSON object must be str, not 'bytes'

2019-12-18 Thread Valentyn Tymofieiev (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8995?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999522#comment-16999522
 ] 

Valentyn Tymofieiev commented on BEAM-8995:
---

cc: [~pabloem]

> apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 PC with: TypeError: 
> the JSON object must be str, not 'bytes'
> --
>
> Key: BEAM-8995
> URL: https://issues.apache.org/jira/browse/BEAM-8995
> Project: Beam
>  Issue Type: Bug
>  Components: io-py-gcp
>Reporter: Brian Hulette
>Assignee: Kamil Wasilewski
>Priority: Major
>
> Sample failure: https://builds.apache.org/job/beam_PostCommit_Python35/1254/
> (also includes two other failures on the same tests due to BEAM-8988)
> Triggered by https://github.com/apache/beam/pull/9772
> {code}
> Dataflow pipeline failed. State: FAILED, Error:
> Traceback (most recent call last):
>   File 
> "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 
> 649, in do_work
> work_executor.execute()
>   File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", 
> line 176, in execute
> op.start()
>   File "dataflow_worker/native_operations.py", line 38, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File "dataflow_worker/native_operations.py", line 39, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File "dataflow_worker/native_operations.py", line 44, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File "dataflow_worker/native_operations.py", line 48, in 
> dataflow_worker.native_operations.NativeReadOperation.start
>   File 
> "/usr/local/lib/python3.5/site-packages/apache_beam/io/concat_source.py", 
> line 86, in read
> range_tracker.sub_range_tracker(source_ix)):
>   File "/usr/local/lib/python3.5/site-packages/apache_beam/io/textio.py", 
> line 206, in read_records
> yield self._coder.decode(record)
>   File 
> "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 
> 557, in decode
> value = json.loads(value)
>   File "/usr/local/lib/python3.5/json/__init__.py", line 312, in loads
> s.__class__.__name__))
> TypeError: the JSON object must be str, not 'bytes'
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8974) apache_beam.runners.worker.log_handler_test.FnApiLogRecordHandlerTest.test_exc_info is flaky

2019-12-18 Thread Brian Hulette (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999513#comment-16999513
 ] 

Brian Hulette commented on BEAM-8974:
-

test_context failed in a similar way here: 
https://builds.apache.org/job/beam_PreCommit_Python_Cron/2170/

stacktrace:
{code}
self = 

def test_context(self):
  try:
with statesampler.instruction_id('A'):
  tracker = statesampler.for_test()
  with tracker.scoped_state(NameContext('name', 'tid'), 'stage'):
_LOGGER.info('message a')
with statesampler.instruction_id('B'):
  _LOGGER.info('message b')
_LOGGER.info('message c')

self.fn_log_handler.close()
a, b, c = sum(
[list(logs.log_entries)
>   for logs in self.test_logging_service.log_records_received], [])
E   ValueError: need more than 0 values to unpack

apache_beam/runners/worker/log_handler_test.py:128: ValueError
{code}
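
For reference, the ValueError at the bottom of that trace is just the three-name 
unpack applied to an empty flattened list, i.e. no log records had been received 
by the test logging service at that point. A small standalone Python illustration 
of the mechanics (not Beam code):

{code}
# Flattening an empty list of record batches yields [], so unpacking three
# names from it fails. The wording differs by Python version:
#   Python 2: "need more than 0 values to unpack"
#   Python 3: "not enough values to unpack (expected 3, got 0)"
received = []  # analogous to test_logging_service.log_records_received
entries = sum([list(batch) for batch in received], [])

try:
  a, b, c = entries
except ValueError as e:
  print(e)
{code}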

> apache_beam.runners.worker.log_handler_test.FnApiLogRecordHandlerTest.test_exc_info
>  is flaky
> 
>
> Key: BEAM-8974
> URL: https://issues.apache.org/jira/browse/BEAM-8974
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-harness
>Reporter: Valentyn Tymofieiev
>Assignee: Robert Bradshaw
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The test is failing at apache_beam/runners/worker/log_handler_test.py:110: 
> IndexError
> Added in https://github.com/apache/beam/pull/10292
> Sample job: [https://builds.apache.org/job/beam_PreCommit_Python_Cron/2160/]
> Console logs:
>  {noformat}
> 06:37:37 === FAILURES 
> ===
> 06:37:37 ___ FnApiLogRecordHandlerTest.test_exc_info 
> 
> 06:37:37 [gw1] linux2 -- Python 2.7.12 
> /home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Cron/src/sdks/python/test-suites/tox/py2/build/srcs/sdks/python/target/.tox-py27-gcp-pytest/py27-gcp-pytest/bin/python
> 06:37:37
> 06:37:37 self = <apache_beam.runners.worker.log_handler_test.FnApiLogRecordHandlerTest
>  testMethod=test_exc_info>
> 06:37:37
> 06:37:37 def test_exc_info(self):
> 06:37:37   try:
> 06:37:37 raise ValueError('some message')
> 06:37:37   except ValueError:
> 06:37:37 _LOGGER.error('some error', exc_info=True)
> 06:37:37
> 06:37:37   self.fn_log_handler.close()
> 06:37:37
> 06:37:37 > log_entry = 
> self.test_logging_service.log_records_received[0].log_entries[0]
> 06:37:37 E IndexError: list index out of range
> 06:37:37
> 06:37:37 apache_beam/runners/worker/log_handler_test.py:110: IndexError
> 06:37:37 - Captured stderr call 
> -
> 06:37:37 ERROR:apache_beam.runners.worker.log_handler_test:some error
> 06:37:37 Traceback (most recent call last):
> 06:37:37   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Cron/src/sdks/python/test-suites/tox/py2/build/srcs/sdks/python/apache_beam/runners/worker/log_handler_test.py",
>  line 104, in test_exc_info
> 06:37:37 raise ValueError('some message')
> 06:37:37 ValueError: some message
> 06:37:37 -- Captured log call 
> ---
> 06:37:37 ERROR
> apache_beam.runners.worker.log_handler_test:log_handler_test.py:106 some error
> 06:37:37 Traceback (most recent call last):
> 06:37:37   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Cron/src/sdks/python/test-suites/tox/py2/build/srcs/sdks/python/apache_beam/runners/worker/log_handler_test.py",
>  line 104, in test_exc_info
> 06:37:37 raise ValueError('some message')
> 06:37:37 ValueError: some message
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-3523) Attach Python logging messages to namespaced logger

2019-12-18 Thread Pablo Estrada (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-3523?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999510#comment-16999510
 ] 

Pablo Estrada commented on BEAM-3523:
-

I've started working on this.

> Attach Python logging messages to namespaced logger
> ---
>
> Key: BEAM-3523
> URL: https://issues.apache.org/jira/browse/BEAM-3523
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Alex Milstead
>Priority: Minor
>
> The python SDK currently uses {{logging.(info|error|debug|etc)}} for log 
> messages. This can be disruptive or unexpected when integrating the SDK into 
> existing applications.
> I would like to request updating the SDK to enforce automatic module-based 
> namespaces in python code (i.e. {{logger = logging.getLogger(__name__)}}) so 
> that all {{apache_beam}} output can be controlled by an integrating 
> application without the need to modify the root logging configuration.
> I'd be happy to submit a PR for this myself.
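
As a rough illustration of what is being requested (the module name below is 
hypothetical, not actual Beam code): each SDK module would create a logger 
namespaced by its module path, and an integrating application could then tune or 
silence all apache_beam output without touching the root logger.

{code}
import logging

# Inside an SDK module, e.g. apache_beam/some_module.py (illustrative name);
# when imported under that path, __name__ is "apache_beam.some_module".
_LOGGER = logging.getLogger(__name__)


def do_work():
  _LOGGER.debug('starting work')


# In an integrating application: control everything under the "apache_beam"
# namespace without reconfiguring the root logger.
logging.getLogger('apache_beam').setLevel(logging.WARNING)
logging.getLogger('apache_beam').addHandler(logging.NullHandler())
{code}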



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (BEAM-8995) apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 PC with: TypeError: the JSON object must be str, not 'bytes'

2019-12-18 Thread Brian Hulette (Jira)
Brian Hulette created BEAM-8995:
---

 Summary: apache_beam.io.gcp.bigquery_read_it_test failing on Py3.5 
PC with: TypeError: the JSON object must be str, not 'bytes'
 Key: BEAM-8995
 URL: https://issues.apache.org/jira/browse/BEAM-8995
 Project: Beam
  Issue Type: Bug
  Components: io-py-gcp
Reporter: Brian Hulette


Sample failure: https://builds.apache.org/job/beam_PostCommit_Python35/1254/
(also includes two other failures on the same tests due to BEAM-8988)

Triggered by https://github.com/apache/beam/pull/9772

{code}
Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", 
line 649, in do_work
work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", 
line 176, in execute
op.start()
  File "dataflow_worker/native_operations.py", line 38, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 48, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File 
"/usr/local/lib/python3.5/site-packages/apache_beam/io/concat_source.py", line 
86, in read
range_tracker.sub_range_tracker(source_ix)):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/textio.py", line 
206, in read_records
yield self._coder.decode(record)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", 
line 557, in decode
value = json.loads(value)
  File "/usr/local/lib/python3.5/json/__init__.py", line 312, in loads
s.__class__.__name__))
TypeError: the JSON object must be str, not 'bytes'
{code}
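
For context, json.loads only accepts str on Python 3.5 (bytes support was added 
in 3.6), so a coder handed the exported row as bytes has to decode it before 
parsing. A minimal sketch of that kind of guard (illustrative only, not the 
actual patch):

{code}
import json


def decode_record(value):
  # json.loads(bytes) only works on Python 3.6+; decode the exported JSON
  # row explicitly so the same code also runs on 3.5.
  if isinstance(value, bytes):
    value = value.decode('utf-8')
  return json.loads(value)


print(decode_record(b'{"word": "beam", "count": 3}'))
{code}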



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work started] (BEAM-8933) BigQuery IO should support read/write in Arrow format

2019-12-18 Thread Kirill Kozlov (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8933?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on BEAM-8933 started by Kirill Kozlov.
---
> BigQuery IO should support read/write in Arrow format
> -
>
> Key: BEAM-8933
> URL: https://issues.apache.org/jira/browse/BEAM-8933
> Project: Beam
>  Issue Type: Improvement
>  Components: io-java-gcp
>Reporter: Kirill Kozlov
>Assignee: Kirill Kozlov
>Priority: Major
>  Time Spent: 3h 40m
>  Remaining Estimate: 0h
>
> As of right now BigQueryIO uses the Avro format for reading and writing.
> We should add a config to BigQueryIO to specify which format to use: Arrow or 
> Avro (with Avro as the default).



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (BEAM-8994) add goVet to goPreCommit

2019-12-18 Thread Robert Burke (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8994?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Burke reassigned BEAM-8994:
--

Assignee: (was: Robert Burke)

> add goVet to goPreCommit
> 
>
> Key: BEAM-8994
> URL: https://issues.apache.org/jira/browse/BEAM-8994
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Reporter: Udi Meiri
>Priority: Major
>
> Running all "goVet" tasks in goPreCommit will catch missing dependencies in 
> the gogradle lock file.
> For example: https://issues.apache.org/jira/browse/BEAM-8992



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (BEAM-8695) Beam Dependency Update Request: com.google.http-client:google-http-client

2019-12-18 Thread Tomo Suzuki (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999450#comment-16999450
 ] 

Tomo Suzuki edited comment on BEAM-8695 at 12/18/19 8:11 PM:
-

{noformat}
> Task :sdks:java:io:google-cloud-platform:test

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > publishOneMessage 
FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:192

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > pullOneMessage FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:142

org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithoutStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:630


org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:664

org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtilTest > testInsertAll FAILED
java.lang.AssertionError at BigQueryUtilTest.java:212
{noformat}


PubsubGrpcClientTest failed:

{noformat}
tried to access field io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY 
from class io.grpc.internal.CensusTracingModule$TracingClientInterceptor
java.lang.IllegalAccessError: tried to access field 
io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY from class 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor
at 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor.interceptCall(CensusTracingModule.java:384)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.CensusStatsModule$StatsClientInterceptor.interceptCall(CensusStatsModule.java:685)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.ManagedChannelImpl.newCall(ManagedChannelImpl.java:766)
at 
io.grpc.internal.ForwardingManagedChannel.newCall(ForwardingManagedChannel.java:63)
...
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
{noformat}

That was a discrepancy between grpc and opencensus.


{noformat}
BigQueryIOReadTest.testEstimatedSizeWithoutStreamingBuffer
...
assertEquals(118, bqSource.getEstimatedSizeBytes(options)); // 276
{noformat}

It seems the message byte count has changed due to the dependency upgrade.



was (Author: suztomo):
{noformat}
> Task :sdks:java:io:google-cloud-platform:test

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > publishOneMessage 
FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:192

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > pullOneMessage FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:142

org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithoutStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:630


org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:664

org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtilTest > testInsertAll FAILED
java.lang.AssertionError at BigQueryUtilTest.java:212
{noformat}


PubsubGrpcClientTest failed:

{noformat}
tried to access field io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY 
from class io.grpc.internal.CensusTracingModule$TracingClientInterceptor
java.lang.IllegalAccessError: tried to access field 
io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY from class 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor
at 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor.interceptCall(CensusTracingModule.java:384)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.CensusStatsModule$StatsClientInterceptor.interceptCall(CensusStatsModule.java:685)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.ManagedChannelImpl.newCall(ManagedChannelImpl.java:766)
at 
io.grpc.internal.ForwardingManagedChannel.newCall(ForwardingManagedChannel.java:63)
...
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
{noformat}

That was a discrepancy between grpc and opencensus

[jira] [Work logged] (BEAM-8695) Beam Dependency Update Request: com.google.http-client:google-http-client

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8695?focusedWorklogId=361395&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361395
 ]

ASF GitHub Bot logged work on BEAM-8695:


Author: ASF GitHub Bot
Created on: 18/Dec/19 20:06
Start Date: 18/Dec/19 20:06
Worklog Time Spent: 10m 
  Work Description: suztomo commented on pull request #10414: 
[BEAM-8695][Not for merge yet] sdks/java: google-http-client 1.34.0
URL: https://github.com/apache/beam/pull/10414
 
 
   https://issues.apache.org/jira/browse/BEAM-8695.
   
   Upgrading google-http-client version to the latest 1.34.0.
   
   
   
   Thank you for your contribution! Follow this checklist to help us 
incorporate your contribution quickly and easily:
   
- [ ] [**Choose 
reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and 
mention them in a comment (`R: @username`).
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue, if applicable. This will automatically link the pull request to the 
issue.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more 
tips on [how to make review process 
smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   Post-Commit Tests Status (on master branch)
   

   

[jira] [Comment Edited] (BEAM-8695) Beam Dependency Update Request: com.google.http-client:google-http-client

2019-12-18 Thread Tomo Suzuki (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999450#comment-16999450
 ] 

Tomo Suzuki edited comment on BEAM-8695 at 12/18/19 7:57 PM:
-

{noformat}
> Task :sdks:java:io:google-cloud-platform:test

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > publishOneMessage 
FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:192

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > pullOneMessage FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:142

org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithoutStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:630


org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:664

org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtilTest > testInsertAll FAILED
java.lang.AssertionError at BigQueryUtilTest.java:212
{noformat}


PubsubGrpcClientTest failed:

{noformat}
tried to access field io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY 
from class io.grpc.internal.CensusTracingModule$TracingClientInterceptor
java.lang.IllegalAccessError: tried to access field 
io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY from class 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor
at 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor.interceptCall(CensusTracingModule.java:384)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.CensusStatsModule$StatsClientInterceptor.interceptCall(CensusStatsModule.java:685)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.ManagedChannelImpl.newCall(ManagedChannelImpl.java:766)
at 
io.grpc.internal.ForwardingManagedChannel.newCall(ForwardingManagedChannel.java:63)
...
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
{noformat}

That was a discrepancy between grpc and opencensus.


{noformat}
BigQueryIOReadTest.testEstimatedSizeWithoutStreamingBuffer
...
assertEquals(118, bqSource.getEstimatedSizeBytes(options)); // 276
{noformat}




was (Author: suztomo):

{noformat}
> Task :sdks:java:io:google-cloud-platform:test

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > publishOneMessage 
FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:192

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > pullOneMessage FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:142

org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithoutStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:630


org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:664

org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtilTest > testInsertAll FAILED
java.lang.AssertionError at BigQueryUtilTest.java:212
{noformat}


PubsubGrpcClientTest failed:

{noformat}
tried to access field io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY 
from class io.grpc.internal.CensusTracingModule$TracingClientInterceptor
java.lang.IllegalAccessError: tried to access field 
io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY from class 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor
at 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor.interceptCall(CensusTracingModule.java:384)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.CensusStatsModule$StatsClientInterceptor.interceptCall(CensusStatsModule.java:685)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.ManagedChannelImpl.newCall(ManagedChannelImpl.java:766)
at 
io.grpc.internal.ForwardingManagedChannel.newCall(ForwardingManagedChannel.java:63)
...
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
{noformat}




> Beam Dependency Update Request: com.google.http-client:google-http-client
> -

[jira] [Created] (BEAM-8994) add goVet to goPreCommit

2019-12-18 Thread Udi Meiri (Jira)
Udi Meiri created BEAM-8994:
---

 Summary: add goVet to goPreCommit
 Key: BEAM-8994
 URL: https://issues.apache.org/jira/browse/BEAM-8994
 Project: Beam
  Issue Type: Bug
  Components: sdk-go
Reporter: Udi Meiri
Assignee: Robert Burke


Running all "goVet" tasks in goPreCommit will catch missing dependencies in the 
gogradle lock file.

For example: https://issues.apache.org/jira/browse/BEAM-8992



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work logged] (BEAM-8993) [SQL] MongoDb should use predicate push-down

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8993?focusedWorklogId=361392&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361392
 ]

ASF GitHub Bot logged work on BEAM-8993:


Author: ASF GitHub Bot
Created on: 18/Dec/19 19:35
Start Date: 18/Dec/19 19:35
Worklog Time Spent: 10m 
  Work Description: 11moon11 commented on pull request #10417: [BEAM-8993] 
[SQL] MongoDB predicate push down.
URL: https://github.com/apache/beam/pull/10417
 
 
   - Add a `MongoDbFilter` class, implementing `BeamSqlTableFilter`.
 - Support simple comparison operations.
 - Support boolean field.
 - Support nested conjunction/disjunction.
   - Update `MongoDbTable#buildIOReader`
 - Construct a push-down filter from `RexNodes`.
 - Set filter to `FindQuery`.
   
   
   
   Thank you for your contribution! Follow this checklist to help us 
incorporate your contribution quickly and easily:
   
- [ ] [**Choose 
reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and 
mention them in a comment (`R: @username`).
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue, if applicable. This will automatically link the pull request to the 
issue.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more 
tips on [how to make review process 
smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   Post-Commit Tests Status (on master branch)
   

   

[jira] [Commented] (BEAM-8695) Beam Dependency Update Request: com.google.http-client:google-http-client

2019-12-18 Thread Tomo Suzuki (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999450#comment-16999450
 ] 

Tomo Suzuki commented on BEAM-8695:
---



{noformat}
> Task :sdks:java:io:google-cloud-platform:test

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > publishOneMessage 
FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:192

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > pullOneMessage FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:142

org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithoutStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:630


org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:664

org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtilTest > testInsertAll FAILED
java.lang.AssertionError at BigQueryUtilTest.java:212
{noformat}


{noformat}
tried to access field io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY 
from class io.grpc.internal.CensusTracingModule$TracingClientInterceptor
java.lang.IllegalAccessError: tried to access field 
io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY from class 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor
at 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor.interceptCall(CensusTracingModule.java:384)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.CensusStatsModule$StatsClientInterceptor.interceptCall(CensusStatsModule.java:685)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.ManagedChannelImpl.newCall(ManagedChannelImpl.java:766)
at 
io.grpc.internal.ForwardingManagedChannel.newCall(ForwardingManagedChannel.java:63)
...
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
{noformat}




> Beam Dependency Update Request: com.google.http-client:google-http-client
> -
>
> Key: BEAM-8695
> URL: https://issues.apache.org/jira/browse/BEAM-8695
> Project: Beam
>  Issue Type: Sub-task
>  Components: dependencies
>Reporter: Beam JIRA Bot
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
>  - 2019-11-15 19:40:13.570557 
> -
> Please consider upgrading the dependency 
> com.google.http-client:google-http-client. 
> The current version is 1.28.0. The latest version is 1.33.0 
> cc: 
>  Please refer to [Beam Dependency Guide 
> |https://beam.apache.org/contribute/dependencies/]for more information. 
> Do Not Modify The Description Above. 
>  - 2019-11-19 21:06:20.477284 
> -
> Please consider upgrading the dependency 
> com.google.http-client:google-http-client. 
> The current version is 1.28.0. The latest version is 1.33.0 
> cc: 
>  Please refer to [Beam Dependency Guide 
> |https://beam.apache.org/contribute/dependencies/]for more information. 
> Do Not Modify The Description Above. 
>  - 2019-12-02 12:12:12.146269 
> -
> Please consider upgrading the dependency 
> com.google.http-client:google-http-client. 
> The current version is 1.28.0. The latest version is 1.33.0 
> cc: 
>  Please refer to [Beam Dependency Guide 
> |https://beam.apache.org/contribute/dependencies/]for more information. 
> Do Not Modify The Description Above. 
>  - 2019-12-09 12:11:24.693912 
> -
> Please consider upgrading the dependency 
> com.google.http-client:google-http-client. 
> The current version is 1.28.0. The latest version is 1.33.0 
> cc: 
>  Please refer to [Beam Dependency Guide 
> |https://beam.apache.org/contribute/dependencies/]for more information. 
> Do Not Modify The Description Above. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (BEAM-8695) Beam Dependency Update Request: com.google.http-client:google-http-client

2019-12-18 Thread Tomo Suzuki (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999450#comment-16999450
 ] 

Tomo Suzuki edited comment on BEAM-8695 at 12/18/19 7:30 PM:
-


{noformat}
> Task :sdks:java:io:google-cloud-platform:test

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > publishOneMessage 
FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:192

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > pullOneMessage FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:142

org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithoutStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:630


org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:664

org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtilTest > testInsertAll FAILED
java.lang.AssertionError at BigQueryUtilTest.java:212
{noformat}


PubsubGrpcClientTest failed:

{noformat}
tried to access field io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY 
from class io.grpc.internal.CensusTracingModule$TracingClientInterceptor
java.lang.IllegalAccessError: tried to access field 
io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY from class 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor
at 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor.interceptCall(CensusTracingModule.java:384)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.CensusStatsModule$StatsClientInterceptor.interceptCall(CensusStatsModule.java:685)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.ManagedChannelImpl.newCall(ManagedChannelImpl.java:766)
at 
io.grpc.internal.ForwardingManagedChannel.newCall(ForwardingManagedChannel.java:63)
...
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
{noformat}





was (Author: suztomo):


{noformat}
> Task :sdks:java:io:google-cloud-platform:test

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > publishOneMessage 
FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:192

org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClientTest > pullOneMessage FAILED
java.lang.IllegalAccessError at PubsubGrpcClientTest.java:142

org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithoutStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:630


org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadTest > 
testEstimatedSizeWithStreamingBuffer FAILED
java.lang.AssertionError at BigQueryIOReadTest.java:664

org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtilTest > testInsertAll FAILED
java.lang.AssertionError at BigQueryUtilTest.java:212
{noformat}


{noformat}
tried to access field io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY 
from class io.grpc.internal.CensusTracingModule$TracingClientInterceptor
java.lang.IllegalAccessError: tried to access field 
io.opencensus.trace.unsafe.ContextUtils.CONTEXT_SPAN_KEY from class 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor
at 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor.interceptCall(CensusTracingModule.java:384)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.CensusStatsModule$StatsClientInterceptor.interceptCall(CensusStatsModule.java:685)
at 
io.grpc.ClientInterceptors$InterceptorChannel.newCall(ClientInterceptors.java:156)
at 
io.grpc.internal.ManagedChannelImpl.newCall(ManagedChannelImpl.java:766)
at 
io.grpc.internal.ForwardingManagedChannel.newCall(ForwardingManagedChannel.java:63)
...
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
{noformat}




> Beam Dependency Update Request: com.google.http-client:google-http-client
> -
>
> Key: BEAM-8695
> URL: https://issues.apache.org/jira/browse/BEAM-8695
> Project: Beam
>  Issue Type: Sub-task
>  Components: depend

[jira] [Updated] (BEAM-8993) [SQL] MongoDb should use predicate push-down

2019-12-18 Thread Kirill Kozlov (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kirill Kozlov updated BEAM-8993:

Description: 
* Add a MongoDbFilter class, implementing BeamSqlTableFilter.
 ** Support simple comparison operations.
 ** Support boolean field.
 ** Support nested conjunction/disjunction.

 * Update MongoDbTable#buildIOReader
 ** Construct a push-down filter from RexNodes.
 ** Set filter to FindQuery.

  was:
* Add a MongoDbFilter class, implementing BeamSqlTableFilter.

 ** Support simple comparison operations.
 ** Support boolean field.
 ** Support nested conjunction/disjunction.
 * Update MongoDbTable#buildIOReader
 ** Construct a push-down filter from RexNodes.
 ** Set filter to FindQuery.


> [SQL] MongoDb should use predicate push-down
> 
>
> Key: BEAM-8993
> URL: https://issues.apache.org/jira/browse/BEAM-8993
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Kirill Kozlov
>Assignee: Kirill Kozlov
>Priority: Major
>
> * Add a MongoDbFilter class, implementing BeamSqlTableFilter.
>  ** Support simple comparison operations.
>  ** Support boolean field.
>  ** Support nested conjunction/disjunction.
>  * Update MongoDbTable#buildIOReader
>  ** Construct a push-down filter from RexNodes.
>  ** Set filter to FindQuery.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8993) [SQL] MongoDb should use predicate push-down

2019-12-18 Thread Kirill Kozlov (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kirill Kozlov updated BEAM-8993:

Status: Open  (was: Triage Needed)

> [SQL] MongoDb should use predicate push-down
> 
>
> Key: BEAM-8993
> URL: https://issues.apache.org/jira/browse/BEAM-8993
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Kirill Kozlov
>Assignee: Kirill Kozlov
>Priority: Major
>
> * Add a MongoDbFilter class, implementing BeamSqlTableFilter.
>  ** Support simple comparison operations.
>  ** Support boolean field.
>  ** Support nested conjunction/disjunction.
>  * Update MongoDbTable#buildIOReader
>  ** Construct a push-down filter from RexNodes.
>  ** Set filter to FindQuery.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (BEAM-8993) [SQL] MongoDb should use predicate push-down

2019-12-18 Thread Kirill Kozlov (Jira)
Kirill Kozlov created BEAM-8993:
---

 Summary: [SQL] MongoDb should use predicate push-down
 Key: BEAM-8993
 URL: https://issues.apache.org/jira/browse/BEAM-8993
 Project: Beam
  Issue Type: Improvement
  Components: dsl-sql
Reporter: Kirill Kozlov
Assignee: Kirill Kozlov


* Add a MongoDbFilter class, implementing BeamSqlTableFilter.

 ** Support simple comparison operations.
 ** Support boolean field.
 ** Support nested conjunction/disjunction.
 * Update MongoDbTable#buildIOReader
 ** Construct a push-down filter from RexNodes.
 ** Set filter to FindQuery.
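
As a rough illustration of the kind of filter such a push-down could produce, here is a minimal sketch using the MongoDB Java driver's {{Filters}} helpers. The SQL predicate shown is an assumed example, and the wiring into MongoDbTable#buildIOReader and FindQuery is deliberately not shown:

{code:java}
import com.mongodb.MongoClientSettings;
import com.mongodb.client.model.Filters;
import org.bson.BsonDocument;
import org.bson.conversions.Bson;

public class PushDownFilterSketch {
  public static void main(String[] args) {
    // Assume the SQL query contained: WHERE name = 'Alice' AND age > 21 AND active
    // A MongoDbFilter would translate the supported RexNodes into a Bson filter
    // roughly like this, to be attached to the query instead of filtering in Beam.
    Bson filter =
        Filters.and(
            Filters.eq("name", "Alice"),
            Filters.gt("age", 21),
            Filters.eq("active", true));
    System.out.println(
        filter
            .toBsonDocument(BsonDocument.class, MongoClientSettings.getDefaultCodecRegistry())
            .toJson());
  }
}
{code}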



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (BEAM-8664) [SQL] MongoDb should use project push-down

2019-12-18 Thread Kirill Kozlov (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8664?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kirill Kozlov resolved BEAM-8664.
-
Fix Version/s: 2.18.0
   Resolution: Fixed

> [SQL] MongoDb should use project push-down
> --
>
> Key: BEAM-8664
> URL: https://issues.apache.org/jira/browse/BEAM-8664
> Project: Beam
>  Issue Type: Improvement
>  Components: dsl-sql
>Reporter: Kirill Kozlov
>Assignee: Kirill Kozlov
>Priority: Major
> Fix For: 2.18.0
>
>  Time Spent: 5h 20m
>  Remaining Estimate: 0h
>
> MongoDbTable should implement the following methods:
> {code:java}
> public PCollection<Row> buildIOReader(
>     PBegin begin, BeamSqlTableFilter filters, List<String> fieldNames);
> public ProjectSupport supportsProjects();
> {code}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8992) ./gradlew :sdks:go:examples:goVet fails

2019-12-18 Thread Udi Meiri (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Udi Meiri updated BEAM-8992:

Fix Version/s: 2.19.0

> ./gradlew :sdks:go:examples:goVet fails
> ---
>
> Key: BEAM-8992
> URL: https://issues.apache.org/jira/browse/BEAM-8992
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: 2.19.0
>Reporter: Udi Meiri
>Assignee: Robert Burke
>Priority: Major
> Fix For: 2.19.0
>
>
> {code}
> > Task :sdks:go:examples:resolveBuildDependencies
> Resolving 
> ./github.com/apache/beam/sdks/go@/usr/local/google/home/ehudm/src/beam-release/sdks/go
> .gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/apache/beam/sdks/go/pkg/beam/io/avroio/avroio.go:28:2:
>  cannot find package "github.com/linkedin/goavro" in any of:
> 
> /usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/linkedin/goavro
>  (vendor tree)
> 
> /usr/local/google/home/ehudm/.gradle/go/binary/1.12/go/src/github.com/linkedin/goavro
>  (from $GOROOT)
> 
> /usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/linkedin/goavro
>  (from $GOPATH)
> > Task :sdks:go:examples:goVet FAILED
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work started] (BEAM-8992) ./gradlew :sdks:go:examples:goVet fails

2019-12-18 Thread Robert Burke (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on BEAM-8992 started by Robert Burke.
--
> ./gradlew :sdks:go:examples:goVet fails
> ---
>
> Key: BEAM-8992
> URL: https://issues.apache.org/jira/browse/BEAM-8992
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: 2.19.0
>Reporter: Udi Meiri
>Assignee: Robert Burke
>Priority: Major
>
> {code}
> > Task :sdks:go:examples:resolveBuildDependencies
> Resolving 
> ./github.com/apache/beam/sdks/go@/usr/local/google/home/ehudm/src/beam-release/sdks/go
> .gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/apache/beam/sdks/go/pkg/beam/io/avroio/avroio.go:28:2:
>  cannot find package "github.com/linkedin/goavro" in any of:
> 
> /usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/linkedin/goavro
>  (vendor tree)
> 
> /usr/local/google/home/ehudm/.gradle/go/binary/1.12/go/src/github.com/linkedin/goavro
>  (from $GOROOT)
> 
> /usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/linkedin/goavro
>  (from $GOPATH)
> > Task :sdks:go:examples:goVet FAILED
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8992) ./gradlew :sdks:go:examples:goVet fails

2019-12-18 Thread Robert Burke (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Burke updated BEAM-8992:
---
Status: Open  (was: Triage Needed)

> ./gradlew :sdks:go:examples:goVet fails
> ---
>
> Key: BEAM-8992
> URL: https://issues.apache.org/jira/browse/BEAM-8992
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: 2.19.0
>Reporter: Udi Meiri
>Assignee: Robert Burke
>Priority: Major
>
> {code}
> > Task :sdks:go:examples:resolveBuildDependencies
> Resolving 
> ./github.com/apache/beam/sdks/go@/usr/local/google/home/ehudm/src/beam-release/sdks/go
> .gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/apache/beam/sdks/go/pkg/beam/io/avroio/avroio.go:28:2:
>  cannot find package "github.com/linkedin/goavro" in any of:
> 
> /usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/linkedin/goavro
>  (vendor tree)
> 
> /usr/local/google/home/ehudm/.gradle/go/binary/1.12/go/src/github.com/linkedin/goavro
>  (from $GOROOT)
> 
> /usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/linkedin/goavro
>  (from $GOPATH)
> > Task :sdks:go:examples:goVet FAILED
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8992) ./gradlew :sdks:go:examples:goVet fails

2019-12-18 Thread Robert Burke (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8992?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999440#comment-16999440
 ] 

Robert Burke commented on BEAM-8992:


At some point someone added the dependency for avroio but never added the dep 
to the gogradle lock file, which didn't fail before since nothing was ever 
building those IOs.

But then goVet was made a failing check, which surfaces this error.

We need to update

[https://github.com/apache/beam/blob/master/sdks/go/gogradle.lock]

with the right commit information.



> ./gradlew :sdks:go:examples:goVet fails
> ---
>
> Key: BEAM-8992
> URL: https://issues.apache.org/jira/browse/BEAM-8992
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: 2.19.0
>Reporter: Udi Meiri
>Assignee: Robert Burke
>Priority: Major
>
> {code}
> > Task :sdks:go:examples:resolveBuildDependencies
> Resolving 
> ./github.com/apache/beam/sdks/go@/usr/local/google/home/ehudm/src/beam-release/sdks/go
> .gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/apache/beam/sdks/go/pkg/beam/io/avroio/avroio.go:28:2:
>  cannot find package "github.com/linkedin/goavro" in any of:
> 
> /usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/linkedin/goavro
>  (vendor tree)
> 
> /usr/local/google/home/ehudm/.gradle/go/binary/1.12/go/src/github.com/linkedin/goavro
>  (from $GOROOT)
> 
> /usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/linkedin/goavro
>  (from $GOPATH)
> > Task :sdks:go:examples:goVet FAILED
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (BEAM-8992) ./gradlew :sdks:go:examples:goVet fails

2019-12-18 Thread Udi Meiri (Jira)
Udi Meiri created BEAM-8992:
---

 Summary: ./gradlew :sdks:go:examples:goVet fails
 Key: BEAM-8992
 URL: https://issues.apache.org/jira/browse/BEAM-8992
 Project: Beam
  Issue Type: Bug
  Components: sdk-go
Affects Versions: 2.19.0
Reporter: Udi Meiri
Assignee: Robert Burke


{code}
> Task :sdks:go:examples:resolveBuildDependencies
Resolving 
./github.com/apache/beam/sdks/go@/usr/local/google/home/ehudm/src/beam-release/sdks/go
.gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/apache/beam/sdks/go/pkg/beam/io/avroio/avroio.go:28:2:
 cannot find package "github.com/linkedin/goavro" in any of:

/usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/apache/beam/sdks/go/examples/vendor/github.com/linkedin/goavro
 (vendor tree)

/usr/local/google/home/ehudm/.gradle/go/binary/1.12/go/src/github.com/linkedin/goavro
 (from $GOROOT)

/usr/local/google/home/ehudm/src/beam-release/sdks/go/examples/.gogradle/project_gopath/src/github.com/linkedin/goavro
 (from $GOPATH)

> Task :sdks:go:examples:goVet FAILED
{code}




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (BEAM-8898) Enable WriteToBigQuery to perform range partitioning

2019-12-18 Thread Saman Vaisipour (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8898?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999435#comment-16999435
 ] 

Saman Vaisipour edited comment on BEAM-8898 at 12/18/19 7:01 PM:
-

Thanks Pablo, looking forward to using this feature in the new BQ sink. FYI, the 
BigQuery team has announced the beta launch of integer range partitioning:

[https://cloud.google.com/bigquery/docs/creating-integer-range-partitions]


was (Author: samanvp):
Thanks Pablo, looking forward to using this feature in new BQ sink. FYI, 

The BigQuery team has announced the beta launch of integer range partitioning:

[https://cloud.google.com/bigquery/docs/creating-integer-range-partitions]

> Enable WriteToBigQuery to perform range partitioning
> 
>
> Key: BEAM-8898
> URL: https://issues.apache.org/jira/browse/BEAM-8898
> Project: Beam
>  Issue Type: New Feature
>  Components: io-py-gcp
>Reporter: Saman Vaisipour
>Priority: Minor
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> BigQuery team recently [released 
> 1.22.0|https://github.com/googleapis/google-cloud-python/releases/tag/bigquery-1.22.0]
>  which includes the range partitioning feature (here is [an 
> example|https://github.com/googleapis/google-cloud-python/blob/c4a69d44ccea9635b3d9d316b3f545f16538dafe/bigquery/samples/create_table_range_partitioned.py]).
>  
> WriteToBigQuery uses 
> [`additional_bq_parameters`|https://github.com/apache/beam/blob/c1719476b74ec6f68fabea392087607adafc70ef/sdks/python/apache_beam/io/gcp/bigquery.py#L177]
>  to create tables with date partitioning and clustering. It would be great if 
> the same would be possible to create tables with range partitioning.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8898) Enable WriteToBigQuery to perform range partitioning

2019-12-18 Thread Saman Vaisipour (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8898?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999435#comment-16999435
 ] 

Saman Vaisipour commented on BEAM-8898:
---

Thanks Pablo, looking forward to using this feature in the new BQ sink. FYI, 

The BigQuery team has announced the beta launch of integer range partitioning:

[https://cloud.google.com/bigquery/docs/creating-integer-range-partitions]

> Enable WriteToBigQuery to perform range partitioning
> 
>
> Key: BEAM-8898
> URL: https://issues.apache.org/jira/browse/BEAM-8898
> Project: Beam
>  Issue Type: New Feature
>  Components: io-py-gcp
>Reporter: Saman Vaisipour
>Priority: Minor
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> BigQuery team recently [released 
> 1.22.0|https://github.com/googleapis/google-cloud-python/releases/tag/bigquery-1.22.0]
>  which includes the range partitioning feature (here is [an 
> example|https://github.com/googleapis/google-cloud-python/blob/c4a69d44ccea9635b3d9d316b3f545f16538dafe/bigquery/samples/create_table_range_partitioned.py]).
>  
> WriteToBigQuery uses 
> [`additional_bq_parameters`|https://github.com/apache/beam/blob/c1719476b74ec6f68fabea392087607adafc70ef/sdks/python/apache_beam/io/gcp/bigquery.py#L177]
>  to create tables with date partitioning and clustering. It would be great if 
> the same would be possible to create tables with range partitioning.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (BEAM-8991) RuntimeError in fn_api_runner_test.py

2019-12-18 Thread Ning Kang (Jira)
Ning Kang created BEAM-8991:
---

 Summary: RuntimeError in fn_api_runner_test.py
 Key: BEAM-8991
 URL: https://issues.apache.org/jira/browse/BEAM-8991
 Project: Beam
  Issue Type: Bug
  Components: sdk-py-core
Reporter: Ning Kang


{code:java}
19:28:06 > Task :sdks:python:test-suites:tox:py35:testPy35Cython
.Exception in thread Thread-1715:
19:28:06 Traceback (most recent call last):
19:28:06   File "apache_beam/runners/common.py", line 879, in 
apache_beam.runners.common.DoFnRunner.process
19:28:06 return self.do_fn_invoker.invoke_process(windowed_value)
19:28:06   File "apache_beam/runners/common.py", line 495, in 
apache_beam.runners.common.SimpleInvoker.invoke_process
19:28:06 windowed_value, self.process_method(windowed_value.value))
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/transforms/core.py",
 line 1434, in 
19:28:06 wrapper = lambda x: [fn(x)]
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner_test.py",
 line 620, in raise_error
19:28:06 raise RuntimeError('x')
19:28:06 RuntimeError: x
19:28:06 
19:28:06 During handling of the above exception, another exception occurred:
19:28:06 
19:28:06 Traceback (most recent call last):
19:28:06   File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
19:28:06 self.run()
19:28:06   File "/usr/lib/python3.5/threading.py", line 862, in run
19:28:06 self._target(*self._args, **self._kwargs)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/local_job_service.py",
 line 270, in _run_job
19:28:06 self._pipeline_proto)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 461, in run_via_runner_api
19:28:06 return self.run_stages(stage_context, stages)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 553, in run_stages
19:28:06 stage_results.process_bundle.monitoring_infos)
19:28:06   File "/usr/lib/python3.5/contextlib.py", line 77, in __exit__
19:28:06 self.gen.throw(type, value, traceback)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 500, in maybe_profile
19:28:06 yield
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 550, in run_stages
19:28:06 stage_context.safe_coders)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 870, in _run_stage
19:28:06 result, splits = bundle_manager.process_bundle(data_input, 
data_output)
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 2052, in process_bundle
19:28:06 part, expected_outputs), part_inputs):
19:28:06   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 556, in 
result_iterator
19:28:06 yield future.result()
19:28:06   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 405, in 
result
19:28:06 return self.__get_result()
19:28:06   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 357, in 
__get_result
19:28:06 raise self._exception
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/utils/thread_pool_executor.py",
 line 42, in run
19:28:06 self._future.set_result(self._fn(*self._fn_args, 
**self._fn_kwargs))
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 2052, in 
19:28:06 part, expected_outputs), part_inputs):
19:28:06   File 
"/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_Commit@2/src/sdks/python/test-suites/tox/py35/build/srcs/sdks/python/apache_beam/runners/portability/fn_api_runner.py",
 line 1977, in process_bundle
19:28:

[jira] [Created] (BEAM-8990) Missing required prop from BigQueryIO javadoc

2019-12-18 Thread Reza ardeshir rokni (Jira)
Reza ardeshir rokni created BEAM-8990:
-

 Summary: Missing required prop from BigQueryIO javadoc
 Key: BEAM-8990
 URL: https://issues.apache.org/jira/browse/BEAM-8990
 Project: Beam
  Issue Type: Bug
  Components: io-java-gcp
Reporter: Reza ardeshir rokni


withNumFileShards is a required parameter for BigQueryIO when using the 
FILE_LOADS method. This is missing from the javadoc for file loads:

 

https://beam.apache.org/releases/javadoc/2.16.0/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.Write.Method.html#FILE_LOADS
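
For reference, a minimal sketch of a write where the parameter is needed, assuming an unbounded PCollection of TableRows; the table name, schema, triggering frequency, and shard count are placeholders:

{code:java}
// Minimal sketch: with FILE_LOADS and a triggering frequency on an unbounded
// input, withNumFileShards must also be set.
rows.apply(
    "WriteViaFileLoads",
    BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")              // placeholder table
        .withSchema(schema)                                // placeholder TableSchema
        .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
        .withTriggeringFrequency(Duration.standardMinutes(5))
        .withNumFileShards(100));                          // required alongside the frequency
{code}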



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (BEAM-8810) Dataflow runner - Work stuck in state COMMITTING with streaming commit rpcs

2019-12-18 Thread Sam Whittle (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8810?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sam Whittle resolved BEAM-8810.
---
Fix Version/s: 2.19.0
   Resolution: Fixed

> Dataflow runner - Work stuck in state COMMITTING with streaming commit rpcs
> ---
>
> Key: BEAM-8810
> URL: https://issues.apache.org/jira/browse/BEAM-8810
> Project: Beam
>  Issue Type: Bug
>  Components: runner-dataflow
>Reporter: Sam Whittle
>Assignee: Sam Whittle
>Priority: Minor
> Fix For: 2.19.0
>
>  Time Spent: 2h
>  Remaining Estimate: 0h
>
> In several pipelines using streaming engine and thus the streaming commit 
> rpcs, work became stuck in state COMMITTING indefinitely.  Such stuckness 
> coincided with repeated streaming rpc failures.
> The status page shows that the key has work in state COMMITTING, and has 1 
> queued work item.
> There is a single active commit stream, with 0 pending requests.
> The stream could exist past the stream deadline because the StreamCache only 
> closes stream due to the deadline when a stream is retrieved, which only 
> occurs if there are other commits.  Since the pipeline is stuck due to this 
> event, there are no other commits.
> It seems therefore there is some race on the commitStream between onNewStream 
> and commitWork that either prevents work from being retried, an exception 
> that triggers between when the pending request is removed and the callback is 
> called, or some potential corruption of the activeWork data structure. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8989) Backwards incompatible change in ParDo.getSideInputs (caught by failure when running Apache Nemo quickstart)

2019-12-18 Thread Luke Cwik (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Luke Cwik updated BEAM-8989:

Summary: Backwards incompatible change in ParDo.getSideInputs (caught by 
failure when running Apache Nemo quickstart)  (was: Apache Nemo quickstart 
broken for Apache Beam 2.16 and 2.17 due to backwards incompatible change in 
ParDo.getSideInputs)

> Backwards incompatible change in ParDo.getSideInputs (caught by failure when 
> running Apache Nemo quickstart)
> 
>
> Key: BEAM-8989
> URL: https://issues.apache.org/jira/browse/BEAM-8989
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.16.0, 2.17.0, 2.18.0
>Reporter: Luke Cwik
>Assignee: Mikhail Gryzykhin
>Priority: Blocker
>
> [PR/9275|https://github.com/apache/beam/pull/9275] changed 
> *ParDo.getSideInputs* from *List* to *Map PCollectionView>* which is backwards incompatible change and was released as 
> part of Beam 2.16.0 erroneously.
> Running the Apache Nemo Quickstart fails with:
>  
> {code:java}
> Exception in thread "main" java.lang.RuntimeException: Translator private 
> static void 
> org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(org.apache.nemo.compiler.frontend.beam.PipelineTranslationContext,org.apache.beam.sdk.runners.TransformHierarchy$Node,org.apache.beam.sdk.transforms.ParDo$MultiOutput)
>  have failed to translate 
> org.apache.beam.examples.WordCount$ExtractWordsFn@600b9d27Exception in thread 
> "main" java.lang.RuntimeException: Translator private static void 
> org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(org.apache.nemo.compiler.frontend.beam.PipelineTranslationContext,org.apache.beam.sdk.runners.TransformHierarchy$Node,org.apache.beam.sdk.transforms.ParDo$MultiOutput)
>  have failed to translate 
> org.apache.beam.examples.WordCount$ExtractWordsFn@600b9d27 at 
> org.apache.nemo.compiler.frontend.beam.PipelineTranslator.translatePrimitive(PipelineTranslator.java:113)
>  at 
> org.apache.nemo.compiler.frontend.beam.PipelineVisitor.visitPrimitiveTransform(PipelineVisitor.java:46)
>  at 
> org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:665)
>  at 
> org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
>  at 
> org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
>  at 
> org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
>  at 
> org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:317)
>  at 
> org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:251)
>  at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:460) at 
> org.apache.nemo.compiler.frontend.beam.NemoRunner.run(NemoRunner.java:80) at 
> org.apache.nemo.compiler.frontend.beam.NemoRunner.run(NemoRunner.java:31) at 
> org.apache.beam.sdk.Pipeline.run(Pipeline.java:315) at 
> org.apache.beam.sdk.Pipeline.run(Pipeline.java:301) at 
> org.apache.beam.examples.WordCount.runWordCount(WordCount.java:185) at 
> org.apache.beam.examples.WordCount.main(WordCount.java:192)Caused by: 
> java.lang.reflect.InvocationTargetException at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498) at 
> org.apache.nemo.compiler.frontend.beam.PipelineTranslator.translatePrimitive(PipelineTranslator.java:109)
>  ... 14 moreCaused by: java.lang.NoSuchMethodError: 
> org.apache.beam.sdk.transforms.ParDo$MultiOutput.getSideInputs()Ljava/util/List;
>  at 
> org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(PipelineTranslator.java:236)
>  ... 19 more{code}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8989) Apache Nemo quickstart broken for Apache Beam 2.16 and 2.17 due to backwards incompatible change in ParDo.getSideInputs

2019-12-18 Thread Luke Cwik (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Luke Cwik updated BEAM-8989:

Description: 
[PR/9275|https://github.com/apache/beam/pull/9275] changed 
*ParDo.getSideInputs* from *List* to *Map*, which is a backwards-incompatible change and was released as 
part of Beam 2.16.0 erroneously.

Running the Apache Nemo Quickstart fails with:

 
{code:java}
Exception in thread "main" java.lang.RuntimeException: Translator private 
static void 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(org.apache.nemo.compiler.frontend.beam.PipelineTranslationContext,org.apache.beam.sdk.runners.TransformHierarchy$Node,org.apache.beam.sdk.transforms.ParDo$MultiOutput)
 have failed to translate 
org.apache.beam.examples.WordCount$ExtractWordsFn@600b9d27Exception in thread 
"main" java.lang.RuntimeException: Translator private static void 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(org.apache.nemo.compiler.frontend.beam.PipelineTranslationContext,org.apache.beam.sdk.runners.TransformHierarchy$Node,org.apache.beam.sdk.transforms.ParDo$MultiOutput)
 have failed to translate 
org.apache.beam.examples.WordCount$ExtractWordsFn@600b9d27 at 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.translatePrimitive(PipelineTranslator.java:113)
 at 
org.apache.nemo.compiler.frontend.beam.PipelineVisitor.visitPrimitiveTransform(PipelineVisitor.java:46)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:665)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:317)
 at 
org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:251)
 at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:460) at 
org.apache.nemo.compiler.frontend.beam.NemoRunner.run(NemoRunner.java:80) at 
org.apache.nemo.compiler.frontend.beam.NemoRunner.run(NemoRunner.java:31) at 
org.apache.beam.sdk.Pipeline.run(Pipeline.java:315) at 
org.apache.beam.sdk.Pipeline.run(Pipeline.java:301) at 
org.apache.beam.examples.WordCount.runWordCount(WordCount.java:185) at 
org.apache.beam.examples.WordCount.main(WordCount.java:192)Caused by: 
java.lang.reflect.InvocationTargetException at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:498) at 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.translatePrimitive(PipelineTranslator.java:109)
 ... 14 moreCaused by: java.lang.NoSuchMethodError: 
org.apache.beam.sdk.transforms.ParDo$MultiOutput.getSideInputs()Ljava/util/List;
 at 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(PipelineTranslator.java:236)
 ... 19 more{code}
 

  was:
This changed `ParDo.getSideInputs` from `List` to `Map`, which is a backwards incompatible change, and it was erroneously released as 
part of Beam 2.16.0.

```

Exception in thread "main" java.lang.RuntimeException: Translator private 
static void 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(org.apache.nemo.compiler.frontend.beam.PipelineTranslationContext,org.apache.beam.sdk.runners.TransformHierarchy$Node,org.apache.beam.sdk.transforms.ParDo$MultiOutput)
 have failed to translate 
org.apache.beam.examples.WordCount$ExtractWordsFn@600b9d27Exception in thread 
"main" java.lang.RuntimeException: Translator private static void 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(org.apache.nemo.compiler.frontend.beam.PipelineTranslationContext,org.apache.beam.sdk.runners.TransformHierarchy$Node,org.apache.beam.sdk.transforms.ParDo$MultiOutput)
 have failed to translate 
org.apache.beam.examples.WordCount$ExtractWordsFn@600b9d27 at 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.translatePrimitive(PipelineTranslator.java:113)
 at 
org.apache.nemo.compiler.frontend.beam.PipelineVisitor.visitPrimitiveTransform(PipelineVisitor.java:46)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:665)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:317)
 at 

[jira] [Created] (BEAM-8989) Apache Nemo quickstart broken for Apache Beam 2.16 and 2.17 due to backwards incompatible change in ParDo.getSideInputs

2019-12-18 Thread Luke Cwik (Jira)
Luke Cwik created BEAM-8989:
---

 Summary: Apache Nemo quickstart broken for Apache Beam 2.16 and 
2.17 due to backwards incompatible change in ParDo.getSideInputs
 Key: BEAM-8989
 URL: https://issues.apache.org/jira/browse/BEAM-8989
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-core
Affects Versions: 2.16.0, 2.17.0, 2.18.0
Reporter: Luke Cwik
Assignee: Mikhail Gryzykhin


This changed `ParDo.getSideInputs` from `List` to `Map`, which is a backwards incompatible change, and it was erroneously released as 
part of Beam 2.16.0.

```

Exception in thread "main" java.lang.RuntimeException: Translator private 
static void 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(org.apache.nemo.compiler.frontend.beam.PipelineTranslationContext,org.apache.beam.sdk.runners.TransformHierarchy$Node,org.apache.beam.sdk.transforms.ParDo$MultiOutput)
 have failed to translate 
org.apache.beam.examples.WordCount$ExtractWordsFn@600b9d27Exception in thread 
"main" java.lang.RuntimeException: Translator private static void 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(org.apache.nemo.compiler.frontend.beam.PipelineTranslationContext,org.apache.beam.sdk.runners.TransformHierarchy$Node,org.apache.beam.sdk.transforms.ParDo$MultiOutput)
 have failed to translate 
org.apache.beam.examples.WordCount$ExtractWordsFn@600b9d27 at 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.translatePrimitive(PipelineTranslator.java:113)
 at 
org.apache.nemo.compiler.frontend.beam.PipelineVisitor.visitPrimitiveTransform(PipelineVisitor.java:46)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:665)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
 at 
org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:317)
 at 
org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:251)
 at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:460) at 
org.apache.nemo.compiler.frontend.beam.NemoRunner.run(NemoRunner.java:80) at 
org.apache.nemo.compiler.frontend.beam.NemoRunner.run(NemoRunner.java:31) at 
org.apache.beam.sdk.Pipeline.run(Pipeline.java:315) at 
org.apache.beam.sdk.Pipeline.run(Pipeline.java:301) at 
org.apache.beam.examples.WordCount.runWordCount(WordCount.java:185) at 
org.apache.beam.examples.WordCount.main(WordCount.java:192)Caused by: 
java.lang.reflect.InvocationTargetException at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:498) at 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.translatePrimitive(PipelineTranslator.java:109)
 ... 14 moreCaused by: java.lang.NoSuchMethodError: 
org.apache.beam.sdk.transforms.ParDo$MultiOutput.getSideInputs()Ljava/util/List;
 at 
org.apache.nemo.compiler.frontend.beam.PipelineTranslator.parDoMultiOutputTranslator(PipelineTranslator.java:236)
 ... 19 more

```



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-5495) PipelineResources algorithm is not working in most environments

2019-12-18 Thread Luke Cwik (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-5495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999358#comment-16999358
 ] 

Luke Cwik commented on BEAM-5495:
-

I would also agree that classgraph is a better choice since it has a much 
narrower responsibility and has been more actively developed over the past couple of 
years, although xbean has been around for much longer. Either way, the implementation 
that [~ŁukaszG] did is extensible and replaceable. The implementation can be 
overridden by specifying the pipeline resources detector factory option.

Adding a base pipeline resources detector factory that delegates to 
ServiceLoader to find an implementation seems fine as well and could be added 
in a future PR.
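For illustration, a minimal sketch of what such a ServiceLoader-backed lookup could look like (the interface and class names are hypothetical, not Beam's actual API):

{code:java}
import java.util.List;
import java.util.ServiceLoader;

// Hypothetical SPI sketch; names are illustrative only.
interface PipelineResourcesDetector {
  List<String> detectResourcesToStage(ClassLoader classLoader);
}

final class ServiceLoaderPipelineResourcesDetectorFactory {
  // Returns the first detector registered under
  // META-INF/services/PipelineResourcesDetector, if any.
  static PipelineResourcesDetector create() {
    for (PipelineResourcesDetector detector
        : ServiceLoader.load(PipelineResourcesDetector.class)) {
      return detector;
    }
    throw new IllegalStateException("No PipelineResourcesDetector registered");
  }
}
{code}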

> PipelineResources algorithm is not working in most environments
> ---
>
> Key: BEAM-5495
> URL: https://issues.apache.org/jira/browse/BEAM-5495
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink, runner-spark, sdk-java-core
>Reporter: Romain Manni-Bucau
>Assignee: Lukasz Gajowy
>Priority: Major
> Fix For: 2.19.0
>
>  Time Spent: 15h 50m
>  Remaining Estimate: 0h
>
> Issues are:
> 1. it assumes the classloader is a URLClassLoader (not always true, and Java 
> >= 9 breaks that as well for the app loader)
> 2. it uses loader.getURLs(), which leads to including the JRE itself in the 
> staged files
> It looks like this resource detection algorithm can't work and should be replaced 
> by an SPI rather than a built-in, non-extensible algorithm. Another valid 
> alternative is to just drop that "guess" logic and force the user to set 
> staged files.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work logged] (BEAM-8962) FlinkMetricContainer causes churn in the JobManager and lets the web frontend malfunction

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8962?focusedWorklogId=361381&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361381
 ]

ASF GitHub Bot logged work on BEAM-8962:


Author: ASF GitHub Bot
Created on: 18/Dec/19 16:24
Start Date: 18/Dec/19 16:24
Worklog Time Spent: 10m 
  Work Description: mxm commented on pull request #10415: [BEAM-8962] 
Report Flink metric accumulator only when pipeline ends
URL: https://github.com/apache/beam/pull/10415
 
 
   To avoid the runtime overhead of continuously reporting accumulators from the
   TaskManagers to the JobManager, we can defer the reporting of the metrics
   accumulator until the pipeline shuts down. A final snapshot of the 
accumulators
   will be reported then.
   
   This means the metrics will still be available in the PipelineResult. Of 
course,
   the metrics reporting via Flink is not affected by this change.
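A generic Flink sketch of the idea (not the actual FlinkMetricContainer change): keep per-element bookkeeping local and only push a final snapshot into the Flink accumulator when the task closes.

```java
import org.apache.flink.api.common.accumulators.LongCounter;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

// Sketch only: accumulate locally per element and report a single final
// snapshot at shutdown, instead of updating the accumulator on every element.
public class DeferredCountingMap extends RichMapFunction<String, String> {
  private final LongCounter elementCount = new LongCounter();
  private long localCount;

  @Override
  public void open(Configuration parameters) {
    getRuntimeContext().addAccumulator("elementCount", elementCount);
  }

  @Override
  public String map(String value) {
    localCount++; // cheap local bookkeeping, no accumulator updates
    return value;
  }

  @Override
  public void close() {
    elementCount.add(localCount); // final snapshot reported once at shutdown
  }
}
```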
   
   Post-Commit Tests Status (on master branch)
   

   

[jira] [Work logged] (BEAM-8695) Beam Dependency Update Request: com.google.http-client:google-http-client

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8695?focusedWorklogId=361379&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361379
 ]

ASF GitHub Bot logged work on BEAM-8695:


Author: ASF GitHub Bot
Created on: 18/Dec/19 16:15
Start Date: 18/Dec/19 16:15
Worklog Time Spent: 10m 
  Work Description: suztomo commented on pull request #10414: [BEAM-8695] 
google-http-client 1.34.0
URL: https://github.com/apache/beam/pull/10414
 
 
   https://issues.apache.org/jira/browse/BEAM-8695.
   
   Upgrading google-http-client version to the latest 1.34.0.
   
   
   
   Thank you for your contribution! Follow this checklist to help us 
incorporate your contribution quickly and easily:
   
- [ ] [**Choose 
reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and 
mention them in a comment (`R: @username`).
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue, if applicable. This will automatically link the pull request to the 
issue.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more 
tips on [how to make review process 
smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   Post-Commit Tests Status (on master branch)
   

   

[jira] [Commented] (BEAM-8695) Beam Dependency Update Request: com.google.http-client:google-http-client

2019-12-18 Thread Tomo Suzuki (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999301#comment-16999301
 ] 

Tomo Suzuki commented on BEAM-8695:
---

The fix has been released: 
https://github.com/googleapis/google-http-java-client/pull/895

$ git tag --contains 38e3a53
v1.34.0


> Beam Dependency Update Request: com.google.http-client:google-http-client
> -
>
> Key: BEAM-8695
> URL: https://issues.apache.org/jira/browse/BEAM-8695
> Project: Beam
>  Issue Type: Sub-task
>  Components: dependencies
>Reporter: Beam JIRA Bot
>Priority: Major
>
>  - 2019-11-15 19:40:13.570557 
> -
> Please consider upgrading the dependency 
> com.google.http-client:google-http-client. 
> The current version is 1.28.0. The latest version is 1.33.0 
> cc: 
>  Please refer to [Beam Dependency Guide 
> |https://beam.apache.org/contribute/dependencies/]for more information. 
> Do Not Modify The Description Above. 
>  - 2019-11-19 21:06:20.477284 
> -
> Please consider upgrading the dependency 
> com.google.http-client:google-http-client. 
> The current version is 1.28.0. The latest version is 1.33.0 
> cc: 
>  Please refer to [Beam Dependency Guide 
> |https://beam.apache.org/contribute/dependencies/]for more information. 
> Do Not Modify The Description Above. 
>  - 2019-12-02 12:12:12.146269 
> -
> Please consider upgrading the dependency 
> com.google.http-client:google-http-client. 
> The current version is 1.28.0. The latest version is 1.33.0 
> cc: 
>  Please refer to [Beam Dependency Guide 
> |https://beam.apache.org/contribute/dependencies/]for more information. 
> Do Not Modify The Description Above. 
>  - 2019-12-09 12:11:24.693912 
> -
> Please consider upgrading the dependency 
> com.google.http-client:google-http-client. 
> The current version is 1.28.0. The latest version is 1.33.0 
> cc: 
>  Please refer to [Beam Dependency Guide 
> |https://beam.apache.org/contribute/dependencies/]for more information. 
> Do Not Modify The Description Above. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Assigned] (BEAM-8528) BigQuery bounded source does not work on DirectRunner

2019-12-18 Thread Kamil Wasilewski (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Wasilewski reassigned BEAM-8528:
--

Assignee: Kamil Wasilewski

> BigQuery bounded source does not work on DirectRunner
> -
>
> Key: BEAM-8528
> URL: https://issues.apache.org/jira/browse/BEAM-8528
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Kamil Wasilewski
>Assignee: Kamil Wasilewski
>Priority: Major
>
> Refer to https://github.com/apache/beam/pull/9772 for more information and 
> the context of this ticket.
> The following exception is being raised when _ReadFromBigQuery_ PTransform is 
> used on DirectRunner in Python SDK:
> {code:java}
>   File 
> "/home/Kamil/projects/beam/sdks/python/apache_beam/io/gcp/bigquery.py", line 
> 639, in get_range_tracker
> raise NotImplementedError('BigQuery source must be split before being 
> read')
> NotImplementedError: BigQuery source must be split before being read
> {code}
>  The direct cause is that the _get_range_tracker_ and _read_ methods aren't 
> implemented in __BigQuerySource_. This is purposeful: the runner is expected 
> to call _split_ instead. The Java implementation works the same way: 
> [link|https://github.com/apache/beam/blob/c2f0d282337f3ae0196a7717712396a5a41fdde1/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQuerySourceBase.java]
> It seems that DataflowRunner and Flink are able to catch these exceptions 
> somehow, while DirectRunner is not.
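A minimal Java sketch of the split-before-read contract described above (illustrative only, not the actual Beam or BigQuery source code):

{code:java}
import java.io.IOException;
import java.util.List;
import org.apache.beam.sdk.io.BoundedSource;
import org.apache.beam.sdk.options.PipelineOptions;

// Illustrative sketch: the unsplit source refuses direct reads, so a runner
// is expected to call split() first and read the returned sub-sources.
abstract class SplitOnlySource<T> extends BoundedSource<T> {

  @Override
  public BoundedReader<T> createReader(PipelineOptions options) throws IOException {
    throw new UnsupportedOperationException(
        "Source must be split before being read");
  }

  @Override
  public abstract List<? extends BoundedSource<T>> split(
      long desiredBundleSizeBytes, PipelineOptions options) throws Exception;
}
{code}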



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (BEAM-8528) BigQuery bounded source does not work on DirectRunner

2019-12-18 Thread Kamil Wasilewski (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Wasilewski updated BEAM-8528:
---
Status: Open  (was: Triage Needed)

> BigQuery bounded source does not work on DirectRunner
> -
>
> Key: BEAM-8528
> URL: https://issues.apache.org/jira/browse/BEAM-8528
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: Kamil Wasilewski
>Priority: Major
>
> Refer to https://github.com/apache/beam/pull/9772 for more information and 
> the context of this ticket.
> The following exception is being raised when _ReadFromBigQuery_ PTransform is 
> used on DirectRunner in Python SDK:
> {code:java}
>   File 
> "/home/Kamil/projects/beam/sdks/python/apache_beam/io/gcp/bigquery.py", line 
> 639, in get_range_tracker
> raise NotImplementedError('BigQuery source must be split before being 
> read')
> NotImplementedError: BigQuery source must be split before being read
> {code}
>  The direct cause is that the _get_range_tracker_ and _read_ methods aren't 
> implemented in __BigQuerySource_. This is purposeful: the runner is expected 
> to call _split_ instead. The Java implementation works the same way: 
> [link|https://github.com/apache/beam/blob/c2f0d282337f3ae0196a7717712396a5a41fdde1/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQuerySourceBase.java]
> It seems that DataflowRunner and Flink are able to catch these exceptions 
> somehow, while DirectRunner is not.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work logged] (BEAM-8988) apache_beam.io.gcp.bigquery_read_it_test failing with: NotImplementedError: BigQuery source must be split before being read

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8988?focusedWorklogId=361373&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361373
 ]

ASF GitHub Bot logged work on BEAM-8988:


Author: ASF GitHub Bot
Created on: 18/Dec/19 15:33
Start Date: 18/Dec/19 15:33
Worklog Time Spent: 10m 
  Work Description: kamilwu commented on pull request #10412: [BEAM-8988] 
RangeTracker for _CustomBigQuerySource
URL: https://github.com/apache/beam/pull/10412
 
 
   
   
   
   Thank you for your contribution! Follow this checklist to help us 
incorporate your contribution quickly and easily:
   
- [ ] [**Choose 
reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and 
mention them in a comment (`R: @username`).
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue, if applicable. This will automatically link the pull request to the 
issue.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more 
tips on [how to make review process 
smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   Post-Commit Tests Status (on master branch)
   

   

[jira] [Updated] (BEAM-8988) apache_beam.io.gcp.bigquery_read_it_test failing with: NotImplementedError: BigQuery source must be split before being read

2019-12-18 Thread Kamil Wasilewski (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8988?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Wasilewski updated BEAM-8988:
---
Status: Open  (was: Triage Needed)

> apache_beam.io.gcp.bigquery_read_it_test failing with: NotImplementedError: 
> BigQuery source must be split before being read
> ---
>
> Key: BEAM-8988
> URL: https://issues.apache.org/jira/browse/BEAM-8988
> Project: Beam
>  Issue Type: Bug
>  Components: io-py-gcp
>Reporter: Valentyn Tymofieiev
>Assignee: Kamil Wasilewski
>Priority: Critical
>
> Sample failure: https://builds.apache.org/job/beam_PostCommit_Python37_PR/58/
> Triggered by https://github.com/apache/beam/pull/9772.
> Stacktrace:
> {noformat}
> Pipeline 
> BeamApp-jenkins-1217231928-2108ede4_7476773b-6b06-4536-a0d5-c5fafb6c0935 
> failed in state FAILED: java.lang.RuntimeException: Error received from SDK 
> harness for instruction 96: Traceback (most recent call last):
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37_PR/src/sdks/python/apache_beam/runners/common.py",
>  line 879, in process
> return self.do_fn_invoker.invoke_process(windowed_value)
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37_PR/src/sdks/python/apache_beam/runners/common.py",
>  line 669, in invoke_process
> windowed_value, additional_args, additional_kwargs, output_processor)
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37_PR/src/sdks/python/apache_beam/runners/common.py",
>  line 747, in _invoke_process_per_window
> windowed_value, self.process_method(*args_for_process))
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37_PR/src/sdks/python/apache_beam/runners/common.py",
>  line 998, in process_outputs
> for result in results:
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37_PR/src/sdks/python/apache_beam/runners/worker/bundle_processor.py",
>  line 1256, in process
> yield element, self.restriction_provider.initial_restriction(element)
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37_PR/src/sdks/python/apache_beam/io/iobase.py",
>  line 1518, in initial_restriction
> range_tracker = self._source.get_range_tracker(None, None)
>   File 
> "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37_PR/src/sdks/python/apache_beam/io/gcp/bigquery.py",
>  line 652, in get_range_tracker
> raise NotImplementedError('BigQuery source must be split before being 
> read')
> NotImplementedError: BigQuery source must be split before being read
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work logged] (BEAM-8959) Boolean pipeline options which default to true cannot be set to false

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8959?focusedWorklogId=361370&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361370
 ]

ASF GitHub Bot logged work on BEAM-8959:


Author: ASF GitHub Bot
Created on: 18/Dec/19 15:12
Start Date: 18/Dec/19 15:12
Worklog Time Spent: 10m 
  Work Description: mxm commented on pull request #10411: [BEAM-8959] 
Invert metrics flag in Flink Runner
URL: https://github.com/apache/beam/pull/10411
 
 
   In order for the flag to be useful, we need to invert its logic because flags
   which default to true cannot be set to false.
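As a sketch of the inversion (option names here are illustrative, not the actual FlinkPipelineOptions fields):

```java
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;

// Illustrative only: instead of an "enable" flag that defaults to true (and
// therefore cannot be switched off from the command line), expose the
// inverted "disable" flag that defaults to false.
public interface MetricsReportingOptions extends PipelineOptions {
  @Description("Disables reporting of the metrics accumulator when set.")
  @Default.Boolean(false)
  boolean getDisableMetricAccumulator();

  void setDisableMetricAccumulator(boolean disable);
}
```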
   
   Post-Commit Tests Status (on master branch)
   

   

[jira] [Work started] (BEAM-8978) Report saved data size from HadoopFormatIOIT

2019-12-18 Thread Pawel Pasterz (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8978?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on BEAM-8978 started by Pawel Pasterz.
---
> Report saved data size from HadoopFormatIOIT
> 
>
> Key: BEAM-8978
> URL: https://issues.apache.org/jira/browse/BEAM-8978
> Project: Beam
>  Issue Type: Sub-task
>  Components: testing
>Reporter: Michal Walenia
>Assignee: Pawel Pasterz
>Priority: Minor
>  Time Spent: 10m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-5495) PipelineResources algorithm is not working in most environments

2019-12-18 Thread Romain Manni-Bucau (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-5495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999186#comment-16999186
 ] 

Romain Manni-Bucau commented on BEAM-5495:
--

[~mxm] I think you got misled by GitHub ;) (see 
[https://github.com/classgraph/classgraph/graphs/contributors] vs 
[https://github.com/apache/geronimo-xbean/graphs/contributors]). Anyway, not a 
blocker. I guess it should be made provided/optional in the deployed pom, since it 
is a toggleable feature (at least at the SDK level), and ArchUnit or an equivalent can 
be used to ensure the dependency is limited to the scanner class and classgraph is not used 
outside of it, so that it can still be dropped.
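A rough sketch of such an ArchUnit rule (the package patterns are made up for illustration and do not reflect Beam's actual layout):

{code:java}
import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses;

import com.tngtech.archunit.core.domain.JavaClasses;
import com.tngtech.archunit.core.importer.ClassFileImporter;
import com.tngtech.archunit.lang.ArchRule;

// Illustration only: keep the classgraph dependency confined to a single
// scanner package so it can stay optional/provided and be dropped later.
public class ClassGraphIsolationCheck {
  public static void main(String[] args) {
    JavaClasses classes = new ClassFileImporter().importPackages("org.apache.beam");
    ArchRule rule =
        noClasses()
            .that().resideOutsideOfPackage("..pipelineresources..")
            .should().dependOnClassesThat().resideInAPackage("io.github.classgraph..")
            .because("classgraph must remain replaceable");
    rule.check(classes);
  }
}
{code}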

> PipelineResources algorithm is not working in most environments
> ---
>
> Key: BEAM-5495
> URL: https://issues.apache.org/jira/browse/BEAM-5495
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink, runner-spark, sdk-java-core
>Reporter: Romain Manni-Bucau
>Assignee: Lukasz Gajowy
>Priority: Major
> Fix For: 2.19.0
>
>  Time Spent: 15h 50m
>  Remaining Estimate: 0h
>
> Issues are:
> 1. it assumes the classloader is a URLClassLoader (not always true, and Java 
> >= 9 breaks that as well for the app loader)
> 2. it uses loader.getURLs(), which leads to including the JRE itself in the 
> staged files
> It looks like this resource detection algorithm can't work and should be replaced 
> by an SPI rather than a built-in, non-extensible algorithm. Another valid 
> alternative is to just drop that "guess" logic and force the user to set 
> staged files.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (BEAM-8512) Add integration tests for Python "flink_runner.py"

2019-12-18 Thread Maximilian Michels (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8512?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maximilian Michels resolved BEAM-8512.
--
Fix Version/s: (was: Not applicable)
   2.19.0
   Resolution: Fixed

> Add integration tests for Python "flink_runner.py"
> --
>
> Key: BEAM-8512
> URL: https://issues.apache.org/jira/browse/BEAM-8512
> Project: Beam
>  Issue Type: Test
>  Components: runner-flink, sdk-py-core
>Reporter: Maximilian Michels
>Assignee: Kyle Weaver
>Priority: Major
> Fix For: 2.19.0
>
>  Time Spent: 4h
>  Remaining Estimate: 0h
>
> There are currently no integration tests for the Python FlinkRunner. We need 
> a set of tests similar to {{flink_runner_test.py}}, which currently uses the 
> PortableRunner and not the FlinkRunner.
> CC [~robertwb] [~ibzib] [~thw]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work logged] (BEAM-5495) PipelineResources algorithm is not working in most environments

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-5495?focusedWorklogId=361365&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361365
 ]

ASF GitHub Bot logged work on BEAM-5495:


Author: ASF GitHub Bot
Created on: 18/Dec/19 13:25
Start Date: 18/Dec/19 13:25
Worklog Time Spent: 10m 
  Work Description: lgajowy commented on pull request #10410: [BEAM-5495] 
PR 10268 followup
URL: https://github.com/apache/beam/pull/10410
 
 
   Since PR-10268 was reviewed, merged and then reviewed again, I'm posting a 
follow-up PR to address the comments from the last review. 
   
   I still **have not** addressed 2 comments:
- https://github.com/apache/beam/pull/10268#discussion_r359024833 (still 
investigating)
- https://github.com/apache/beam/pull/10268#discussion_r359023901 (waiting 
for reviewer response)
   
   We can fix them either in this PR or in other PRs later (if needed).
   
   R: @lukecwik 
   R: @mxm
   
   Thanks!
   
   
   
   
   
   Thank you for your contribution! Follow this checklist to help us 
incorporate your contribution quickly and easily:
   
- [ ] [**Choose 
reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and 
mention them in a comment (`R: @username`).
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue, if applicable. This will automatically link the pull request to the 
issue.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more 
tips on [how to make review process 
smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   Post-Commit Tests Status (on master branch)
   

   

[jira] [Work logged] (BEAM-8978) Report saved data size from HadoopFormatIOIT

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8978?focusedWorklogId=361362&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361362
 ]

ASF GitHub Bot logged work on BEAM-8978:


Author: ASF GitHub Bot
Created on: 18/Dec/19 13:12
Start Date: 18/Dec/19 13:12
Worklog Time Spent: 10m 
  Work Description: pawelpasterz commented on pull request #10409: 
[BEAM-8978] Publish table size of data written during HadoopFormatIOIT
URL: https://github.com/apache/beam/pull/10409
 
 
   The table size of the written test data is now reported with the metrics publisher.
   
   
   
   
   Thank you for your contribution! Follow this checklist to help us 
incorporate your contribution quickly and easily:
   
- [ ] [**Choose 
reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and 
mention them in a comment (`R: @username`).
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue, if applicable. This will automatically link the pull request to the 
issue.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more 
tips on [how to make review process 
smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   Post-Commit Tests Status (on master branch)
   

   

[jira] [Commented] (BEAM-5495) PipelineResources algorithm is not working in most environments

2019-12-18 Thread Maximilian Michels (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-5495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999026#comment-16999026
 ] 

Maximilian Michels commented on BEAM-5495:
--

To address your other concern:

{quote}
Any reason to use a one man github project (io.github.classgraph:classgraph) 
instead of apache xbean proposal?
{quote}

ClassGraph: 4097 commits, 33 contributors, 304 releases.
Xbean: 992 commits, 12 contributors, 48 releases.

ClassGraph appears to be more active. I value your concern, but I think 
ClassGraph was a valid choice, also given that it is scoped for exactly this 
purpose, whereas xbean seems to be written for a different purpose and its 
documentation does not make it obvious how to use it for detecting classloader 
dependencies.
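For reference, a small sketch of the kind of usage meant here (not the Beam integration itself): ClassGraph can enumerate the effective classpath without assuming a URLClassLoader.

{code:java}
import io.github.classgraph.ClassGraph;
import java.io.File;
import java.util.List;

// Sketch only: list classpath entries as files, e.g. as candidates for staging.
public class ClasspathListing {
  public static void main(String[] args) {
    List<File> entries = new ClassGraph().getClasspathFiles();
    entries.forEach(f -> System.out.println(f.getAbsolutePath()));
  }
}
{code}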

> PipelineResources algorithm is not working in most environments
> ---
>
> Key: BEAM-5495
> URL: https://issues.apache.org/jira/browse/BEAM-5495
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink, runner-spark, sdk-java-core
>Reporter: Romain Manni-Bucau
>Assignee: Lukasz Gajowy
>Priority: Major
> Fix For: 2.19.0
>
>  Time Spent: 15h 40m
>  Remaining Estimate: 0h
>
> Issues are:
> 1. it assumes the classloader is a URLClassLoader (not always true, and Java 
> >= 9 breaks that as well for the app loader)
> 2. it uses loader.getURLs(), which leads to including the JRE itself in the 
> staged files
> It looks like this resource detection algorithm can't work and should be replaced 
> by an SPI rather than a built-in, non-extensible algorithm. Another valid 
> alternative is to just drop that "guess" logic and force the user to set 
> staged files.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8966) failure in :sdks:python:test-suites:direct:py37:hdfsIntegrationTest

2019-12-18 Thread Kamil Wasilewski (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8966?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999019#comment-16999019
 ] 

Kamil Wasilewski commented on BEAM-8966:


Fixed by https://github.com/apache/beam/pull/10400

> failure in :sdks:python:test-suites:direct:py37:hdfsIntegrationTest
> ---
>
> Key: BEAM-8966
> URL: https://issues.apache.org/jira/browse/BEAM-8966
> Project: Beam
>  Issue Type: Bug
>  Components: test-failures
>Reporter: Udi Meiri
>Assignee: Chad Dombrova
>Priority: Major
>
> I believe this is due to https://github.com/apache/beam/pull/9915
> {code}
> Collecting mypy-protobuf==1.12
>   Using cached 
> https://files.pythonhosted.org/packages/b6/28/041dea47c93564bfc0ece050362894292ec4f173caa92fa82994a6d061d1/mypy_protobuf-1.12-py3-none-any.whl
> Installing collected packages: mypy-protobuf
> Successfully installed mypy-protobuf-1.12
> beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but 
> not used.
> beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not 
> used.
> Traceback (most recent call last):
>   File "/usr/local/bin/protoc-gen-mypy", line 13, in 
> import google.protobuf.descriptor_pb2 as d
> ModuleNotFoundError: No module named 'google'
> --mypy_out: protoc-gen-mypy: Plugin failed with status code 1.
> Process Process-1:
> Traceback (most recent call last):
>   File "/app/sdks/python/gen_protos.py", line 104, in generate_proto_files
> from grpc_tools import protoc
> ModuleNotFoundError: No module named 'grpc_tools'
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.7/multiprocessing/process.py", line 297, in 
> _bootstrap
> self.run()
>   File "/usr/local/lib/python3.7/multiprocessing/process.py", line 99, in run
> self._target(*self._args, **self._kwargs)
>   File "/app/sdks/python/gen_protos.py", line 189, in 
> _install_grpcio_tools_and_generate_proto_files
> generate_proto_files()
>   File "/app/sdks/python/gen_protos.py", line 144, in generate_proto_files
> '%s' % ret_code)
> RuntimeError: Protoc returned non-zero status (see logs for details): 1
> Traceback (most recent call last):
>   File "/app/sdks/python/gen_protos.py", line 104, in generate_proto_files
> from grpc_tools import protoc
> ModuleNotFoundError: No module named 'grpc_tools'
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File "setup.py", line 295, in 
> 'mypy': generate_protos_first(mypy),
>   File "/usr/local/lib/python3.7/site-packages/setuptools/__init__.py", line 
> 145, in setup
> return distutils.core.setup(**attrs)
>   File "/usr/local/lib/python3.7/distutils/core.py", line 148, in setup
> dist.run_commands()
>   File "/usr/local/lib/python3.7/distutils/dist.py", line 966, in run_commands
> self.run_command(cmd)
>   File "/usr/local/lib/python3.7/distutils/dist.py", line 985, in run_command
> cmd_obj.run()
>   File "/usr/local/lib/python3.7/site-packages/setuptools/command/sdist.py", 
> line 44, in run
> self.run_command('egg_info')
>   File "/usr/local/lib/python3.7/distutils/cmd.py", line 313, in run_command
> self.distribution.run_command(command)
>   File "/usr/local/lib/python3.7/distutils/dist.py", line 985, in run_command
> cmd_obj.run()
>   File "setup.py", line 220, in run
> gen_protos.generate_proto_files(log=log)
>   File "/app/sdks/python/gen_protos.py", line 121, in generate_proto_files
> raise ValueError("Proto generation failed (see log for details).")
> ValueError: Proto generation failed (see log for details).
> Service 'test' failed to build: The command '/bin/sh -c cd sdks/python && 
> python setup.py sdist && pip install --no-cache-dir $(ls 
> dist/apache-beam-*.tar.gz | tail -n1)[gcp]' returned a non-zero code: 1
> {code}
> https://builds.apache.org/job/beam_PostCommit_Python37/1114/consoleText



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8979) protoc-gen-mypy: program not found or is not executable

2019-12-18 Thread Kamil Wasilewski (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999018#comment-16999018
 ] 

Kamil Wasilewski commented on BEAM-8979:


[https://github.com/apache/beam/pull/10400] was merged and the tests are now 
working. I'll leave this issue unresolved so that we remember it when bringing 
back mypy-protobuf.

> protoc-gen-mypy: program not found or is not executable
> ---
>
> Key: BEAM-8979
> URL: https://issues.apache.org/jira/browse/BEAM-8979
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core, test-failures
>Reporter: Kamil Wasilewski
>Assignee: Chad Dombrova
>Priority: Major
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> In some tests, the `:sdks:python:sdist` task fails due to problems in finding 
> protoc-gen-mypy. The following tests are affected (there might be more):
>  * 
> [https://builds.apache.org/job/beam_LoadTests_Python_37_ParDo_Dataflow_Batch_PR/]
>  * 
> [https://builds.apache.org/job/beam_BiqQueryIO_Write_Performance_Test_Python_Batch/
>  
> |https://builds.apache.org/job/beam_BiqQueryIO_Write_Performance_Test_Python_Batch/]
> Relevant logs:
> {code:java}
> 10:46:32 > Task :sdks:python:sdist FAILED
> 10:46:32 Requirement already satisfied: mypy-protobuf==1.12 in 
> /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_37_ParDo_Dataflow_Batch_PR/src/build/gradleenv/192237/lib/python3.7/site-packages
>  (1.12)
> 10:46:32 beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto 
> but not used.
> 10:46:32 beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto 
> but not used.
> 10:46:32 protoc-gen-mypy: program not found or is not executable
> 10:46:32 --mypy_out: protoc-gen-mypy: Plugin failed with status code 1.
> 10:46:32 
> /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_37_ParDo_Dataflow_Batch_PR/src/build/gradleenv/192237/lib/python3.7/site-packages/setuptools/dist.py:476:
>  UserWarning: Normalizing '2.19.0.dev' to '2.19.0.dev0'
> 10:46:32   normalized_version,
> 10:46:32 Traceback (most recent call last):
> 10:46:32   File "setup.py", line 295, in 
> 10:46:32 'mypy': generate_protos_first(mypy),
> 10:46:32   File 
> "/home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_37_ParDo_Dataflow_Batch_PR/src/build/gradleenv/192237/lib/python3.7/site-packages/setuptools/__init__.py",
>  line 145, in setup
> 10:46:32 return distutils.core.setup(**attrs)
> 10:46:32   File "/usr/lib/python3.7/distutils/core.py", line 148, in setup
> 10:46:32 dist.run_commands()
> 10:46:32   File "/usr/lib/python3.7/distutils/dist.py", line 966, in 
> run_commands
> 10:46:32 self.run_command(cmd)
> 10:46:32   File "/usr/lib/python3.7/distutils/dist.py", line 985, in 
> run_command
> 10:46:32 cmd_obj.run()
> 10:46:32   File 
> "/home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_37_ParDo_Dataflow_Batch_PR/src/build/gradleenv/192237/lib/python3.7/site-packages/setuptools/command/sdist.py",
>  line 44, in run
> 10:46:32 self.run_command('egg_info')
> 10:46:32   File "/usr/lib/python3.7/distutils/cmd.py", line 313, in 
> run_command
> 10:46:32 self.distribution.run_command(command)
> 10:46:32   File "/usr/lib/python3.7/distutils/dist.py", line 985, in 
> run_command
> 10:46:32 cmd_obj.run()
> 10:46:32   File "setup.py", line 220, in run
> 10:46:32 gen_protos.generate_proto_files(log=log)
> 10:46:32   File 
> "/home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_37_ParDo_Dataflow_Batch_PR/src/sdks/python/gen_protos.py",
>  line 144, in generate_proto_files
> 10:46:32 '%s' % ret_code)
> 10:46:32 RuntimeError: Protoc returned non-zero status (see logs for 
> details): 1
> {code}
>  
> This is what I have tried so far to resolve this (without success):
>  * Including the _--plugin=protoc-gen-mypy=\{abs_path_to_executable}_ 
> parameter in the _protoc_ call in gen_protos.py:131 (sketched below)
>  * Appending protoc-gen-mypy's directory to the PATH variable
> I wasn't able to reproduce this error locally.
>  
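A minimal sketch of the first workaround listed above: handing protoc an absolute 
plugin path instead of relying on PATH lookup. This is not the actual 
gen_protos.py code, and the proto path and output directories below are 
illustrative placeholders only:

{code:python}
import shutil

from grpc_tools import protoc

# Resolve an absolute path to the plugin; without --plugin, protoc searches
# PATH for an executable named protoc-gen-mypy.
plugin_path = shutil.which('protoc-gen-mypy')
assert plugin_path, 'protoc-gen-mypy not found on PATH'

args = [
    'grpc_tools.protoc',                    # argv[0] / program name
    '--proto_path=path/to/protos',          # illustrative placeholder
    '--python_out=path/to/generated',       # illustrative placeholder
    '--mypy_out=path/to/generated',         # illustrative placeholder
    '--plugin=protoc-gen-mypy=%s' % plugin_path,
    'beam_fn_api.proto',
]

ret_code = protoc.main(args)
if ret_code:
  raise RuntimeError('Protoc returned non-zero status: %s' % ret_code)
{code}

As noted above, neither workaround resolved the failure on Jenkins, and the error 
could not be reproduced locally.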



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (BEAM-5690) Issue with GroupByKey in BeamSql using SparkRunner

2019-12-18 Thread Etienne Chauchot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-5690?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Etienne Chauchot resolved BEAM-5690.

Fix Version/s: 2.19.0
   Resolution: Fixed

> Issue with GroupByKey in BeamSql using SparkRunner
> --
>
> Key: BEAM-5690
> URL: https://issues.apache.org/jira/browse/BEAM-5690
> Project: Beam
>  Issue Type: Task
>  Components: runner-spark
>Reporter: Kenneth Knowles
>Priority: Major
> Fix For: 2.19.0
>
>  Time Spent: 4.5h
>  Remaining Estimate: 0h
>
> Reported on user@
> {quote}We are trying to set up a pipeline using BeamSql, and the trigger 
> used is the default (AfterWatermark crosses the window). 
> Below is the pipeline:
>   
>KafkaSource (KafkaIO) 
>---> Windowing (FixedWindow 1min)
>---> BeamSql
>---> KafkaSink (KafkaIO)
>  
> We are using Spark Runner for this. 
> The BeamSql query is:
> {code}select Col3, count(*) as count_col1 from PCOLLECTION GROUP BY Col3{code}
> We are grouping by Col3 which is a string. It can hold values string[0-9]. 
>  
> The records are getting emitted out at 1 min to kafka sink, but the output 
> record in kafka is not as expected.
> Below is the output observed: (WST and WET are indicators for window start 
> time and window end time)
> {code}
> {"count_col1":1,"Col3":"string5","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":3,"Col3":"string7","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":2,"Col3":"string8","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":1,"Col3":"string2","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":1,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00 0}
> {code}
> {quote}
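For reference, the expected semantics of that query are a per-key count within 
each one-minute window, emitted once when the watermark passes the end of the 
window; the repeated zero-count rows for "string6" above are presumably what is 
unexpected. A minimal, hypothetical illustration of the same aggregation (not the 
reporter's code: it uses the Python SDK, an in-memory source and the DirectRunner 
instead of BeamSql, KafkaIO and the SparkRunner):

{code:python}
import apache_beam as beam
from apache_beam.transforms.window import FixedWindows, TimestampedValue

with beam.Pipeline() as p:  # DirectRunner by default
  (p
   | beam.Create([
       # (Col3 value, event-time timestamp in seconds); data is made up.
       ('string5', 5), ('string7', 10), ('string7', 20), ('string8', 70)])
   | beam.Map(lambda kv: TimestampedValue(kv[0], kv[1]))  # attach event times
   | beam.WindowInto(FixedWindows(60))                    # 1-minute windows
   | beam.Map(lambda col3: (col3, 1))
   | beam.CombinePerKey(sum)                              # count per Col3 per window
   | beam.Map(print))
{code}

With the default trigger, each (Col3, count) pair should be emitted exactly once 
per window, and only for keys that actually had elements in that window.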



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-5690) Issue with GroupByKey in BeamSql using SparkRunner

2019-12-18 Thread Etienne Chauchot (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-5690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999012#comment-16999012
 ] 

Etienne Chauchot commented on BEAM-5690:


Yes, thanks for pointing that out, [~iemejia]; I forgot to close the related Jira.

> Issue with GroupByKey in BeamSql using SparkRunner
> --
>
> Key: BEAM-5690
> URL: https://issues.apache.org/jira/browse/BEAM-5690
> Project: Beam
>  Issue Type: Task
>  Components: runner-spark
>Reporter: Kenneth Knowles
>Priority: Major
>  Time Spent: 4.5h
>  Remaining Estimate: 0h
>
> Reported on user@
> {quote}We are trying to set up a pipeline using BeamSql, and the trigger 
> used is the default (AfterWatermark crosses the window). 
> Below is the pipeline:
>   
>KafkaSource (KafkaIO) 
>---> Windowing (FixedWindow 1min)
>---> BeamSql
>---> KafkaSink (KafkaIO)
>  
> We are using Spark Runner for this. 
> The BeamSql query is:
> {code}select Col3, count(*) as count_col1 from PCOLLECTION GROUP BY Col3{code}
> We are grouping by Col3 which is a string. It can hold values string[0-9]. 
>  
> The records are getting emitted out at 1 min to kafka sink, but the output 
> record in kafka is not as expected.
> Below is the output observed: (WST and WET are indicators for window start 
> time and window end time)
> {code}
> {"count_col1":1,"Col3":"string5","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":3,"Col3":"string7","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":2,"Col3":"string8","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":1,"Col3":"string2","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":1,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00 0}
> {code}
> {quote}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work logged] (BEAM-8671) Migrate Python version to 3.7

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-8671?focusedWorklogId=361339&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361339
 ]

ASF GitHub Bot logged work on BEAM-8671:


Author: ASF GitHub Bot
Created on: 18/Dec/19 08:47
Start Date: 18/Dec/19 08:47
Worklog Time Spent: 10m 
  Work Description: kamilwu commented on issue #10125: [BEAM-8671] Added 
ParDo test running on Python 3.7
URL: https://github.com/apache/beam/pull/10125#issuecomment-566936037
 
 
   Run Seed Job
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 361339)
Time Spent: 11.5h  (was: 11h 20m)

> Migrate Python version to 3.7
> -
>
> Key: BEAM-8671
> URL: https://issues.apache.org/jira/browse/BEAM-8671
> Project: Beam
>  Issue Type: Sub-task
>  Components: testing
>Reporter: Kamil Wasilewski
>Assignee: Kamil Wasilewski
>Priority: Major
>  Time Spent: 11.5h
>  Remaining Estimate: 0h
>
> Currently, load tests run on Python 2.7. We should migrate to 3.7.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-5690) Issue with GroupByKey in BeamSql using SparkRunner

2019-12-18 Thread Jira


[ 
https://issues.apache.org/jira/browse/BEAM-5690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16998942#comment-16998942
 ] 

Ismaël Mejía commented on BEAM-5690:


The PR for this one is already merged. Can we resolve it, [~echauchot]?

> Issue with GroupByKey in BeamSql using SparkRunner
> --
>
> Key: BEAM-5690
> URL: https://issues.apache.org/jira/browse/BEAM-5690
> Project: Beam
>  Issue Type: Task
>  Components: runner-spark
>Reporter: Kenneth Knowles
>Priority: Major
>  Time Spent: 4.5h
>  Remaining Estimate: 0h
>
> Reported on user@
> {quote}We are trying to set up a pipeline using BeamSql, and the trigger 
> used is the default (AfterWatermark crosses the window). 
> Below is the pipeline:
>   
>KafkaSource (KafkaIO) 
>---> Windowing (FixedWindow 1min)
>---> BeamSql
>---> KafkaSink (KafkaIO)
>  
> We are using Spark Runner for this. 
> The BeamSql query is:
> {code}select Col3, count(*) as count_col1 from PCOLLECTION GROUP BY Col3{code}
> We are grouping by Col3 which is a string. It can hold values string[0-9]. 
>  
> The records are getting emitted out at 1 min to kafka sink, but the output 
> record in kafka is not as expected.
> Below is the output observed: (WST and WET are indicators for window start 
> time and window end time)
> {code}
> {"count_col1":1,"Col3":"string5","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":3,"Col3":"string7","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":2,"Col3":"string8","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":1,"Col3":"string2","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":1,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00   +"}
> {"count_col1":0,"Col3":"string6","WST":"2018-10-09  09-55-00   
> +","WET":"2018-10-09  09-56-00 0}
> {code}
> {quote}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work logged] (BEAM-7949) Add time-based cache threshold support in the data service of the Python SDK harness

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-7949?focusedWorklogId=361334&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361334
 ]

ASF GitHub Bot logged work on BEAM-7949:


Author: ASF GitHub Bot
Created on: 18/Dec/19 08:30
Start Date: 18/Dec/19 08:30
Worklog Time Spent: 10m 
  Work Description: sunjincheng121 commented on issue #10246: [BEAM-7949] 
Add time-based cache threshold support in the data service of the Python SDK 
harness
URL: https://github.com/apache/beam/pull/10246#issuecomment-566930239
 
 
   Thanks for the review @robertwb, I have updated the PR accordingly. 
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 361334)
Time Spent: 2h 40m  (was: 2.5h)

> Add time-based cache threshold support in the data service of the Python SDK 
> harness
> 
>
> Key: BEAM-7949
> URL: https://issues.apache.org/jira/browse/BEAM-7949
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-harness
>Reporter: sunjincheng
>Priority: Major
>  Time Spent: 2h 40m
>  Remaining Estimate: 0h
>
> Currently only a size-based cache threshold is supported in the data service 
> of the Python SDK harness. It should also support a time-based cache 
> threshold. This is very important, especially for streaming jobs, which are 
> sensitive to delay. 
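A minimal, hypothetical sketch of the idea (not the actual SDK harness code; the 
class name and limits are illustrative only): buffered elements are flushed when 
either a size limit or an elapsed-time limit is reached, so a slow stream still 
gets its data pushed downstream promptly.

{code:python}
import time


class SizeOrTimeBuffer(object):
  """Buffers elements and flushes on a size or an elapsed-time threshold."""

  def __init__(self, flush, size_limit=100, time_limit_ms=1000):
    self._flush = flush                       # callback receiving a list of elements
    self._size_limit = size_limit
    self._time_limit_s = time_limit_ms / 1000.0
    self._items = []
    self._last_flush = time.time()

  def add(self, element):
    self._items.append(element)
    expired = time.time() - self._last_flush >= self._time_limit_s
    if len(self._items) >= self._size_limit or expired:
      self.flush()

  def flush(self):
    if self._items:
      self._flush(list(self._items))
    self._items = []
    self._last_flush = time.time()
{code}

A purely size-based buffer only flushes once enough elements arrive, which is 
exactly the latency problem described above for slow streams; the time check 
(or, in a real implementation, a background timer) bounds how long an element can 
sit in the buffer.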



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Work logged] (BEAM-7949) Add time-based cache threshold support in the data service of the Python SDK harness

2019-12-18 Thread ASF GitHub Bot (Jira)


 [ 
https://issues.apache.org/jira/browse/BEAM-7949?focusedWorklogId=361333&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361333
 ]

ASF GitHub Bot logged work on BEAM-7949:


Author: ASF GitHub Bot
Created on: 18/Dec/19 08:26
Start Date: 18/Dec/19 08:26
Worklog Time Spent: 10m 
  Work Description: sunjincheng121 commented on pull request #10246: 
[BEAM-7949] Add time-based cache threshold support in the data service of the 
Python SDK harness
URL: https://github.com/apache/beam/pull/10246#discussion_r359209509
 
 

 ##
 File path: sdks/python/apache_beam/runners/portability/fn_api_runner.py
 ##
 @@ -126,6 +126,9 @@
 # The cache is disabled in production for other runners.
 STATE_CACHE_SIZE = 100
 
+# Time-based flush is enabled in the fn_api_runner for testing.
+DATA_BUFFER_TIME_LIMIT = 1000
 
 Review comment:
   Yes, it's milliseconds.
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 361333)
Time Spent: 2.5h  (was: 2h 20m)

> Add time-based cache threshold support in the data service of the Python SDK 
> harness
> 
>
> Key: BEAM-7949
> URL: https://issues.apache.org/jira/browse/BEAM-7949
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-harness
>Reporter: sunjincheng
>Priority: Major
>  Time Spent: 2.5h
>  Remaining Estimate: 0h
>
> Currently only a size-based cache threshold is supported in the data service 
> of the Python SDK harness. It should also support a time-based cache 
> threshold. This is very important, especially for streaming jobs, which are 
> sensitive to delay. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)