[beam] branch go-sdk updated (e2d7408 -> 5e6db92)

2018-02-13 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a change to branch go-sdk
in repository https://gitbox.apache.org/repos/asf/beam.git.


from e2d7408  Merge pull request #4654: CoGBK fixup - rename file
 add 77847f6  Sickbay flakey KinesisReaderTest
 add 5a5e71c  Merge pull request #4552: [BEAM-3317] Sickbay flakey 
KinesisReaderTest
 add b52c385  [BEAM-3490] Make runtime type checking code runner agnostic.
 add de7bf0a  Direct runner fixes.
 add bddcb0c  Merge pull request #4534 [BEAM-3490] Make runtime type 
checking code runner agnostic.
 add ebf4252  google-java-format
 add 4d5e852  Fix Distinct null pointer error with speculative triggers
 add ff37337  Merge pull request #4536: [BEAM-3423] Fix Distinct null 
pointer error with speculative triggers
 add d0de803  use build $WORKSPACE as pkb temp_dir and update pip and 
setuptools in virtualenv
 add dd9a513  Merge pull request #4537: [BEAM-3480] use build $WORKSPACE as 
pkb temp_dir
 add a01a85b  Move TestCountingSource to appropriate location
 add 76202e0  Merge pull request #4545: [BEAM-3573] Move TestCountingSource 
to appropriate location
 add c3fa9e2  Support argparse-style choices for ValueProvider
 add 4a8a00d  Merge pull request #4518 from mariapython/choices
 add 51d6171  Sickbay ApexRunner gradle WordCountIT
 add 88de0a1  Merge pull request #4551: [BEAM-3583] Sickbay ApexRunner 
gradle WordCountIT
 add ca27144  [BEAM-3249] Make sure that all java projects package tests. 
Also package shaded classes if shading is enabled.
 add 82e5e94  [BEAM-3249] Do not assume build directory is within build/, 
use the project defined build dir.
 add e1b6fb7  Change info to debug statement
 add afa7e86  Change info to debug statement
 add 2fe7169  Fix undefined names: exc_info --> self.exc_info
 add 51da92c  Merge pull request #4559 from cclauss/patch-1
 add a71042a  import logging for line 1163
 add a2bf73f  Merge pull request #4560 [lint] import logging
 add 34eadc5  Encourage a good description in a good spot on a PR 
description.
 add ee99265  Merge pull request #4566: Encourage a good description in a 
good spot on a PR description.
 add 9897be0  [BEAM-3562] Update to Checkstyle 8.7
 add ec7e098  Merge pull request #4522: [BEAM-3562] Update to Checkstyle 8.7
 add ece8709  Changing FileNaming to public to allow for usage in 
lambdas/inheritance outside of the package.
 add e053eb5  Merge pull request #4568: Changing FileIO.Write.FileNaming 
Interface to public
 add 9a2d2a6  Add QueryablePipeline
 add 2b242fe  Merge pull request #4530: Add QueryablePipeline
 add fe2de5e  Split out buffered read and write code from gcsio.
 add e34fee1  Merge pull request #4471: [BEAM-3099] Split out 
BufferedReader and BufferedWriter from gcsio.
 add 884f3e6  Introduces the Wait transform
 add 9cf86bc  Merge pull request #4301: Introduces the Wait transform
 add 39ab03b  [BEAM-3551] Define compiler -parameters flag in the default 
options
 add 6831f2c  Merge pull request #4584: [BEAM-3551] Define compiler 
-parameters flag in the default options
 add ef12700  Reduce the flakiness of the state sampler progress metrics.
 add 537b9b7  Merge pull request #4576 Reduce the flakiness of the state 
sampler progress metrics.
 add 2bbcb12  Move off of deprecated method in Guava.
 add ecd89b8  [SQL] Inherit windowing strategy from the input in Aggregate 
operation
 add 21cc003  Merge pull request #4546: [SQL] Inherit windowing strategy 
from the input in Aggregate operation
 add 0dec2e7  BEAM-3593 - Remove methods that just call super()
 add c66832c  BEAM-3593 - Remove methods that just call super()
 add 1eb9443  Add SdkHarnessClientControlService
 add ae603d2  Update Synchronization in FnApiControlClient
 add 5e6520a  Merge pull request #4569: Add `SdkHarnessClientControlService`
 add e832cfb  Logging deviation from sampling expectation. This will allow 
to track performance variation in statesampler over time.
 add 4f6415b  Merge pull request #4531 from pabloem/log-sampler-deviation
 add 55d8723  Adding a static getter for RuntimeValueProvider.
 add 25887bc  Addressing comments.
 add 5fe88ff  Removing unnecessary code.
 add 504ce70  Merge pull request #4539 from 
pabloem/static-runtimevalueprovider
 add 5c01e85  Add a LocalArtifactStagingLocation
 add c26191d  Add LocalArtifactStagingLocation#forExisting
 add e2432e2  Add an ArtifactRetrievalService interface
 add 7a537b9  Implement a Local ArtifactRetrievalService
 add af864b8  Merge pull request #4422: Implement a Local Artifact 
Retrieval service
 add fd07d72  global INT64_MAX, INT64_MIN to placate linters
 add 1693e7d  Merge pull request #4562 global INT64_MAX, INT64_MIN to 
placate linters
 add 28ce7a5  Adds a ReadAll transform to tfrecordio.
   

[beam] 01/01: Merge pull request #4682: [BEAM-3684] Integrate master to go-sdk and fixup coder urns

2018-02-13 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a commit to branch go-sdk
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 5e6db92862521b0c54e4b3e532ae98883c7f5098
Merge: e2d7408 ed922cb
Author: Chamikara Jayalath 
AuthorDate: Tue Feb 13 23:28:34 2018 -0800

Merge pull request #4682: [BEAM-3684] Integrate master to go-sdk and fixup 
coder urns

 .github/PULL_REQUEST_TEMPLATE.md   |   13 +-
 .gitignore |1 +
 .test-infra/jenkins/common_job_properties.groovy   |4 +
 ...am_PostCommit_Java_ValidatesRunner_Flink.groovy |2 +-
 ...am_PostCommit_Java_ValidatesRunner_Spark.groovy |2 +-
 ...mmit_Python_ValidatesContainer_Dataflow.groovy} |   37 +-
 ...vy => job_beam_PreCommit_Go_GradleBuild.groovy} |   19 +-
 .../job_beam_PreCommit_Java_GradleBuild.groovy |4 +-
 build.gradle   |   12 +-
 build_rules.gradle |  102 +-
 examples/java/build.gradle |   12 +-
 .../apache/beam/examples/WindowedWordCountIT.java  |5 -
 .../complete/game/StatefulTeamScoreTest.java   |2 -
 gradle/wrapper/gradle-wrapper.jar  |  Bin 54712 -> 54333 bytes
 gradle/wrapper/gradle-wrapper.properties   |2 +-
 model/fn-execution/build.gradle|7 -
 .../fn-execution/src/main/proto/beam_fn_api.proto  |9 +-
 model/job-management/build.gradle  |7 -
 model/pipeline/build.gradle|7 -
 .../resources/org/apache/beam/model/common_urns.md |  134 ++
 pom.xml|   60 +-
 runners/apex/build.gradle  |7 -
 .../runners/apex/translation/ParDoTranslator.java  |2 +-
 .../translation/operators/ApexParDoOperator.java   |2 +-
 .../apex/translation/ParDoTranslatorTest.java  |2 +
 runners/core-construction-java/build.gradle|7 -
 .../core/construction/ArtifactServiceStager.java   |   32 +-
 .../runners/core/construction/Environments.java|2 +-
 .../core/construction/ImpulseTranslation.java  |   65 +
 .../core/construction/ModelCoderRegistrar.java |   18 +-
 .../core/construction/PTransformTranslation.java   |   27 +-
 .../beam/runners/core/construction/UrnUtils.java   |   63 +
 .../core/construction/WindowIntoTranslation.java   |9 +-
 .../construction/WindowingStrategyTranslation.java |7 +
 .../core/construction/graph/ExecutableStage.java   |   84 ++
 .../graph/GreedilyFusedExecutableStage.java|  195 +++
 .../graph/GreedyPCollectionFusers.java |  257 
 .../core/construction/graph/PipelineNode.java  |   55 +
 .../core/construction/graph/QueryablePipeline.java |  281 
 .../core/construction/graph}/package-info.java |   12 +-
 .../core/construction/EnvironmentsTest.java|4 +-
 .../core/construction/PTransformMatchersTest.java  |2 +-
 .../core/construction/ParDoTranslationTest.java|4 +-
 .../core/construction/SplittableParDoTest.java |7 +-
 .../runners/core/construction/UrnUtilsTest.java|   42 +-
 .../graph/GreedilyFusedExecutableStageTest.java|  826 +++
 .../construction/graph/QueryablePipelineTest.java  |  389 ++
 .../construction/metrics/MetricFilteringTest.java  |2 -
 runners/core-java/build.gradle |7 -
 .../beam/runners/core/InMemoryStateInternals.java  |   77 +-
 ...TimeBoundedSplittableProcessElementInvoker.java |  100 +-
 .../apache/beam/runners/core/SimpleDoFnRunner.java |8 +-
 .../core/SplittableParDoViaKeyedWorkItems.java |4 +-
 .../core/SplittableProcessElementInvoker.java  |2 +-
 ...BoundedSplittableProcessElementInvokerTest.java |  111 +-
 .../runners/core/SplittableParDoProcessFnTest.java |   34 +-
 .../core/metrics/MetricsContainerStepMapTest.java  |3 +-
 runners/direct-java/build.gradle   |7 -
 .../direct/CopyOnAccessInMemoryStateInternals.java |   10 +-
 .../apache/beam/runners/direct/DirectOptions.java  |8 +
 .../apache/beam/runners/direct/DirectRunner.java   |   15 +-
 .../SplittableProcessElementsEvaluatorFactory.java |6 +-
 .../runners/direct/TransformEvaluatorRegistry.java |4 +-
 .../direct/UnboundedReadEvaluatorFactory.java  |3 -
 .../beam/runners/direct/DirectMetricsTest.java |2 +-
 .../direct/DirectTransformExecutorTest.java|1 -
 .../direct/FlattenEvaluatorFactoryTest.java|   22 +-
 runners/flink/build.gradle |7 -
 .../flink/FlinkBatchTransformTranslators.java  |3 +-
 .../flink/FlinkPipelineExecutionEnvironment.java   |5 +-
 .../flink/FlinkStreamingTransformTranslators.java  |8 +-
 .../flink/PipelineTranslationOptimizer.java|   16 +-
 .../wrappers/streaming/DoFnOperator.java   |   30 +-
 .../wrappers/streaming/SplittableDoFnOperator.java

[jira] [Commented] (BEAM-3622) DirectRunner memory issue with Python SDK

2018-02-13 Thread yuri krnr (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3622?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16363576#comment-16363576
 ] 

yuri krnr commented on BEAM-3622:
-

Cool! Will be looking forward to the results.

> DirectRunner memory issue with Python SDK
> -
>
> Key: BEAM-3622
> URL: https://issues.apache.org/jira/browse/BEAM-3622
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: yuri krnr
>Assignee: Charles Chen
>Priority: Major
>
> After running the pipeline for a while in streaming mode (reading from Pub/Sub 
> and writing to BigQuery, Datastore, and another Pub/Sub), I noticed the 
> process's memory usage growing drastically. Using guppy as a profiler, I got 
> the following results:
> start
> {noformat}
>  INFO *** MemoryReport Heap:
>  Partition of a set of 240208 objects. Total size = 34988840 bytes.
>  Index  Count   % Size   % Cumulative  % Kind (class / dict of class)
>  0  88289  37  8696984  25   8696984  25 str
>  1  5  22  4897352  14  13594336  39 tuple
>  2   5083   2  2790664   8  16385000  47 dict (no owner)
>  3   1939   1  1749656   5  18134656  52 type
>  4699   0  1723272   5  19857928  57 dict of module
>  5  12337   5  1579136   5  21437064  61 types.CodeType
>  6  12403   5  1488360   4  22925424  66 function
>  7   1939   1  1452616   4  24378040  70 dict of type
>  8677   0   709496   2  25087536  72 dict of 0x1e4d880
>  9  25603  11   614472   2  25702008  73 int
> <1103 more rows. Type e.g. '_.more' to view.>
> {noformat}
> after several hours of running
> {noformat}
> INFO *** MemoryReport Heap:
>  Partition of a set of 1255662 objects. Total size = 315029632 bytes.
>  Index  Count   % Size   % Cumulative  % Kind (class / dict of class)
>  0  95554   8 99755056  32  99755056  32 dict of apache_beam.runners.direct.bundle_factory._Bundle
>  1 117943   9 54193192  17 153948248  49 dict (no owner)
>  2 161068  13 27169296   9 181117544  57 unicode
>  3  94571   8 26479880   8 207597424  66 dict of apache_beam.pvalue.PBegin
>  4 126461  10 12715336   4 220312760  70 str
>  5  44374   4 12424720   4 232737480  74 dict of apitools.base.protorpclite.messages.FieldList
>  6  44374   4  6348624   2 239086104  76 apitools.base.protorpclite.messages.FieldList
>  7  95556   8  6115584   2 245201688  78 apache_beam.runners.direct.bundle_factory._Bundle
>  8  94571   8  6052544   2 251254232  80 apache_beam.pvalue.PBegin
>  9  57371   5  5218424   2 256472656  81 tuple
> <1187 more rows. Type e.g. '_.more' to view.>
> {noformat}
>  
> I see that every bundle still sits in memory, along with all of its data. Why 
> aren't they gc-ed?
> What is the gc policy for the dataflow processes?
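For readers who want to reproduce this kind of heap report: a minimal sketch of
the guppy usage implied above, assuming the Python 2 guppy package is installed;
the wrapper function and log label are hypothetical.

```
import logging
from guppy import hpy

_heapy = hpy()

def log_memory_report():
    # heap() partitions all live objects by kind, producing tables like the
    # "MemoryReport Heap" output quoted above.
    logging.info('*** MemoryReport Heap:\n%s', _heapy.heap())
```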



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build became unstable: beam_PostCommit_Java_ValidatesRunner_Apex #3472

2018-02-13 Thread Apache Jenkins Server
See 




[beam-site] 02/02: This closes #385

2018-02-13 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 4cb377a9d36a9f8f3fe77c13e4b13a25dc8f2769
Merge: 297a3ad 63f7ec8
Author: Mergebot 
AuthorDate: Tue Feb 13 22:38:09 2018 -0800

This closes #385

 src/documentation/sdks/python-type-safety.md | 6 --
 1 file changed, 4 insertions(+), 2 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] branch mergebot updated (a574a33 -> 4cb377a)

2018-02-13 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


from a574a33  This closes #386
 add 297a3ad  Prepare repository for deployment.
 new 63f7ec8  Clarify runtime type checking documentation
 new 4cb377a  This closes #385

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 content/documentation/programming-guide/index.html | 37 +++---
 .../documentation/sdks/python-custom-io/index.html | 32 +++
 src/documentation/sdks/python-type-safety.md   |  6 ++--
 3 files changed, 40 insertions(+), 35 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 01/02: Clarify runtime type checking documentation

2018-02-13 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 63f7ec889d71ab7a65a19ae6b1efd404eb30353d
Author: Charles Chen 
AuthorDate: Mon Feb 5 16:28:03 2018 -0800

Clarify runtime type checking documentation
---
 src/documentation/sdks/python-type-safety.md | 6 --
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/src/documentation/sdks/python-type-safety.md 
b/src/documentation/sdks/python-type-safety.md
index ae8fc09..3962fe1 100644
--- a/src/documentation/sdks/python-type-safety.md
+++ b/src/documentation/sdks/python-type-safety.md
@@ -106,16 +106,18 @@ The following are special type hints that don't 
correspond to a class, but rathe
 
 In addition to using type hints for type checking at pipeline construction, 
you can enable runtime type checking to check that actual elements satisfy the 
declared type constraints during pipeline execution.
 
-For example, the following code would pass at both pipeline construction and 
runtime.
+For example, the following pipeline emits elements of the wrong type. 
Depending on the runner implementation, its execution may or may not fail at 
runtime.
 
 ```
 {% github_sample 
/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/snippets_test.py
 tag:type_hints_runtime_off %}```
 
-However, if you enable runtime type checking, the code passes at pipeline 
construction and fails at runtime. To enable runtime type checking, set the 
pipeline option `runtime_type_check` to `True`.
+However, if you enable runtime type checking, the code is guaranteed to fail 
at runtime. To enable runtime type checking, set the pipeline option 
`runtime_type_check` to `True`.
 
 ```
 {% github_sample 
/apache/beam/blob/master/sdks/python/apache_beam/examples/snippets/snippets_test.py
 tag:type_hints_runtime_on %}```
 
+Note that because runtime type checks are done for each `PCollection` element, 
enabling this feature may incur a significant performance penalty. It is 
therefore recommended that runtime type checks are disabled for production 
pipelines.
+
 ## Use of Type Hints in Coders
 
 When your pipeline reads, writes, or otherwise materializes its data, the 
elements in your `PCollection` need to be encoded and decoded to and from byte 
strings. Byte strings are used for intermediate storage, for comparing keys in 
`GroupByKey` operations, and for reading from sources and writing to sinks.
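As an illustration of the option this change documents, here is a minimal sketch
of enabling runtime type checking in the Python SDK (the toy pipeline is
hypothetical; only the `runtime_type_check` option and `with_output_types` come
from the Beam API):

```
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Enable per-element runtime type checks (off by default).
options = PipelineOptions(runtime_type_check=True)

with beam.Pipeline(options=options) as p:
    _ = (
        p
        | beam.Create(['a', 'bb', 'ccc'])
        # The declared output type is verified for every element at runtime.
        | beam.Map(len).with_output_types(int)
    )
```

As the added paragraph notes, these per-element checks can be costly, so they
are usually left disabled for production pipelines.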

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] branch asf-site updated (21477ff -> 297a3ad)

2018-02-13 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


from 21477ff  Prepare repository for deployment.
 add a0bc5ff  Replaces sources/sinks in programming guide with IOs
 add a574a33  This closes #386
 new 297a3ad  Prepare repository for deployment.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 content/documentation/programming-guide/index.html | 37 +++---
 .../documentation/sdks/python-custom-io/index.html | 32 +++
 src/documentation/programming-guide.md | 33 ++-
 3 files changed, 52 insertions(+), 50 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 01/01: Prepare repository for deployment.

2018-02-13 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit 297a3adfbf0250e348333b962830ef68b3093c9c
Author: Mergebot 
AuthorDate: Tue Feb 13 22:29:48 2018 -0800

Prepare repository for deployment.
---
 content/documentation/programming-guide/index.html | 37 +++---
 .../documentation/sdks/python-custom-io/index.html | 32 +++
 2 files changed, 36 insertions(+), 33 deletions(-)

diff --git a/content/documentation/programming-guide/index.html 
b/content/documentation/programming-guide/index.html
index 6db8a56..034c0ca 100644
--- a/content/documentation/programming-guide/index.html
+++ b/content/documentation/programming-guide/index.html
@@ -457,18 +457,14 @@ data within your driver program. From there, PCo
 outputs for each step in your pipeline.
   
   
-Transform: A Transform represents a data processing 
operation, or a step,
-in your pipeline. Every Transform takes 
one or more PCollection objects as
+PTransform: A PTransform represents a data processing 
operation, or a step,
+in your pipeline. Every PTransform 
takes one or more PCollection objects as
 input, performs a processing function that you provide on the elements of that
-PCollection, and produces one or more 
output PCollection objects.
+PCollection, and produces zero or more 
output PCollection objects.
   
   
-I/O Source and Sink: Beam provides Source and Sink APIs to represent
-reading and writing data, respectively. Source encapsulates the code
-necessary to read data into your Beam pipeline from some external source, such
-as cloud file storage or a subscription to a streaming data source. Sink
-likewise encapsulates the code necessary to write the elements of a
-PCollection to an external data 
sink.
+I/O transforms: Beam comes with a number of “IOs” - library PTransforms that
+read or write data to various external storage systems.
   
 
 
@@ -477,16 +473,19 @@ likewise encapsulates the code necessary to write the 
elements of a
 
   Create a Pipeline object and set 
the pipeline execution options, including
 the Pipeline Runner.
-  Create an initial PCollection for 
pipeline data, either using the Source
-API to read data from an external source, or using a Create transform to
+  Create an initial PCollection for 
pipeline data, either using the IOs
+to read data from an external storage system, or using a Create transform to
 build a PCollection from in-memory 
data.
-  Apply Transforms to each PCollection. Transforms can change, filter,
+  Apply PTransforms to each PCollection. Transforms can change, filter,
 group, analyze, or otherwise process the elements in a PCollection. A
 transform creates a new output PCollection without consuming the input
 collection. A typical pipeline applies subsequent transforms to the each 
new
-output PCollection in turn until 
processing is complete.
-  Output the final, transformed PCollection(s), typically using the Sink API
-to write data to an external source.
+output PCollection in turn until 
processing is complete. However, note that
+a pipeline does not have to be a single straight line of transforms applied
+one after another: think of PCollections as variables and PTransforms as
+functions applied to these variables: the shape of the pipeline can be an
+arbitrarily complex processing graph.
+  Use IOs to write the final, transformed PCollection(s) to an external source.
   Run the pipeline using the designated Pipeline 
Runner.
 
 
@@ -2237,7 +2236,7 @@ of the multi-collection types for the relevant type 
parameter.
 
 class ComputeWordLengths(beam.PTransform):
   def expand(self, pcoll):
-# transform logic goes here
+# Transform logic goes here.
 return pcoll | beam.Map(lambda x: len(x))
 
 
@@ -2264,7 +2263,7 @@ value.
 
 class ComputeWordLengths(beam.PTransform):
   def expand(self, pcoll):
-# transform logic goes here
+# Transform logic goes here.
 return pcoll | beam.Map(lambda x: len(x))
 
 
@@ -2290,8 +2289,8 @@ is a useful starting point when you want to write new 
composite PTransforms.
 5. Pipeline I/O
 
 When you create a pipeline, you often need to read data from some external
-source, such as a file in external data sink or a database. Likewise, you may
-want your pipeline to output its result data to a similar external data sink.
+source, such as a file or a database. Likewise, you may
+want your pipeline to output its result data to an external storage system.
 Beam provides read and write transforms for a number of common data storage
 types. If you want your pipeline
 to read from or write to a data storage format that isn’t supported by the
diff --git a/content/documentation/sdks/python-custom-io/index.html 
b/content/documentation/sdks/python-custom-io/index.html
index d038b7e..55b0d6e 100644
--- a/content/documentation/sdks/python-custom-io/index.html
+++ b/content/do

[beam-site] branch mergebot updated (db15bed -> a574a33)

2018-02-13 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


 discard db15bed  This closes #386
 new a574a33  This closes #386

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (db15bed)
\
 N -- N -- N   refs/heads/mergebot (a574a33)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] 01/01: This closes #386

2018-02-13 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit a574a334f593d3ea51f3bc77aed9f8ef863fe18e
Merge: 21477ff a0bc5ff
Author: Mergebot 
AuthorDate: Tue Feb 13 22:22:58 2018 -0800

This closes #386

 src/documentation/programming-guide.md | 33 -
 1 file changed, 16 insertions(+), 17 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


Build failed in Jenkins: beam_PerformanceTests_Python #912

2018-02-13 Thread Apache Jenkins Server
See 


Changes:

[klk] Function interface for Fn API instructions

--
[...truncated 15 B...]
Building remotely on beam3 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0bc6f574ce9a9bebcf68946456d4ac6c36e81911 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0bc6f574ce9a9bebcf68946456d4ac6c36e81911
Commit message: "Merge pull request #4629: Functional interface for Fn API 
instructions"
 > git rev-list 08f3ffa67358edfead752acb9b0fc3bf7e7cbe4a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3991272630736357416.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3775174687141872056.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3885346099635916236.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1829921235289751754.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/43/41/033a273f9a25cb63050a390ee8397acbc7eae2159195d85f06f17e7be45a/setuptools-38.5.1-py2.py3-none-any.whl#md5=908b8b5e50bf429e520b2b5fa1b350e5
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8791032744749066769.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3536919238104151080.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Requirement already satisfied: numpy==1.13.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 22))
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 25))
Requir

[jira] [Assigned] (BEAM-3697) Add errorprone to maven and gradle builds

2018-02-13 Thread Kenneth Knowles (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3697?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-3697:
-

Assignee: Kenneth Knowles

> Add errorprone to maven and gradle builds
> -
>
> Key: BEAM-3697
> URL: https://issues.apache.org/jira/browse/BEAM-3697
> Project: Beam
>  Issue Type: Bug
>  Components: build-system
>Reporter: Eugene Kirpichov
>Assignee: Kenneth Knowles
>Priority: Major
>
> [http://errorprone.info/] is a good static checker that covers a number of 
> bugs not covered by FindBugs or Checkstyle. We use it internally at Google 
> and, when run on the Beam codebase, it occasionally uncovers issues missed 
> during the PR review process.
>  
> It has Maven and Gradle plugins:
> [http://errorprone.info/docs/installation]
> [https://github.com/tbroyer/gradle-errorprone-plugin]
>  
> It would be good to integrate it into our Maven and Gradle builds.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to normal : beam_PostCommit_Python_ValidatesContainer_Dataflow #36

2018-02-13 Thread Apache Jenkins Server
See 




Jenkins build is back to stable : beam_PostCommit_Java_MavenInstall #5958

2018-02-13 Thread Apache Jenkins Server
See 




[beam-site] 01/01: This closes #386

2018-02-13 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a commit to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git

commit db15bedd359b590ee90a53f00917fecc741dbd31
Merge: 21477ff a0bc5ff
Author: Mergebot 
AuthorDate: Tue Feb 13 20:42:45 2018 -0800

This closes #386

 src/documentation/programming-guide.md | 33 -
 1 file changed, 16 insertions(+), 17 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


[beam-site] branch mergebot updated (d910f9b -> db15bed)

2018-02-13 Thread mergebot-role
This is an automated email from the ASF dual-hosted git repository.

mergebot-role pushed a change to branch mergebot
in repository https://gitbox.apache.org/repos/asf/beam-site.git.


from d910f9b  This closes #380
 add 21477ff  Prepare repository for deployment.
 add a0bc5ff  Replaces sources/sinks in programming guide with IOs
 new db15bed  This closes #386

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../2016/10/11/strata-hadoop-world-and-beam.html   |   2 +-
 content/blog/2016/10/20/test-stream.html   |   4 +-
 content/documentation/io/testing/index.html|   2 +-
 content/documentation/programming-guide/index.html |   7 +-
 content/feed.xml   |   4 +-
 .../get-started/mobile-gaming-example/index.html   | 193 +++--
 src/documentation/programming-guide.md |  33 ++--
 7 files changed, 129 insertions(+), 116 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
mergebot-r...@apache.org.


Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #35

2018-02-13 Thread Apache Jenkins Server
See 


Changes:

[klk] Function interface for Fn API instructions

--
[...truncated 89.26 KB...]
copying apache_beam/runners/common.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/common_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/pipeline_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/pipeline_context_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/sdf_common.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/dataflow/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/dataflow_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/dataflow_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/dataflow_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/dataflow_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/ptransform_overrides.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/template_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/test_dataflow_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/internal/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/apiclient.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/apiclient_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/dependency.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/dependency_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/names.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/clients/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients
copying apache_beam/runners/dataflow/internal/clients/dataflow/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_messages.py
 -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_runner_test.py -> 
apache-bea

Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #34

2018-02-13 Thread Apache Jenkins Server
See 


--
[...truncated 92.40 KB...]
copying apache_beam/io/gcp/tests/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/io/gcp/tests
copying apache_beam/io/gcp/tests/bigquery_matcher.py -> 
apache-beam-2.4.0.dev0/apache_beam/io/gcp/tests
copying apache_beam/io/gcp/tests/bigquery_matcher_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/io/gcp/tests
copying apache_beam/io/gcp/tests/utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/io/gcp/tests
copying apache_beam/io/gcp/tests/utils_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/io/gcp/tests
copying apache_beam/metrics/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/metrics
copying apache_beam/metrics/cells.py -> 
apache-beam-2.4.0.dev0/apache_beam/metrics
copying apache_beam/metrics/cells_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/metrics
copying apache_beam/metrics/execution.pxd -> 
apache-beam-2.4.0.dev0/apache_beam/metrics
copying apache_beam/metrics/execution.py -> 
apache-beam-2.4.0.dev0/apache_beam/metrics
copying apache_beam/metrics/execution_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/metrics
copying apache_beam/metrics/metric.py -> 
apache-beam-2.4.0.dev0/apache_beam/metrics
copying apache_beam/metrics/metric_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/metrics
copying apache_beam/metrics/metricbase.py -> 
apache-beam-2.4.0.dev0/apache_beam/metrics
copying apache_beam/options/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/options
copying apache_beam/options/pipeline_options.py -> 
apache-beam-2.4.0.dev0/apache_beam/options
copying apache_beam/options/pipeline_options_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/options
copying apache_beam/options/pipeline_options_validator.py -> 
apache-beam-2.4.0.dev0/apache_beam/options
copying apache_beam/options/pipeline_options_validator_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/options
copying apache_beam/options/value_provider.py -> 
apache-beam-2.4.0.dev0/apache_beam/options
copying apache_beam/options/value_provider_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/options
copying apache_beam/portability/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability
copying apache_beam/portability/common_urns.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability
copying apache_beam/portability/python_urns.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability
copying apache_beam/portability/api/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_artifact_api_pb2.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_artifact_api_pb2_grpc.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_fn_api_pb2.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_fn_api_pb2_grpc.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_job_api_pb2.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_job_api_pb2_grpc.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_provision_api_pb2.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_provision_api_pb2_grpc.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_runner_api_pb2.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/beam_runner_api_pb2_grpc.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/endpoints_pb2.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/endpoints_pb2_grpc.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/standard_window_fns_pb2.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/portability/api/standard_window_fns_pb2_grpc.py -> 
apache-beam-2.4.0.dev0/apache_beam/portability/api
copying apache_beam/runners/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/common.pxd -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/common.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/common_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/pipeline_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/pipeline_context_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/sdf_common.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying a

[jira] [Commented] (BEAM-3690) Dependency Conflict problems: several conflicting classes exist in different JARs

2018-02-13 Thread PandaMonkey (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3690?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16363396#comment-16363396
 ] 

PandaMonkey commented on BEAM-3690:
---

[~lcwik], thanks for your quick reply. To keep the dependencies consistent between 
Gradle and Maven, should I submit a pull request to use mockito-core instead of 
mockito-all in the pom?

> Dependency Conflict problems: several conflicting classes exist in different 
> JARs
> -
>
> Key: BEAM-3690
> URL: https://issues.apache.org/jira/browse/BEAM-3690
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 2.2.0
>Reporter: PandaMonkey
>Assignee: PandaMonkey
>Priority: Major
> Fix For: 2.3.0
>
> Attachments: beam_conflicts.txt
>
>
> Hi, we found that duplicate classes exist in different JARs, and these 
> classes have different features.
> The conflicting JAR pairs are:
> 1. 
> jar-pair:
> 2. 
> jar-pair:
> Some methods exist in only one version of the duplicate classes.
> Because the JVM only loads the class that appears first on the classpath and 
> shadows the other duplicates with the same name, this dependency conflict 
> brings a high risk of "*NoSuchMethodException*" or "*NoSuchMethodError*" 
> issues at runtime. The conflicting details are listed in the attachment. 
> Please take note. Thanks.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Python #911

2018-02-13 Thread Apache Jenkins Server
See 


Changes:

[cclauss] xrange() was removed in Python 3 (en masse)

[cclauss] from six import integer_types (en masse)

[herohde] Also ignore alternative path for gogradle thrift location

[herohde] Remove gogradle manual dependency ordering

[herohde] Lock Go dependency versions

[herohde] Ignore gogradle.lock in rat check

[herohde] Ignore gogradle.lock in rat check for maven

[herohde] Remove bad gogradle.lock files

[dkulp] [BEAM-3581] Make sure calcite gets an appropriate charset PRIOR to any

[lcwik] [BEAM-3626] Add a handler capable of executing a window mapping fn on a

[lcwik] [BEAM-3339] Fix failing post-release test by running groovy from gradle,

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam3 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 08f3ffa67358edfead752acb9b0fc3bf7e7cbe4a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 08f3ffa67358edfead752acb9b0fc3bf7e7cbe4a
Commit message: "Merge pull request #4570 from [BEAM-1251] Migrate away from 
xrange()"
 > git rev-list 1488cb90a69ee2bf3adc43f7157ba3641ed3e04d # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4460490525104742239.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4497488026493015632.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7508811720157752298.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins39080422363089402.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/43/41/033a273f9a25cb63050a390ee8397acbc7eae2159195d85f06f17e7be45a/setuptools-38.5.1-py2.py3-none-any.whl#md5=908b8b5e50bf429e520b2b5fa1b350e5
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3128135787629535571.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4433935954131607820.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarke

[jira] [Commented] (BEAM-2591) Python shim for submitting to FlinkRunner

2018-02-13 Thread Thomas Weise (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-2591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16363360#comment-16363360
 ] 

Thomas Weise commented on BEAM-2591:


The implementation, in my mind, relates more to the operational side of things (an 
extra JVM process that needs to be maintained just for launching the Beam pipelines). 
And yes, it would probably make sense to embed that into whatever pipeline 
management solution exists natively for the runner.

> Python shim for submitting to FlinkRunner
> -
>
> Key: BEAM-2591
> URL: https://issues.apache.org/jira/browse/BEAM-2591
> Project: Beam
>  Issue Type: Sub-task
>  Components: runner-flink, sdk-py-core
>Reporter: Kenneth Knowles
>Priority: Major
>  Labels: portability
>
> Whatever the result of https://s.apache.org/beam-job-api, Python users will 
> need to be able to pass --runner=FlinkRunner and have it work.
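For context, a hedged sketch of the user experience this issue asks for: selecting
the runner purely through a pipeline option (the DirectRunner line below is only a
stand-in; making `--runner=FlinkRunner` work the same way is the point of the issue):

```
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Goal of BEAM-2591: '--runner=FlinkRunner' should be accepted here and just work.
options = PipelineOptions(['--runner=DirectRunner'])

with beam.Pipeline(options=options) as p:
    _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)
```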



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #33

2018-02-13 Thread Apache Jenkins Server
See 


Changes:

[cclauss] xrange() was removed in Python 3 (en masse)

[cclauss] from six import integer_types (en masse)

[dkulp] [BEAM-3581] Make sure calcite gets an appropriate charset PRIOR to any

[lcwik] [BEAM-3339] Fix failing post-release test by running groovy from gradle,

--
[...truncated 94.91 KB...]
copying apache_beam/runners/common.pxd -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/common.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/common_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/pipeline_context.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/pipeline_context_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/sdf_common.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners
copying apache_beam/runners/dataflow/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/dataflow_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/dataflow_metrics_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/dataflow_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/dataflow_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/ptransform_overrides.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/template_runner_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/test_dataflow_runner.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow
copying apache_beam/runners/dataflow/internal/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/apiclient.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/apiclient_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/dependency.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/dependency_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/names.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal
copying apache_beam/runners/dataflow/internal/clients/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients
copying apache_beam/runners/dataflow/internal/clients/dataflow/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_messages.py
 -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying 
apache_beam/runners/dataflow/internal/clients/dataflow/message_matchers_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/internal/clients/dataflow
copying apache_beam/runners/dataflow/native_io/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/iobase_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/dataflow/native_io/streaming_create.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/dataflow/native_io
copying apache_beam/runners/direct/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/bundle_factory.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/clock.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor.py -> 
apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/consumer_tracking_pipeline_visitor_test.py 
-> apache-beam-2.4.0.dev0/apache_beam/runners/direct
copying apache_beam/runners/direct/direct_metrics.py -> 
apache-beam-2.4.0.dev0/apache_beam

[beam] 01/01: Merge pull request #4570 from [BEAM-1251] Migrate away from xrange()

2018-02-13 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 08f3ffa67358edfead752acb9b0fc3bf7e7cbe4a
Merge: 814a80a c635c63
Author: Robert Bradshaw 
AuthorDate: Tue Feb 13 16:02:27 2018 -0800

Merge pull request #4570 from [BEAM-1251] Migrate away from xrange()

[BEAM-1251] Migrate away from xrange()

 sdks/python/apache_beam/examples/complete/estimate_pi.py   |  2 +-
 .../examples/complete/juliaset/juliaset/juliaset.py|  2 +-
 .../apache_beam/examples/cookbook/bigquery_side_input.py   |  2 +-
 sdks/python/apache_beam/examples/snippets/snippets.py  |  2 +-
 sdks/python/apache_beam/io/filebasedsink.py|  8 
 sdks/python/apache_beam/io/filebasedsource_test.py | 10 +-
 sdks/python/apache_beam/io/tfrecordio_test.py  |  4 ++--
 sdks/python/apache_beam/runners/worker/opcounters_test.py  |  8 
 sdks/python/apache_beam/transforms/combiners_test.py   |  4 ++--
 sdks/python/apache_beam/transforms/trigger_test.py |  2 +-
 sdks/python/apache_beam/typehints/native_type_compatibility.py |  2 +-
 11 files changed, 23 insertions(+), 23 deletions(-)
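For readers following the migration, a small sketch of the replacement pattern
(hedged: the surrounding commits use six, so `six.moves` is assumed to be available):

```
# xrange() no longer exists in Python 3. six.moves.range resolves to xrange on
# Python 2 and to the built-in range on Python 3, keeping lazy iteration in both.
from six.moves import range

squares = [i * i for i in range(10)]
```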

diff --cc sdks/python/apache_beam/io/tfrecordio_test.py
index 5bc13ce,191c757..c540cfa
--- a/sdks/python/apache_beam/io/tfrecordio_test.py
+++ b/sdks/python/apache_beam/io/tfrecordio_test.py
@@@ -195,202 -203,127 +195,202 @@@ class TestTFRecordSink(unittest.TestCas
  class TestWriteToTFRecord(TestTFRecordSink):
  
def test_write_record_gzip(self):
 -file_path_prefix = os.path.join(self._new_tempdir(), 'result')
 -with TestPipeline() as p:
 -  input_data = ['foo', 'bar']
 -  _ = p | beam.Create(input_data) | WriteToTFRecord(
 -  file_path_prefix, compression_type=CompressionTypes.GZIP)
 -
 -actual = []
 -file_name = glob.glob(file_path_prefix + '-*')[0]
 -for r in tf.python_io.tf_record_iterator(
 -file_name, options=tf.python_io.TFRecordOptions(
 -tf.python_io.TFRecordCompressionType.GZIP)):
 -  actual.append(r)
 -self.assertEqual(actual, input_data)
 +with TempDir() as temp_dir:
 +  file_path_prefix = temp_dir.create_temp_file('result')
 +  with TestPipeline() as p:
 +input_data = ['foo', 'bar']
 +_ = p | beam.Create(input_data) | WriteToTFRecord(
 +file_path_prefix, compression_type=CompressionTypes.GZIP)
 +
 +  actual = []
 +  file_name = glob.glob(file_path_prefix + '-*')[0]
 +  for r in tf.python_io.tf_record_iterator(
 +  file_name, options=tf.python_io.TFRecordOptions(
 +  tf.python_io.TFRecordCompressionType.GZIP)):
 +actual.append(r)
 +  self.assertEqual(actual, input_data)
  
def test_write_record_auto(self):
 -file_path_prefix = os.path.join(self._new_tempdir(), 'result')
 -with TestPipeline() as p:
 -  input_data = ['foo', 'bar']
 -  _ = p | beam.Create(input_data) | WriteToTFRecord(
 -  file_path_prefix, file_name_suffix='.gz')
 +with TempDir() as temp_dir:
 +  file_path_prefix = temp_dir.create_temp_file('result')
 +  with TestPipeline() as p:
 +input_data = ['foo', 'bar']
 +_ = p | beam.Create(input_data) | WriteToTFRecord(
 +file_path_prefix, file_name_suffix='.gz')
  
 -actual = []
 -file_name = glob.glob(file_path_prefix + '-*.gz')[0]
 -for r in tf.python_io.tf_record_iterator(
 -file_name, options=tf.python_io.TFRecordOptions(
 -tf.python_io.TFRecordCompressionType.GZIP)):
 -  actual.append(r)
 -self.assertEqual(actual, input_data)
 +  actual = []
 +  file_name = glob.glob(file_path_prefix + '-*.gz')[0]
 +  for r in tf.python_io.tf_record_iterator(
 +  file_name, options=tf.python_io.TFRecordOptions(
 +  tf.python_io.TFRecordCompressionType.GZIP)):
 +actual.append(r)
 +  self.assertEqual(actual, input_data)
  
  
 -class TestTFRecordSource(_TestCaseWithTempDirCleanUp):
 -
 -  def _write_file(self, path, base64_records):
 -record = binascii.a2b_base64(base64_records)
 -with open(path, 'wb') as f:
 -  f.write(record)
 -
 -  def _write_file_gzip(self, path, base64_records):
 -record = binascii.a2b_base64(base64_records)
 -with gzip.GzipFile(path, 'wb') as f:
 -  f.write(record)
 +class TestReadFromTFRecord(unittest.TestCase):
  
def test_process_single(self):
 -path = os.path.join(self._new_tempdir(), 'result')
 -self._write_file(path, FOO_RECORD_BASE64)
 -with TestPipeline() as p:
 -  result = (p
 -| beam.io.Read(
 -_TFRecordSource(
 -path,
 -coder=coders.BytesCoder(),
 -compression_type=CompressionTypes.AUTO,
 -validate=True)))
 -  asse

[beam] branch master updated (814a80a -> 08f3ffa)

2018-02-13 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 814a80a  Merge pull request #4571 from [BEAM-1251] Use 
six.integer_types
 add c635c63  xrange() was removed in Python 3 (en masse)
 new 08f3ffa  Merge pull request #4570 from [BEAM-1251] Migrate away from 
xrange()

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/examples/complete/estimate_pi.py   |  2 +-
 .../examples/complete/juliaset/juliaset/juliaset.py|  2 +-
 .../apache_beam/examples/cookbook/bigquery_side_input.py   |  2 +-
 sdks/python/apache_beam/examples/snippets/snippets.py  |  2 +-
 sdks/python/apache_beam/io/filebasedsink.py|  8 
 sdks/python/apache_beam/io/filebasedsource_test.py | 10 +-
 sdks/python/apache_beam/io/tfrecordio_test.py  |  4 ++--
 sdks/python/apache_beam/runners/worker/opcounters_test.py  |  8 
 sdks/python/apache_beam/transforms/combiners_test.py   |  4 ++--
 sdks/python/apache_beam/transforms/trigger_test.py |  2 +-
 sdks/python/apache_beam/typehints/native_type_compatibility.py |  2 +-
 11 files changed, 23 insertions(+), 23 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


[beam] branch master updated (8207df6 -> 814a80a)

2018-02-13 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 8207df6  Merge pull request #4564: [BEAM-3581] Make sure calcite gets 
an appropriate charset PRIOR to any static initializers
 add e6e875d  from six import integer_types (en masse)
 new 814a80a  Merge pull request #4571 from [BEAM-1251] Use 
six.integer_types

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/io/filebasedsource.py | 6 --
 sdks/python/apache_beam/io/filesystem.py  | 4 +++-
 sdks/python/apache_beam/io/range_trackers.py  | 8 +---
 sdks/python/apache_beam/io/range_trackers_test.py | 5 -
 sdks/python/apache_beam/io/textio.py  | 4 +++-
 5 files changed, 19 insertions(+), 8 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


[beam] 01/01: Merge pull request #4571 from [BEAM-1251] Use six.integer_types

2018-02-13 Thread robertwb
This is an automated email from the ASF dual-hosted git repository.

robertwb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 814a80a4aa6dcce69d0705bc1506a3c98d0b0817
Merge: 8207df6 e6e875d
Author: Robert Bradshaw 
AuthorDate: Tue Feb 13 16:01:27 2018 -0800

Merge pull request #4571 from [BEAM-1251] Use six.integer_types

from six import integer_types (en masse)

 sdks/python/apache_beam/io/filebasedsource.py | 6 --
 sdks/python/apache_beam/io/filesystem.py  | 4 +++-
 sdks/python/apache_beam/io/range_trackers.py  | 8 +---
 sdks/python/apache_beam/io/range_trackers_test.py | 5 -
 sdks/python/apache_beam/io/textio.py  | 4 +++-
 5 files changed, 19 insertions(+), 8 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
rober...@apache.org.


[jira] [Created] (BEAM-3699) RecordTimestamp should be the default Watermark in KafkaIO

2018-02-13 Thread Xu Mingmin (JIRA)
Xu Mingmin created BEAM-3699:


 Summary: RecordTimestamp should be the default Watermark in KafkaIO
 Key: BEAM-3699
 URL: https://issues.apache.org/jira/browse/BEAM-3699
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-extensions
Reporter: Xu Mingmin
Assignee: Xu Mingmin
 Fix For: 2.4.0


Currently, the priority order for obtaining the watermark {{Instant}} in KafkaIO is:

{code}
getWatermarkFn().apply(curRecord)
  getTimestampFn().apply(record)
    Instant.now()
{code}

I propose changing it as below, to leverage {{KafkaRecord.timestamp}} when no 
{{WatermarkFn()}} or {{TimestampFn()}} is available:
{code}
getWatermarkFn().apply(curRecord)
  getTimestampFn().apply(record)
KafkaRecord(Beam.KafkaIO).timestamp
{code}

This is equivalent to:
{code}
getWatermarkFn().apply(curRecord)
  getTimestampFn().apply(record)
KafkaRawRecord(Kafka_client).timestamp
  Instant.now()
{code}
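
For illustration, a minimal Java sketch of the proposed fallback order. The 
class name, constructor shape, and the non-positive-timestamp check are 
assumptions made for this sketch; the actual KafkaIO reader internals differ:
{code}
import org.apache.beam.sdk.io.kafka.KafkaRecord;
import org.apache.beam.sdk.transforms.SerializableFunction;
import org.joda.time.Instant;

// Hypothetical helper showing the proposed fallback order; not the real
// KafkaIO reader code.
class ProposedWatermarkFallback<K, V> {
  private final SerializableFunction<KafkaRecord<K, V>, Instant> watermarkFn;  // may be null
  private final SerializableFunction<KafkaRecord<K, V>, Instant> timestampFn;  // may be null

  ProposedWatermarkFallback(
      SerializableFunction<KafkaRecord<K, V>, Instant> watermarkFn,
      SerializableFunction<KafkaRecord<K, V>, Instant> timestampFn) {
    this.watermarkFn = watermarkFn;
    this.timestampFn = timestampFn;
  }

  Instant watermarkFor(KafkaRecord<K, V> record) {
    if (watermarkFn != null) {
      return watermarkFn.apply(record);           // 1. user-supplied watermark fn
    }
    if (timestampFn != null) {
      return timestampFn.apply(record);           // 2. user-supplied timestamp fn
    }
    // Assumption for this sketch: a non-positive value means the broker did
    // not attach a timestamp to the record.
    if (record.getTimestamp() > 0) {
      return new Instant(record.getTimestamp());  // 3. proposed: Kafka record timestamp
    }
    return Instant.now();                         // 4. last resort: processing time
  }
}
{code}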

[~rangadi] any comments?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Merge pull request #4564: [BEAM-3581] Make sure calcite gets an appropriate charset PRIOR to any static initializers

2018-02-13 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 8207df6f0cce501ef0727fa066b233143043dc8e
Merge: 54c39cd 6bc1ad5
Author: Kenn Knowles 
AuthorDate: Tue Feb 13 15:37:35 2018 -0800

Merge pull request #4564: [BEAM-3581] Make sure calcite gets an appropriate 
charset PRIOR to any static initializers

 sdks/java/extensions/sql/build.gradle |  8 
 sdks/java/extensions/sql/pom.xml  | 13 -
 2 files changed, 20 insertions(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[beam] branch master updated (54c39cd -> 8207df6)

2018-02-13 Thread kenn
This is an automated email from the ASF dual-hosted git repository.

kenn pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 54c39cd  [BEAM-3339] Fix failing post-release test by running groovy 
from gradle, instead of as a command line
 add 6bc1ad5  [BEAM-3581] Make sure calcite gets an appropriate charset 
PRIOR to any static initializers
 new 8207df6  Merge pull request #4564: [BEAM-3581] Make sure calcite gets 
an appropriate charset PRIOR to any static initializers

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/java/extensions/sql/build.gradle |  8 
 sdks/java/extensions/sql/pom.xml  | 13 -
 2 files changed, 20 insertions(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
k...@apache.org.


[beam] branch master updated: [BEAM-3339] Fix failing post-release test by running groovy from gradle, instead of as a command line + add apex, flink, and spark local tests

2018-02-13 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new 951a6dd  [BEAM-3339] Fix failing post-release test by running groovy 
from gradle, instead of as a command line + add apex, flink, and spark local 
tests
 new 54c39cd  [BEAM-3339] Fix failing post-release test by running groovy 
from gradle, instead of as a command line
951a6dd is described below

commit 951a6dd7d7a52b6ef517ebd3cd9bb420ed00b745
Author: Alan Myrvold 
AuthorDate: Wed Feb 7 21:13:48 2018 -0800

[BEAM-3339] Fix failing post-release test by running groovy from gradle, 
instead of as a command line + add apex, flink, and spark local tests
---
 .../job_beam_PostRelease_NightlySnapshot.groovy| 22 +++--
 build_rules.gradle | 40 +-
 release/build.gradle   | 36 +
 .../main/groovy/QuickstartArchetype.groovy}| 32 ++--
 release/{ => src/main/groovy}/TestScripts.groovy   | 93 --
 .../src/main/groovy/quickstart-java-apex.groovy| 45 +++
 .../main/groovy/quickstart-java-dataflow.groovy| 54 +
 .../main/groovy}/quickstart-java-direct.groovy | 31 ++--
 .../main/groovy/quickstart-java-flinklocal.groovy  | 43 ++
 .../src/main/groovy/quickstart-java-spark.groovy   | 43 ++
 runners/apex/build.gradle  |  3 +
 runners/direct-java/build.gradle   |  3 +
 runners/flink/build.gradle |  3 +
 runners/google-cloud-dataflow-java/build.gradle|  5 ++
 runners/spark/build.gradle |  3 +
 settings.gradle|  1 +
 16 files changed, 372 insertions(+), 85 deletions(-)

diff --git a/.test-infra/jenkins/job_beam_PostRelease_NightlySnapshot.groovy 
b/.test-infra/jenkins/job_beam_PostRelease_NightlySnapshot.groovy
index 60abf9e..1da9d2c 100644
--- a/.test-infra/jenkins/job_beam_PostRelease_NightlySnapshot.groovy
+++ b/.test-infra/jenkins/job_beam_PostRelease_NightlySnapshot.groovy
@@ -31,10 +31,10 @@ job('beam_PostRelease_NightlySnapshot') {
 
   parameters {
 stringParam('snapshot_version',
-'2.3.0-SNAPSHOT',
+'',
 'Version of the repository snapshot to install')
 stringParam('snapshot_url',
-'https://repository.apache.org/content/repositories/snapshots',
+'',
 'Repository URL to install from')
   }
 
@@ -42,11 +42,21 @@ job('beam_PostRelease_NightlySnapshot') {
   common_job_properties.setPostCommit(
   delegate,
   '0 11 * * *',
-  false,
-  'd...@beam.apache.org')
+  false)
+
+
+  // Allows triggering this build against pull requests.
+  common_job_properties.enablePhraseTriggeringFromPullRequest(
+  delegate,
+  './gradlew :release:runQuickstartsJava',
+  'Run Dataflow PostRelease')
 
   steps {
-// Run a quickstart from 
https://beam.apache.org/get-started/quickstart-java/
-shell('cd ' + common_job_properties.checkoutDir + '/release && groovy 
quickstart-java-direct.groovy')
+// Run a quickstart from 
https://beam.apache.org/get-started/quickstart-java
+gradle {
+  rootBuildScriptDir(common_job_properties.checkoutDir)
+  tasks(':release:runQuickstartsJava')
+  switches('-Pver=$snapshot_version -Prepourl=$snapshot_url')
+}
   }
 }
diff --git a/build_rules.gradle b/build_rules.gradle
index f7df03e..a8e521a 100644
--- a/build_rules.gradle
+++ b/build_rules.gradle
@@ -38,7 +38,7 @@ println "Applying build_rules.gradle to $project.name"
 // We use the project.path as the group name to make this mapping unique since
 // we have a few projects with the same name.
 group = project.path
-version = "2.3.0-SNAPSHOT"
+version = "2.4.0-SNAPSHOT"
 
 // Define the default set of repositories for all builds.
 repositories {
@@ -484,3 +484,41 @@ ext.applyAvroNature = {
   println "applyAvroNature with " + (it ? "$it" : "default configuration") + " 
for project $project.name"
   apply plugin: "com.commercehub.gradle.plugin.avro"
 }
+
+// A class defining the set of configurable properties for 
createJavaQuickstartValidationTask
+class JavaQuickstartConfiguration {
+  // Name for the quickstart is required.
+  // Used both for the test name runQuickstartJava${name}
+  // and also for the script name, quickstart-java-${name}.toLowerCase().
+  String name
+
+  // gcpProject sets the gcpProject argument when executing the quickstart.
+  String gcpProject
+
+  // gcsBucket sets the gcsProject argument when executing the quickstart.
+  String gcsBucket
+}
+
+// Creates a task to run the quickstart for a runner.
+// Releases version and URL, can be overriden for a RC release with
+// ./gradlew :release:runQuickstartJava -Pver=2.3.0 
-Prepourl=https://rep

Jenkins build became unstable: beam_PostCommit_Java_MavenInstall #5956

2018-02-13 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #32

2018-02-13 Thread Apache Jenkins Server
e-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/opcodes.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typecheck.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/utils/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.pxd -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/counters_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/plugin.py -> apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/processes.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/processes_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/profiler.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/proto_utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/retry.py -> apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/retry_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
Writing apache-beam-2.4.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.4.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nocapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
--runner=TestDataflowRunner \
--project=$PROJECT \
--worker_harness_container_image=$CONTAINER:$TAG \
--staging_location=$GCS_LOCATION/staging-validatesrunner-test \
--temp_location=$GCS_LOCATION/temp-validatesrunner-test \
--output=$GCS_LOCATION/output \
--sdk_location=$SDK_LOCATION \
--num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/container/local/lib/python2.7/site-packages/setuptools/dist.py>:355:
 UserWarning: Normalizing '2.4.0.dev' to '2.4.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>:166:
 DeprecationWarning: object() takes no parameters
  super(GcsIO, cls).__new__(cls, storage_client))
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok

--
Ran 1 test in 404.136s

OK

# Delete the container locally and remotely
docker rmi $CONTAINER:$TAG
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20180213-214812
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python@sha256:b9bed2b7ca5e51caee16ae06f288e890e00e

[jira] [Resolved] (BEAM-3626) Support remapping the main input window to side input window inside the Java SDK harness

2018-02-13 Thread Luke Cwik (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3626?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Luke Cwik resolved BEAM-3626.
-
   Resolution: Fixed
Fix Version/s: 2.4.0

> Support remapping the main input window to side input window inside the Java 
> SDK harness
> 
>
> Key: BEAM-3626
> URL: https://issues.apache.org/jira/browse/BEAM-3626
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-java-harness
>Reporter: Luke Cwik
>Assignee: Luke Cwik
>Priority: Major
>  Labels: portability
> Fix For: 2.4.0
>
>  Time Spent: 2h 40m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Merge pull request #4654: CoGBK fixup - rename file

2018-02-13 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a commit to branch go-sdk
in repository https://gitbox.apache.org/repos/asf/beam.git

commit e2d7408d916b3edebcb118decb682f5313e8011d
Merge: 5cb4ac6 8e8d5fb
Author: Chamikara Jayalath 
AuthorDate: Tue Feb 13 12:59:47 2018 -0800

Merge pull request #4654: CoGBK fixup - rename file

 sdks/go/pkg/beam/core/runtime/graphx/{cobgk.go => cogbk.go} | 0
 1 file changed, 0 insertions(+), 0 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


[beam] branch go-sdk updated (5cb4ac6 -> e2d7408)

2018-02-13 Thread chamikara
This is an automated email from the ASF dual-hosted git repository.

chamikara pushed a change to branch go-sdk
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 5cb4ac6  Dot rendering improvement
 add 8e8d5fb  Fixing filename.
 new e2d7408  Merge pull request #4654: CoGBK fixup - rename file

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/go/pkg/beam/core/runtime/graphx/{cobgk.go => cogbk.go} | 0
 1 file changed, 0 insertions(+), 0 deletions(-)
 rename sdks/go/pkg/beam/core/runtime/graphx/{cobgk.go => cogbk.go} (100%)

-- 
To stop receiving notification emails like this one, please contact
chamik...@apache.org.


[beam] branch master updated (de425e3 -> 350aed5)

2018-02-13 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from de425e3  [BEAM-3457] Improve Go gradle setup
 add 346acf5  [BEAM-3626] Add a handler capable of executing a window 
mapping fn on a stream of windows.
 new 350aed5  [BEAM-3626] Add a handler capable of executing a window 
mapping fn on a stream of windows.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../resources/org/apache/beam/model/common_urns.md |   1 +
 .../org/apache/beam/fn/harness/MapFnRunner.java| 120 +
 .../beam/fn/harness/WindowMappingFnRunner.java |  67 
 .../apache/beam/fn/harness/MapFnRunnerTest.java|  97 +
 .../beam/fn/harness/WindowMappingFnRunnerTest.java |  68 
 5 files changed, 353 insertions(+)
 create mode 100644 
sdks/java/harness/src/main/java/org/apache/beam/fn/harness/MapFnRunner.java
 create mode 100644 
sdks/java/harness/src/main/java/org/apache/beam/fn/harness/WindowMappingFnRunner.java
 create mode 100644 
sdks/java/harness/src/test/java/org/apache/beam/fn/harness/MapFnRunnerTest.java
 create mode 100644 
sdks/java/harness/src/test/java/org/apache/beam/fn/harness/WindowMappingFnRunnerTest.java

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] 01/01: [BEAM-3626] Add a handler capable of executing a window mapping fn on a stream of windows.

2018-02-13 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 350aed58bb72f3988ae5ed0c03b4919a1daa0551
Merge: de425e3 346acf5
Author: Lukasz Cwik 
AuthorDate: Tue Feb 13 12:58:39 2018 -0800

[BEAM-3626] Add a handler capable of executing a window mapping fn on a 
stream of windows.

 .../resources/org/apache/beam/model/common_urns.md |   1 +
 .../org/apache/beam/fn/harness/MapFnRunner.java| 120 +
 .../beam/fn/harness/WindowMappingFnRunner.java |  67 
 .../apache/beam/fn/harness/MapFnRunnerTest.java|  97 +
 .../beam/fn/harness/WindowMappingFnRunnerTest.java |  68 
 5 files changed, 353 insertions(+)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[jira] [Resolved] (BEAM-3646) Add comments about appropriate use of DoFn.Teardown

2018-02-13 Thread Thomas Groh (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3646?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Groh resolved BEAM-3646.
---
   Resolution: Fixed
Fix Version/s: 2.4.0

> Add comments about appropriate use of DoFn.Teardown
> ---
>
> Key: BEAM-3646
> URL: https://issues.apache.org/jira/browse/BEAM-3646
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Critical
> Fix For: 2.4.0
>
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> Because the {{Teardown}} method has no relation to the atomicity of 
> processing and committing of output, it is EXTREMELY DANGEROUS to use it to 
> flush outputs; buffered data there is extremely likely to never be flushed. If 
> a DoFn instance with buffered data is lost (for example, via worker/machine 
> failure) after the runner has committed the result of processing that input, 
> the data is lost.
>  
> Not documenting this can lead users to believe (especially when running a 
> batch pipeline) that their data will be flushed on pipeline completion. This 
> is very dangerous behavior that we do not warn about sufficiently.
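
For reference, a hedged Java sketch of the pattern the new comments should 
point users to: flush buffered output in {{@FinishBundle}}, which runs before 
the bundle's results are committed, and keep {{@Teardown}} strictly for 
releasing resources. {{ExternalSink}} below is a hypothetical stub added so 
the sketch is self-contained:
{code}
import java.util.ArrayList;
import java.util.List;
import org.apache.beam.sdk.transforms.DoFn;

class BufferedWriteFn extends DoFn<String, Void> {

  // Hypothetical external sink, stubbed so the sketch compiles on its own.
  static class ExternalSink {
    static void writeBatch(List<String> batch) { /* e.g. an RPC to a service */ }
  }

  private transient List<String> buffer;

  @StartBundle
  public void startBundle() {
    buffer = new ArrayList<>();
  }

  @ProcessElement
  public void processElement(ProcessContext c) {
    buffer.add(c.element());
  }

  @FinishBundle
  public void finishBundle() {
    // Safe: if this flush fails, the bundle fails and the runner retries it,
    // so no data is silently dropped.
    ExternalSink.writeBatch(buffer);
    buffer = new ArrayList<>();
  }

  @Teardown
  public void teardown() {
    // Do NOT flush here: a lost worker never reaches this point, and buffered
    // data from already-committed bundles would be lost. Release resources only.
  }
}
{code}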



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] 01/01: Dot rendering improvement

2018-02-13 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch go-sdk
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 5cb4ac6e71b3b41eed9c93f548d4b6e26a3ac55c
Merge: 8cb9500 0dcbb0b
Author: Lukasz Cwik 
AuthorDate: Tue Feb 13 12:54:41 2018 -0800

Dot rendering improvement

 sdks/go/pkg/beam/core/util/dot/dot.go | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] branch go-sdk updated (8cb9500 -> 5cb4ac6)

2018-02-13 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch go-sdk
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 8cb9500  Merge pull request #4624
 add 0dcbb0b  Improve rendering of DOT diagrams.
 new 5cb4ac6  Dot rendering improvement

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/go/pkg/beam/core/util/dot/dot.go | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] 01/01: [BEAM-3457] Improve Go gradle setup

2018-02-13 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit de425e3c2018b2fb17141330eb6fd232481eb8a3
Merge: 1488cb9 785c9f0
Author: Lukasz Cwik 
AuthorDate: Tue Feb 13 12:52:32 2018 -0800

[BEAM-3457] Improve Go gradle setup

 build.gradle   |   1 +
 build_rules.gradle |  29 +-
 pom.xml|   1 +
 runners/gcp/gcemd/build.gradle |  14 +
 runners/gcp/gcsproxy/build.gradle  |  14 +
 sdks/go/gogradle.lock  | 697 +
 sdks/java/container/build.gradle   |   1 +
 sdks/python/container/build.gradle |   1 +
 8 files changed, 732 insertions(+), 26 deletions(-)


-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


[beam] branch master updated (1488cb9 -> de425e3)

2018-02-13 Thread lcwik
This is an automated email from the ASF dual-hosted git repository.

lcwik pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 1488cb9  Merge pull request #4637
 add c1f2017  Also ignore alternative path for gogradle thrift location
 add af5b010  Remove gogradle manual dependency ordering
 add 4c6e0c0  Lock Go dependency versions
 add d78c6f5  Ignore gogradle.lock in rat check
 add 6d3aa9e  Ignore gogradle.lock in rat check for maven
 add 785c9f0  Remove bad gogradle.lock files
 new de425e3  [BEAM-3457] Improve Go gradle setup

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 build.gradle   |   1 +
 build_rules.gradle |  29 +-
 pom.xml|   1 +
 runners/gcp/gcemd/build.gradle |  14 +
 runners/gcp/gcsproxy/build.gradle  |  14 +
 sdks/go/gogradle.lock  | 697 +
 sdks/java/container/build.gradle   |   1 +
 sdks/python/container/build.gradle |   1 +
 8 files changed, 732 insertions(+), 26 deletions(-)
 create mode 100644 sdks/go/gogradle.lock

-- 
To stop receiving notification emails like this one, please contact
lc...@apache.org.


Build failed in Jenkins: beam_PostRelease_NightlySnapshot #45

2018-02-13 Thread Apache Jenkins Server
See 


--
[...truncated 3.14 MB...]
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: Fetching spark://127.0.0.1:42408/jars/chill-java-0.8.0.jar with timestamp 
1518554640638
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logError
SEVERE: Exception in task 2.0 in stage 0.0 (TID 2)
java.io.IOException: Connection from /127.0.0.1:42408 closed
at 
org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
at 
org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
at 
io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
at 
io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:278)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
at 
io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
at 
io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
at 
org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
at 
io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1329)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
at 
io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:908)
at 
io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:744)
at 
io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at 
io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:445)
at 
io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at 
io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
at java.lang.Thread.run(Thread.java:748)

Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 2.0 in stage 0.0 (TID 2) on localhost, executor driver: 
java.io.IOException (Connection from /127.0.0.1:42408 closed) [duplicate 2]
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Feb 13, 2018 8:44:04 PM org.apache.spark.network.client.TransportClientFactory 
createClient
INFO: Found inactive connection to /127.0.0.1:42408, creating a new one.
Feb 13, 2018 8:44:04 PM org.apache.spark.network.client.TransportClientFactory 
createClient
INFO: Successfully created connection to /127.0.0.1:42408 after 29 ms (0 ms 
spent in bootstraps)
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: Fetching spark://127.0.0.1:42408/jars/chill-java-0.8.0.jar to 
/tmp/spark-7667752e-4151-4200-b918-edecb77a6aa4/userFiles-3b7d76b4-9e6c-4753-b027-5fe77a7949f2/fetchFileTemp2334279996979975636.tmp
Feb 13, 2018 8:44:04 PM org.apache.spark.network.server

[jira] [Closed] (BEAM-65) SplittableDoFn

2018-02-13 Thread Eugene Kirpichov (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-65?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Kirpichov closed BEAM-65.

   Resolution: Fixed
Fix Version/s: 2.2.0

SDF has been available in the Beam model and implemented in the Direct and 
Dataflow streaming runners for a while. The remaining work can be tracked 
separately - there is no clear completion criterion for this JIRA that would 
justify keeping it open.

> SplittableDoFn
> --
>
> Key: BEAM-65
> URL: https://issues.apache.org/jira/browse/BEAM-65
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model
>Reporter: Daniel Halperin
>Assignee: Eugene Kirpichov
>Priority: Minor
> Fix For: 2.2.0
>
>
> SplittableDoFn is a proposed enhancement to the Beam model for "dynamically 
> splittable work".
> Among other things, it would allow a unified implementation of 
> bounded/unbounded sources with dynamic work rebalancing, and the ability to 
> express multiple scalable steps (e.g., global expansion -> file sizing & 
> parsing -> splitting files into independently-processable blocks) via 
> composition rather than inheritance.
> This would make it much easier to implement many types of sources and to 
> modify and reuse existing sources. It would also improve the scalability of 
> the Beam model by moving things like splitting a source from the control 
> plane (where it is today -- glob -> List sent over service APIs) into the 
> data plane (PCollection -> PCollection -> ...).
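
As an editorial illustration of "composition rather than inheritance", here is 
a hedged Java sketch in which each scalable step is an ordinary ParDo. The 
names ({{ExpandGlobFn}}, {{ReadFileFn}}) and the input glob are hypothetical, 
and in the real design the final reading step would be a splittable DoFn 
rather than the plain DoFn shown here:
{code}
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;

public class ComposedReadPipeline {
  static class ExpandGlobFn extends DoFn<String, String> {
    @ProcessElement
    public void process(ProcessContext c) {
      // Hypothetical expansion of a glob into concrete file names.
      c.output(c.element() + "/part-00000");
    }
  }

  static class ReadFileFn extends DoFn<String, String> {
    @ProcessElement
    public void process(ProcessContext c) {
      // In the real design this step would be a splittable DoFn, so the runner
      // could split one large file into independently processed blocks.
      c.output("record-from-" + c.element());
    }
  }

  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    p.apply(Create.of("gs://bucket/input/*"))        // hypothetical input glob
        .apply("ExpandGlob", ParDo.of(new ExpandGlobFn()))
        .apply("ReadBlocks", ParDo.of(new ReadFileFn()));
    p.run().waitUntilFinish();
  }
}
{code}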



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Closed] (BEAM-2607) Enforce that SDF must return stop() after a failed tryClaim() call

2018-02-13 Thread Eugene Kirpichov (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-2607?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Kirpichov closed BEAM-2607.
--
   Resolution: Fixed
Fix Version/s: 2.3.0

This was recently fixed.

> Enforce that SDF must return stop() after a failed tryClaim() call
> --
>
> Key: BEAM-2607
> URL: https://issues.apache.org/jira/browse/BEAM-2607
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Eugene Kirpichov
>Assignee: Eugene Kirpichov
>Priority: Major
> Fix For: 2.3.0
>
>
> https://github.com/apache/beam/pull/3360 reintroduces 
> DoFn.ProcessContinuation with some refinements to its semantics - see 
> https://issues.apache.org/jira/browse/BEAM-2447.
> One of the refinements is that, if the ProcessElement call makes an 
> unsuccessful tryClaim() call on the RestrictionTracker, it MUST return stop().
> The goal of this JIRA is to enforce this automatically. Right now that is not 
> possible because tryClaim() is not formally a method on RestrictionTracker 
> (only the concrete classes provide it, not the base class), so runners cannot 
> hook into it.
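
A minimal Java sketch of the contract described above, assuming the 
post-#3360 API shape with an {{OffsetRange}}-based restriction; the way the 
initial restriction is derived from the element is hypothetical:
{code}
import org.apache.beam.sdk.io.range.OffsetRange;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;

class ReadRangeFn extends DoFn<String, Long> {

  @ProcessElement
  public ProcessContinuation processElement(ProcessContext c, OffsetRangeTracker tracker) {
    for (long position = tracker.currentRestriction().getFrom(); ; ++position) {
      if (!tracker.tryClaim(position)) {
        // A failed claim means the runner has split off the remainder (or the
        // range is exhausted); the only legal response is stop().
        return ProcessContinuation.stop();
      }
      c.output(position);
    }
  }

  @GetInitialRestriction
  public OffsetRange getInitialRestriction(String element) {
    // Hypothetical: derive the range size from the element, e.g. a shard's record count.
    return new OffsetRange(0, Long.parseLong(element));
  }
}
{code}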



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3698) Support SDF over Fn API

2018-02-13 Thread Eugene Kirpichov (JIRA)
Eugene Kirpichov created BEAM-3698:
--

 Summary: Support SDF over Fn API
 Key: BEAM-3698
 URL: https://issues.apache.org/jira/browse/BEAM-3698
 Project: Beam
  Issue Type: Bug
  Components: beam-model, runner-core
Reporter: Eugene Kirpichov
Assignee: Eugene Kirpichov


SDF is the only API for emitting unbounded data in Fn API. This issue is about 
supporting SDF in an SDK harness, with its checkpointing controlled by the 
runner harness.

Umbrella issue: includes design and implementation in various SDK languages.

First practically useful goal in sight: Kafka source implemented in the Python 
SDK, running over Fn API using the Dataflow runner.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3698) Support SDF over Fn API

2018-02-13 Thread Eugene Kirpichov (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3698?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16362952#comment-16362952
 ] 

Eugene Kirpichov commented on BEAM-3698:


CC: [~herohde]

> Support SDF over Fn API
> ---
>
> Key: BEAM-3698
> URL: https://issues.apache.org/jira/browse/BEAM-3698
> Project: Beam
>  Issue Type: Bug
>  Components: beam-model, runner-core
>Reporter: Eugene Kirpichov
>Assignee: Eugene Kirpichov
>Priority: Major
>
> SDF is the only API for emitting unbounded data in Fn API. This issue is 
> about supporting SDF in an SDK harness, with its checkpointing controlled by 
> the runner harness.
> Umbrella issue: includes design and implementation in various SDK languages.
> First practically useful goal in sight: Kafka source implemented in the 
> Python SDK, running over Fn API using the Dataflow runner.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is back to stable : beam_PostCommit_Java_MavenInstall #5955

2018-02-13 Thread Apache Jenkins Server
See 




[jira] [Assigned] (BEAM-3697) Add errorprone to maven and gradle builds

2018-02-13 Thread Davor Bonaci (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3697?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Davor Bonaci reassigned BEAM-3697:
--

Assignee: (was: Davor Bonaci)

> Add errorprone to maven and gradle builds
> -
>
> Key: BEAM-3697
> URL: https://issues.apache.org/jira/browse/BEAM-3697
> Project: Beam
>  Issue Type: Bug
>  Components: build-system
>Reporter: Eugene Kirpichov
>Priority: Major
>
> [http://errorprone.info/] is a good static checker that covers a number of 
> bugs not covered by FindBugs or Checkstyle. We use it internally at Google 
> and, when run on the Beam codebase, it occasionally uncovers issues missed 
> during the PR review process.
>  
> It has Maven and Gradle plugins:
> [http://errorprone.info/docs/installation]
> [https://github.com/tbroyer/gradle-errorprone-plugin]
>  
> It would be good to integrate it into our Maven and Gradle builds.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3697) Add errorprone to maven and gradle builds

2018-02-13 Thread Eugene Kirpichov (JIRA)
Eugene Kirpichov created BEAM-3697:
--

 Summary: Add errorprone to maven and gradle builds
 Key: BEAM-3697
 URL: https://issues.apache.org/jira/browse/BEAM-3697
 Project: Beam
  Issue Type: Bug
  Components: build-system
Reporter: Eugene Kirpichov
Assignee: Davor Bonaci


[http://errorprone.info/] is a good static checker that covers a number of bugs 
not covered by FindBugs or Checkstyle. We use it internally at Google and, when 
run on the Beam codebase, it occasionally uncovers issues missed during the 
PR review process.

 

It has Maven and Gradle plugins:

[http://errorprone.info/docs/installation]

[https://github.com/tbroyer/gradle-errorprone-plugin]

 

It would be good to integrate it into our Maven and Gradle builds.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Closed] (BEAM-3647) Default Coder/Reading Coder From File

2018-02-13 Thread Eugene Kirpichov (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Kirpichov closed BEAM-3647.
--
   Resolution: Duplicate
Fix Version/s: 2.1.0

I believe this is a duplicate of BEAM-3615, which you filed recently and where 
I provided a response that I think solves your problem. Please reopen this 
issue or that one if my suggestion does not solve your problem, or comment 
here or ask a question on StackOverflow or the user@ mailing list if it is 
unclear how to use it.

> Default Coder/Reading Coder From File 
> --
>
> Key: BEAM-3647
> URL: https://issues.apache.org/jira/browse/BEAM-3647
> Project: Beam
>  Issue Type: New Feature
>  Components: beam-model, dsl-sql
>Affects Versions: 2.2.0
>Reporter: Kishan Kumar
>Priority: Major
> Fix For: 2.1.0
>
>
> *Requirement*: the same template, with the same logic, needs to run against 
> data from different tables (example below).
>  
> *Need*: either a default coder that treats all fields as strings when 
> reading the data, or a dynamic option to read a coder definition from a JSON 
> file on GCS (whose location can be passed via a ValueProvider) and parse the 
> data based on it at runtime.
>  
>  
> *Example*: table 1 has columns (NAME, CLASS, ROLL, SUB_PRICE) and table 2 
> has columns (NAME, ROLL, SUB, TEST_MARKS).
>  
> Both tables are simply sorted by roll number, so if the coder could be read 
> at runtime, the same template could be reused for different tables.
>  
> Such support would make this kind of job much easier.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (BEAM-3696) MQTT IO should compute watermark and ack messages outside of finalizeCheckpoint method

2018-02-13 Thread Eugene Kirpichov (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3696?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Kirpichov reassigned BEAM-3696:
--

Assignee: Jean-Baptiste Onofré  (was: Reuven Lax)

> MQTT IO should compute watermark and ack messages outside of 
> finalizeCheckpoint method
> --
>
> Key: BEAM-3696
> URL: https://issues.apache.org/jira/browse/BEAM-3696
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Affects Versions: 2.2.0
> Environment: - Flink - beam-runners-flink_2.10:2.2.0
> - Beam and related jars - 2.2.0
>Reporter: Maxim Kolchin
>Assignee: Jean-Baptiste Onofré
>Priority: Major
>
> I'm experiencing a situation where an incoming message isn't acknowledged 
> (so after some time the broker resends it) and the watermark is not updated, 
> even though new messages keep arriving continuously.
> After some time I discovered that this happens because finalizeCheckpoint is 
> not being called.
> I took a look at the Pubsub IO implementation and found that it anticipates 
> this situation and does not compute the watermark or ack messages in 
> finalizeCheckpoint. Here is the comment about that: 
> [https://github.com/apache/beam/blob/master/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubUnboundedSource.java#L289]
> Should MQTT IO do the same?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-3622) DirectRunner memory issue with Python SDK

2018-02-13 Thread Charles Chen (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-3622?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16362826#comment-16362826
 ] 

Charles Chen commented on BEAM-3622:


Hey Yuri, thank you for reporting this issue.  It does look like there's a 
memory leak of Bundle objects.  I will take a look.

> DirectRunner memory issue with Python SDK
> -
>
> Key: BEAM-3622
> URL: https://issues.apache.org/jira/browse/BEAM-3622
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
>Reporter: yuri krnr
>Assignee: Charles Chen
>Priority: Major
>
> After running the pipeline for a while in streaming mode (reading from 
> Pub/Sub and writing to BigQuery, Datastore and another Pub/Sub), I noticed 
> drastic memory usage by the process. Using guppy as a profiler, I got the 
> following results:
> start
> {noformat}
>  INFO *** MemoryReport Heap:
>  Partition of a set of 240208 objects. Total size = 34988840 bytes.
>  Index  Count   % Size   % Cumulative  % Kind (class / dict of class)
>  0  88289  37  8696984  25   8696984  25 str
>  1  5  22  4897352  14  13594336  39 tuple
>  2   5083   2  2790664   8  16385000  47 dict (no owner)
>  3   1939   1  1749656   5  18134656  52 type
>  4699   0  1723272   5  19857928  57 dict of module
>  5  12337   5  1579136   5  21437064  61 types.CodeType
>  6  12403   5  1488360   4  22925424  66 function
>  7   1939   1  1452616   4  24378040  70 dict of type
>  8677   0   709496   2  25087536  72 dict of 0x1e4d880
>  9  25603  11   614472   2  25702008  73 int
> <1103 more rows. Type e.g. '_.more' to view.>
> {noformat}
> after several hours of running
> {noformat}
> INFO *** MemoryReport Heap:
>  Partition of a set of 1255662 objects. Total size = 315029632 bytes.
>  Index  Count   % Size   % Cumulative  % Kind (class / dict of class)
>  0  95554   8 99755056  32  99755056  32 dict of
>  
> apache_beam.runners.direct.bundle_factory._Bundle
>  1 117943   9 54193192  17 153948248  49 dict (no owner)
>  2 161068  13 27169296   9 181117544  57 unicode
>  3  94571   8 26479880   8 207597424  66 dict of apache_beam.pvalue.PBegin
>  4 126461  10 12715336   4 220312760  70 str
>  5  44374   4 12424720   4 232737480  74 dict of 
> apitools.base.protorpclite.messages.FieldList
>  6  44374   4  6348624   2 239086104  76 
> apitools.base.protorpclite.messages.FieldList
>  7  95556   8  6115584   2 245201688  78 
> apache_beam.runners.direct.bundle_factory._Bundle
>  8  94571   8  6052544   2 251254232  80 apache_beam.pvalue.PBegin
>  9  57371   5  5218424   2 256472656  81 tuple
> <1187 more rows. Type e.g. '_.more' to view.>
> {noformat}
>  
> I see that every bundle, and all of its data, still sits in memory. Why 
> aren't they GC-ed?
> What is the GC policy for the Dataflow processes?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_Python #910

2018-02-13 Thread Apache Jenkins Server
See 


Changes:

[tgroh] Add Javadoc on how Teardown is best-effort

[aromanenko.dev] [BEAM-3637] HBaseIOTest - random table names for every test

--
[...truncated 502 B...]
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1488cb90a69ee2bf3adc43f7157ba3641ed3e04d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1488cb90a69ee2bf3adc43f7157ba3641ed3e04d
Commit message: "Merge pull request #4637"
 > git rev-list c14fab0f66374e572e5b0681fe3b652dff6185de # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1692649245349387994.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3105326470608179786.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3295457143109017029.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4335577843083099762.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/43/41/033a273f9a25cb63050a390ee8397acbc7eae2159195d85f06f17e7be45a/setuptools-38.5.1-py2.py3-none-any.whl#md5=908b8b5e50bf429e520b2b5fa1b350e5
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe /tmp/jenkins980100382995723754.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6346206865245695054.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
:318:
 SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name 
Indication) extension to TLS is not available on this platform. This may cause 
the server to present an incorrect TLS certificate, which can cause validation 
failures. You can upgrade to a newer version of Python to solve this. For more 
information, see 
https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning


Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #5954

2018-02-13 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #31

2018-02-13 Thread Apache Jenkins Server
-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typecheck.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/utils/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.pxd -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/counters_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/plugin.py -> apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/processes.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/processes_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/profiler.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/proto_utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/retry.py -> apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/retry_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
Writing apache-beam-2.4.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.4.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nocapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
--runner=TestDataflowRunner \
--project=$PROJECT \
--worker_harness_container_image=$CONTAINER:$TAG \
--staging_location=$GCS_LOCATION/staging-validatesrunner-test \
--temp_location=$GCS_LOCATION/temp-validatesrunner-test \
--output=$GCS_LOCATION/output \
--sdk_location=$SDK_LOCATION \
--num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/container/local/lib/python2.7/site-packages/setuptools/dist.py>:355:
 UserWarning: Normalizing '2.4.0.dev' to '2.4.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>:166:
 DeprecationWarning: object() takes no parameters
  super(GcsIO, cls).__new__(cls, storage_client))
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok

--
Ran 1 test in 419.047s

OK

# Delete the container locally and remotely
docker rmi $CONTAINER:$TAG
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20180213-162112
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python@sha256:0579ef4c85762a0320e553aca4b6585fd283e98bce4203a1f4466c2756673144
Deleted: sha256:b19a770ae4dce64e8d70b07f494f250a8feef5bb672c34e114c3b0541fc2598d
Deleted: sha256:1d03b645237384ea7f013be218220b240d3827891075dd28058c0dab5ab6e29d
Deleted: sha256:e59ad756d94cc39e9f14fc8ef592c97508bb753db64360424debcfc

Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #5953

2018-02-13 Thread Apache Jenkins Server
See 




[beam] branch release-2.3.0 updated: [BEAM-3692] Remove maven deploy plugin configuration with skip in the hadoop-input-format IO module

2018-02-13 Thread jbonofre
This is an automated email from the ASF dual-hosted git repository.

jbonofre pushed a commit to branch release-2.3.0
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/release-2.3.0 by this push:
 new 67b5e1b  [BEAM-3692] Remove maven deploy plugin configuration with 
skip in the hadoop-input-format IO module
67b5e1b is described below

commit 67b5e1bab25d284cdac2127b47f44acc8e83499e
Author: Jean-Baptiste Onofré 
AuthorDate: Mon Feb 12 17:37:08 2018 +0100

[BEAM-3692] Remove maven deploy plugin configuration with skip in the 
hadoop-input-format IO module
---
 sdks/java/io/hadoop-input-format/pom.xml | 7 ---
 1 file changed, 7 deletions(-)

diff --git a/sdks/java/io/hadoop-input-format/pom.xml 
b/sdks/java/io/hadoop-input-format/pom.xml
index 0fbd13a..f4aa818 100644
--- a/sdks/java/io/hadoop-input-format/pom.xml
+++ b/sdks/java/io/hadoop-input-format/pom.xml
@@ -39,13 +39,6 @@
   none
 
   
-      <plugin>
-        <groupId>org.apache.maven.plugins</groupId>
-        <artifactId>maven-deploy-plugin</artifactId>
-        <configuration>
-          <skip>true</skip>
-        </configuration>
-      </plugin>
 
   
 

-- 
To stop receiving notification emails like this one, please contact
jbono...@apache.org.


Build failed in Jenkins: beam_PostCommit_Python_ValidatesContainer_Dataflow #30

2018-02-13 Thread Apache Jenkins Server
-> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typecheck.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/typehints
copying apache_beam/utils/__init__.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.pxd -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/counters_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/plugin.py -> apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/processes.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/processes_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/profiler.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/proto_utils.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/retry.py -> apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/retry_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> 
apache-beam-2.4.0.dev0/apache_beam/utils
Writing apache-beam-2.4.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.4.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nocapture \
  --processes=1 \
  --process-timeout=900 \
  --test-pipeline-options=" \
--runner=TestDataflowRunner \
--project=$PROJECT \
--worker_harness_container_image=$CONTAINER:$TAG \
--staging_location=$GCS_LOCATION/staging-validatesrunner-test \
--temp_location=$GCS_LOCATION/temp-validatesrunner-test \
--output=$GCS_LOCATION/output \
--sdk_location=$SDK_LOCATION \
--num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/container/local/lib/python2.7/site-packages/setuptools/dist.py>:355:
 UserWarning: Normalizing '2.4.0.dev' to '2.4.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesContainer_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/gcsio.py>:166:
 DeprecationWarning: object() takes no parameters
  super(GcsIO, cls).__new__(cls, storage_client))
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok

--
Ran 1 test in 398.962s

OK

# Delete the container locally and remotely
docker rmi $CONTAINER:$TAG
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20180213-153222
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/python@sha256:9e53593fe2f4001a54571aed68dfbabfebe12b19b9f9965f3917f1a302741678
Deleted: sha256:2e8ffc5a67299d10c4d7faa33cea1f359923812df6cf9e2d9d7ce35639fc197b
Deleted: sha256:21da98ddc2c62246862e43b0e841c9fa65fc40b37d03b19a824adcbde3fb7675
Deleted: sha256:c75ab5f361ae2d0ee7cae188c6b8f

[beam] 01/01: Merge pull request #4637

2018-02-13 Thread tgroh
This is an automated email from the ASF dual-hosted git repository.

tgroh pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 1488cb90a69ee2bf3adc43f7157ba3641ed3e04d
Merge: 23ea278 c63315d
Author: Thomas Groh 
AuthorDate: Tue Feb 13 09:49:01 2018 -0600

Merge pull request #4637

[BEAM-3646] Add Javadoc on how Teardown is best-effort

 .../java/org/apache/beam/sdk/transforms/DoFn.java | 19 +--
 .../java/org/apache/beam/sdk/transforms/ParDo.java|  5 +
 2 files changed, 18 insertions(+), 6 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
tg...@apache.org.
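
For readers following the Javadoc change above: in the Beam Java SDK, @Teardown is 
invoked on a best-effort basis, so a runner may never call it (for instance if a 
worker fails), and anything required for correct pipeline output belongs in 
@FinishBundle instead. A minimal sketch of that split, with a hypothetical 
MyServiceClient standing in for a real external connection:

import org.apache.beam.sdk.transforms.DoFn;

public class WriteToServiceFn extends DoFn<String, Void> {

  // Hypothetical client type standing in for a real external connection.
  private transient MyServiceClient client;

  @Setup
  public void setup() {
    // Expensive resource, reused across bundles on this DoFn instance.
    client = MyServiceClient.connect();
  }

  @ProcessElement
  public void processElement(ProcessContext c) {
    client.buffer(c.element());
  }

  @FinishBundle
  public void finishBundle() {
    // Correctness-critical work (flushing buffered writes) belongs here,
    // because FinishBundle runs for every successfully processed bundle.
    client.flush();
  }

  @Teardown
  public void teardown() {
    // Best-effort cleanup only: the runner may never call this (e.g. if the
    // worker crashes), so nothing here may be required for correct output.
    client.close();
  }
}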


[beam] branch master updated (23ea278 -> 1488cb9)

2018-02-13 Thread tgroh
This is an automated email from the ASF dual-hosted git repository.

tgroh pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 23ea278  Merge pull request #4664: [BEAM-3637] Fix for HBaseIOTest - 
random table names for every test
 add c63315d  Add Javadoc on how Teardown is best-effort
 new 1488cb9  Merge pull request #4637

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../java/org/apache/beam/sdk/transforms/DoFn.java | 19 +--
 .../java/org/apache/beam/sdk/transforms/ParDo.java|  5 +
 2 files changed, 18 insertions(+), 6 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
tg...@apache.org.


[jira] [Resolved] (BEAM-3637) HBaseIOTest methods do not clean up tables

2018-02-13 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-3637?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía resolved BEAM-3637.

   Resolution: Fixed
Fix Version/s: 2.4.0

> HBaseIOTest methods do not clean up tables
> --
>
> Key: BEAM-3637
> URL: https://issues.apache.org/jira/browse/BEAM-3637
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Reporter: Kenneth Knowles
>Assignee: Alexey Romanenko
>Priority: Minor
>  Labels: beginner, newbie, starter
> Fix For: 2.4.0
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] branch master updated (c14fab0 -> 23ea278)

2018-02-13 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from c14fab0  Merge pull request #4671: Two fixes to common URN handling
 add 3b8dad4  [BEAM-3637] HBaseIOTest - random table names for every test
 new 23ea278  Merge pull request #4664: [BEAM-3637] Fix for HBaseIOTest - 
random table names for every test

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../org/apache/beam/sdk/io/hbase/HBaseIOTest.java  | 49 +++---
 1 file changed, 33 insertions(+), 16 deletions(-)
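
As a rough illustration of the fix described in that commit title (this is not 
the actual patch; the helper name and naming scheme are assumptions), a test 
could derive a fresh HBase table name per run so tests never reuse each other's 
tables:

import java.util.UUID;
import org.apache.hadoop.hbase.TableName;

final class TestTableNames {
  private TestTableNames() {}

  // Hypothetical helper: the prefix identifies the test class, the UUID makes
  // the name unique so no cross-test cleanup is needed.
  static TableName randomTableName(String prefix) {
    return TableName.valueOf(
        prefix + "_" + UUID.randomUUID().toString().replace("-", ""));
  }
}

// Usage inside a test method:
//   TableName table = TestTableNames.randomTableName("HBaseIOTest");
//   // each test method then creates and writes to its own table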

-- 
To stop receiving notification emails like this one, please contact
ieme...@apache.org.


[beam] 01/01: Merge pull request #4664: [BEAM-3637] Fix for HBaseIOTest - random table names for every test

2018-02-13 Thread iemejia
This is an automated email from the ASF dual-hosted git repository.

iemejia pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 23ea278b877f5f0cb3b08c04e83a4655b48bfdca
Merge: c14fab0 3b8dad4
Author: Ismaël Mejía 
AuthorDate: Tue Feb 13 16:04:34 2018 +0100

Merge pull request #4664: [BEAM-3637] Fix for HBaseIOTest - random table 
names for every test

 .../org/apache/beam/sdk/io/hbase/HBaseIOTest.java  | 49 +++---
 1 file changed, 33 insertions(+), 16 deletions(-)

-- 
To stop receiving notification emails like this one, please contact
ieme...@apache.org.


Jenkins build is back to normal : beam_PostCommit_Python_ValidatesContainer_Dataflow #29

2018-02-13 Thread Apache Jenkins Server
See 




[jira] [Updated] (BEAM-3696) MQTT IO should compute watermark and ack messages outside of finalizeCheckpoint method

2018-02-13 Thread Maxim Kolchin (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-3696?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maxim Kolchin updated BEAM-3696:

Affects Version/s: (was: 2.3.0)

> MQTT IO should compute watermark and ack messages outside of 
> finalizeCheckpoint method
> --
>
> Key: BEAM-3696
> URL: https://issues.apache.org/jira/browse/BEAM-3696
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-extensions
>Affects Versions: 2.2.0
> Environment: - Flink - beam-runners-flink_2.10:2.2.0
> - Beam and related jars - 2.2.0
>Reporter: Maxim Kolchin
>Assignee: Reuven Lax
>Priority: Major
>
> I'm experiencing a situation in which an incoming message is not acknowledged 
> (so the broker eventually redelivers it) and the watermark is not advanced, 
> even though new messages keep arriving.
> After some time I discovered that this happens because finalizeCheckpoint is 
> never called.
> I took a look at the Pubsub IO implementation and found that it anticipates 
> this situation and does not compute the watermark or ack messages in 
> finalizeCheckpoint. Here is the comment about that: 
> [https://github.com/apache/beam/blob/master/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubUnboundedSource.java#L289]
> Should MQTT IO do the same?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (BEAM-3696) MQTT IO should compute watermark and ack messages outside of finalizeCheckpoint method

2018-02-13 Thread Maxim Kolchin (JIRA)
Maxim Kolchin created BEAM-3696:
---

 Summary: MQTT IO should compute watermark and ack messages outside 
of finalizeCheckpoint method
 Key: BEAM-3696
 URL: https://issues.apache.org/jira/browse/BEAM-3696
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-extensions
Affects Versions: 2.2.0, 2.3.0
 Environment: - Flink - beam-runners-flink_2.10:2.2.0
- Beam and related jars - 2.2.0
Reporter: Maxim Kolchin
Assignee: Reuven Lax


I'm experiencing a situation in which an incoming message is not acknowledged 
(so the broker eventually redelivers it) and the watermark is not advanced, even 
though new messages keep arriving.

After some time I discovered that this happens because finalizeCheckpoint is 
never called.

I took a look at the Pubsub IO implementation and found that it anticipates this 
situation and does not compute the watermark or ack messages in 
finalizeCheckpoint. Here is the comment about that: 
[https://github.com/apache/beam/blob/master/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/pubsub/PubsubUnboundedSource.java#L289]

Should MQTT IO do the same?
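
One generic way to sketch the pattern that comment describes (an illustration 
with invented names such as AckTracker, not Beam's Pubsub or MQTT code) is a 
checkpoint mark whose finalizeCheckpoint() only records which messages are safe 
to acknowledge, leaving the actual acks and the watermark to the reader:

import java.io.IOException;
import java.util.List;
import org.apache.beam.sdk.io.UnboundedSource;

class SafeToAckCheckpointMark implements UnboundedSource.CheckpointMark {

  /** Hypothetical reader-side collaborator that performs the real acks. */
  interface AckTracker {
    void markSafeToAck(List<String> messageIds);
  }

  // Not serialized with the checkpoint; null after the mark is decoded on
  // another worker, which is exactly why acking cannot live only here.
  private final transient AckTracker tracker;
  private final List<String> messageIds;

  SafeToAckCheckpointMark(AckTracker tracker, List<String> messageIds) {
    this.tracker = tracker;
    this.messageIds = messageIds;
  }

  @Override
  public void finalizeCheckpoint() throws IOException {
    // Best-effort: this may never run. Only record which IDs are safe to ack;
    // do not ack here and do not advance the watermark here.
    if (tracker != null) {
      tracker.markSafeToAck(messageIds);
    }
  }
}

In such a design the reader's advance() loop performs the acks recorded by the 
tracker, and getWatermark() is derived from the timestamps of emitted records, 
so a late or missing finalizeCheckpoint call only delays redelivery handling 
instead of stalling the watermark.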



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #5952

2018-02-13 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_Python #909

2018-02-13 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam3 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c14fab0f66374e572e5b0681fe3b652dff6185de (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c14fab0f66374e572e5b0681fe3b652dff6185de
Commit message: "Merge pull request #4671: Two fixes to common URN handling"
 > git rev-list c14fab0f66374e572e5b0681fe3b652dff6185de # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3234956588939664031.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6170907383706268796.sh
+ rm -rf .env
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8650295179194438737.sh
+ virtualenv .env --system-site-packages
New python executable in .env/bin/python
Installing setuptools, pip...done.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3786709867960281125.sh
+ .env/bin/pip install --upgrade setuptools pip
Downloading/unpacking setuptools from 
https://pypi.python.org/packages/43/41/033a273f9a25cb63050a390ee8397acbc7eae2159195d85f06f17e7be45a/setuptools-38.5.1-py2.py3-none-any.whl#md5=908b8b5e50bf429e520b2b5fa1b350e5
Downloading/unpacking pip from 
https://pypi.python.org/packages/b6/ac/7015eb97dc749283ffdec1c3a88ddb8ae03b8fad0f0e611408f196358da3/pip-9.0.1-py2.py3-none-any.whl#md5=297dbd16ef53bcef0447d245815f5144
Installing collected packages: setuptools, pip
  Found existing installation: setuptools 2.2
Uninstalling setuptools:
  Successfully uninstalled setuptools
  Found existing installation: pip 1.5.4
Uninstalling pip:
  Successfully uninstalled pip
Successfully installed setuptools pip
Cleaning up...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5724309519482874741.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5870978030713612065.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages 
(from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 21))
Requirement already satisfied: numpy==1.13.3 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 22))
Requirement already satisfied: functools32 in 
/usr/local/lib/python2.7/dist-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in 
/home/jenkins/.local/lib/python2.7/site-packages (from -r 
PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: six in /usr/local