Build failed in Jenkins: beam_PerformanceTests_Python #1443

2018-09-15 Thread Apache Jenkins Server
See 


Changes:

[github] Fix typo in the guide: "amouint" -> "amount"

[boyuanz] Add more test cases in BigQueryToTable

[qinyeli] Interactive Beam -- adding edges by node pair

[pablo] Adding Autocomplete IT for Java

[pablo] Fix spotless

[boyuanz] Exclude BigQueryToTableIT tests from direct runner

--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam15 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 89c91dbd5bcc1ca4603475bd6ad86a4d7cc228d2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 89c91dbd5bcc1ca4603475bd6ad86a4d7cc228d2
Commit message: "Merge pull request #6339 from regata/master"
 > git rev-list --no-walk b4d1ef316a0b00f5e0616ad0a067b841d05d703c # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3309589846031017727.sh
+ rm -rf 

[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8722178410808414666.sh
+ rm -rf 

[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6417506739128643674.sh
+ virtualenv 

New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6387603277225739211.sh
+ 

 install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./env/.perfkit_env/lib/python2.7/site-packages (40.2.0)
Requirement already up-to-date: pip in 
./env/.perfkit_env/lib/python2.7/site-packages (18.0)
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins5763949806454530458.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git 

Cloning into 
'
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3094672524833530379.sh
+ 

 install -r 

Collecting absl-py (from -r 

 (line 14))
Collecting jinja2>=2.7 (from -r 

 (line 15))
  Using cached 
https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl
Requirement already satisfied: setuptools in 
./env/.perfkit_env/lib/python2.7/site-packages (from -r 

 (line 16)) (40.2.0)
Collecting colorlog[windows]==2.6.0 (from -r 

 (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r 

 (line 18))
Collecting futures>=3.0.3 (from -r 


Build failed in Jenkins: beam_PostCommit_Python_PVR_Flink_Gradle #29

2018-09-15 Thread Apache Jenkins Server
See 


--
[...truncated 6.25 MB...]
[grpc-default-executor-0] WARN sdk_worker_main._load_main_session - No session 
file found: /tmp/staged/pickled_main_session. Functions defined in __main__ 
(interactive session) may fail. 
[grpc-default-executor-0] INFO sdk_worker_main.main - Python sdk harness 
started with pipeline_options: {'runner': u'None', 'streaming': True, 
'experiments': [u'beam_fn_api'], 'sdk_location': u'container', 'job_name': 
u'test_windowing_1537078343.7', 'job_endpoint': u'localhost:45375'}
[grpc-default-executor-0] INFO sdk_worker.__init__ - Creating insecure control 
channel.
[grpc-default-executor-0] INFO sdk_worker.__init__ - Control channel 
established.
[grpc-default-executor-0] INFO sdk_worker.__init__ - Initializing SDKHarness 
with 12 workers.
[grpc-default-executor-0] INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - 
Beam Fn Control client connected with id 1
[grpc-default-executor-0] INFO sdk_worker.run - Got work 1
[grpc-default-executor-0] INFO sdk_worker.run - Got work 2
[grpc-default-executor-0] INFO sdk_worker.run - Got work 3
[grpc-default-executor-0] INFO sdk_worker.run - Got work 5
[grpc-default-executor-0] INFO sdk_worker.run - Got work 4
[grpc-default-executor-0] INFO sdk_worker.run - Got work 6
[grpc-default-executor-0] INFO sdk_worker.run - Got work 7
[grpc-default-executor-0] INFO sdk_worker.create_state_handler - Creating 
channel for localhost:34517
[grpc-default-executor-0] INFO sdk_worker.run - Got work 8
[grpc-default-executor-0] INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client 
connected.
[grpc-default-executor-0] INFO data_plane.create_data_channel - Creating 
channel for localhost:33595
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - 
Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1) (0bf1acf200b64f0440ab8e2eeb40fc23) switched from 
RUNNING to FINISHED.
[Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1) (0bf1acf200b64f0440ab8e2eeb40fc23).
[Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task Source: Collection Source 
-> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1) (0bf1acf200b64f0440ab8e2eeb40fc23) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task Source: 
Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem 0bf1acf200b64f0440ab8e2eeb40fc23.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph 

Build failed in Jenkins: beam_PreCommit_Website_Cron #65

2018-09-15 Thread Apache Jenkins Server
See 


Changes:

[github] Fix typo in the guide: "amouint" -> "amount"

[boyuanz] Add more test cases in BigQueryToTable

[qinyeli] Interactive Beam -- adding edges by node pair

[pablo] Adding Autocomplete IT for Java

[pablo] Fix spotless

[boyuanz] Exclude BigQueryToTableIT tests from direct runner

--
[...truncated 7.84 KB...]

> Task :buildSrc:assemble
Skipping task ':buildSrc:assemble' as it has no actions.
:assemble (Thread[Task worker for ':buildSrc' Thread 8,5,main]) completed. Took 
0.0 secs.
:spotlessGroovy (Thread[Task worker for ':buildSrc' Thread 8,5,main]) started.

> Task :buildSrc:spotlessGroovy
file or directory 
'
 not found
file or directory 
'
 not found
file or directory 
'
 not found
Caching disabled for task ':buildSrc:spotlessGroovy': Caching has not been 
enabled for the task
Task ':buildSrc:spotlessGroovy' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovy'.
file or directory 
'
 not found
:spotlessGroovy (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
completed. Took 1.372 secs.
:spotlessGroovyCheck (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
started.

> Task :buildSrc:spotlessGroovyCheck
Skipping task ':buildSrc:spotlessGroovyCheck' as it has no actions.
:spotlessGroovyCheck (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
completed. Took 0.0 secs.
:spotlessGroovyGradle (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
started.

> Task :buildSrc:spotlessGroovyGradle
Caching disabled for task ':buildSrc:spotlessGroovyGradle': Caching has not 
been enabled for the task
Task ':buildSrc:spotlessGroovyGradle' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovyGradle'.
:spotlessGroovyGradle (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
completed. Took 0.028 secs.
:spotlessGroovyGradleCheck (Thread[Task worker for ':buildSrc' Thread 
8,5,main]) started.

> Task :buildSrc:spotlessGroovyGradleCheck
Skipping task ':buildSrc:spotlessGroovyGradleCheck' as it has no actions.
:spotlessGroovyGradleCheck (Thread[Task worker for ':buildSrc' Thread 
8,5,main]) completed. Took 0.0 secs.
:spotlessCheck (Thread[Task worker for ':buildSrc' Thread 8,5,main]) started.

> Task :buildSrc:spotlessCheck
Skipping task ':buildSrc:spotlessCheck' as it has no actions.
:spotlessCheck (Thread[Task worker for ':buildSrc' Thread 8,5,main]) completed. 
Took 0.0 secs.
:compileTestJava (Thread[Task worker for ':buildSrc' Thread 8,5,main]) started.

> Task :buildSrc:compileTestJava NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:compileTestJava' as it has no source files and no 
previous output files.
:compileTestJava (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
completed. Took 0.003 secs.
:compileTestGroovy (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
started.

> Task :buildSrc:compileTestGroovy NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:compileTestGroovy' as it has no source files and no 
previous output files.
:compileTestGroovy (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
completed. Took 0.002 secs.
:processTestResources (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
started.

> Task :buildSrc:processTestResources NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:processTestResources' as it has no source files and no 
previous output files.
:processTestResources (Thread[Task worker for ':buildSrc' Thread 8,5,main]) 
completed. Took 0.001 secs.
:testClasses (Thread[Task worker for ':buildSrc' Thread 8,5,main]) started.

> Task :buildSrc:testClasses UP-TO-DATE
Skipping task ':buildSrc:testClasses' as it has no actions.
:testClasses (Thread[Task worker for ':buildSrc' Thread 8,5,main]) completed. 
Took 0.0 secs.
:test (Thread[Task worker for ':buildSrc' Thread 8,5,main]) started.

> Task :buildSrc:test NO-SOURCE
Skipping task ':buildSrc:test' as it has no source files and no previous output 
files.
:test (Thread[Task worker for ':buildSrc' Thread 8,5,main]) completed. Took 
0.004 secs.
:check (Thread[Task worker for ':buildSrc' Th

[beam] branch master updated (468d3e4 -> 89c91db)

2018-09-15 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 468d3e4  Merge pull request #6372 from pabloem/autocomplete-it
 add 96e8674  Fix typo in the guide: "amouint" -> "amount"
 new 89c91db  Merge pull request #6339 from regata/master

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 website/src/documentation/programming-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[beam] 01/01: Merge pull request #6339 from regata/master

2018-09-15 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 89c91dbd5bcc1ca4603475bd6ad86a4d7cc228d2
Merge: 468d3e4 96e8674
Author: Pablo 
AuthorDate: Sat Sep 15 22:52:34 2018 -0700

Merge pull request #6339 from regata/master

Fix typo in the guide: "amouint" -> "amount"

 website/src/documentation/programming-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



Jenkins build is back to normal : beam_PostCommit_Python_PVR_Flink_Gradle #26

2018-09-15 Thread Apache Jenkins Server
See 




[beam] branch master updated (7cf6d3d -> 468d3e4)

2018-09-15 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 7cf6d3d  Merge pull request #6396 from qinyeli/master
 add fe07153  Adding Autocomplete IT for Java
 add 90c6cfc  Fix spotless
 add 468d3e4  Merge pull request #6372 from pabloem/autocomplete-it

No new revisions were added by this update.

Summary of changes:
 .../beam/examples/complete/AutoComplete.java   | 41 ++-
 .../beam/examples/complete/AutoCompleteIT.java | 59 ++
 2 files changed, 98 insertions(+), 2 deletions(-)
 create mode 100644 
examples/java/src/test/java/org/apache/beam/examples/complete/AutoCompleteIT.java



[beam] branch master updated (30bb682 -> 7cf6d3d)

2018-09-15 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 30bb682  Merge pull request #6360 from boyuanzz/more_bq_tests
 add f20e88e  Interactive Beam -- adding edges by node pair
 new 7cf6d3d  Merge pull request #6396 from qinyeli/master

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../apache_beam/runners/interactive/display/pipeline_graph.py   | 6 +-
 1 file changed, 5 insertions(+), 1 deletion(-)



[beam] 01/01: Merge pull request #6396 from qinyeli/master

2018-09-15 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 7cf6d3dd2ed164acafef21e05f232bbd8d60d168
Merge: 30bb682 f20e88e
Author: Pablo 
AuthorDate: Sat Sep 15 20:33:28 2018 -0700

Merge pull request #6396 from qinyeli/master

Interactive Beam -- adding edges by node pair

 .../apache_beam/runners/interactive/display/pipeline_graph.py   | 6 +-
 1 file changed, 5 insertions(+), 1 deletion(-)



[beam] 01/01: Merge pull request #6360 from boyuanzz/more_bq_tests

2018-09-15 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 30bb6828fe00cc676df9e3d10d2dbb63d3fd8e3e
Merge: b4d1ef3 a6065dd
Author: Pablo 
AuthorDate: Sat Sep 15 20:31:56 2018 -0700

Merge pull request #6360 from boyuanzz/more_bq_tests

[BEAM-5383] Add more test cases in BigQueryToTable IT.

 sdks/java/io/google-cloud-platform/build.gradle|   1 +
 .../sdk/io/gcp/bigquery/BigQueryToTableIT.java | 255 +
 .../beam/sdk/io/gcp/testing/BigqueryClient.java|  57 -
 3 files changed, 259 insertions(+), 54 deletions(-)



[beam] branch master updated (b4d1ef3 -> 30bb682)

2018-09-15 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from b4d1ef3  Merge pull request #6317: [BEAM-4461]  Add mapping between 
FieldType and Java types.
 add 8fd361f  Add more test cases in BigQueryToTable
 add a6065dd  Exclude BigQueryToTableIT tests from direct runner
 new 30bb682  Merge pull request #6360 from boyuanzz/more_bq_tests

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/java/io/google-cloud-platform/build.gradle|   1 +
 .../sdk/io/gcp/bigquery/BigQueryToTableIT.java | 255 +
 .../beam/sdk/io/gcp/testing/BigqueryClient.java|  57 -
 3 files changed, 259 insertions(+), 54 deletions(-)



Build failed in Jenkins: beam_PerformanceTests_Python #1442

2018-09-15 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam15 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b4d1ef316a0b00f5e0616ad0a067b841d05d703c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b4d1ef316a0b00f5e0616ad0a067b841d05d703c
Commit message: "Merge pull request #6317: [BEAM-4461]  Add mapping between 
FieldType and Java types."
 > git rev-list --no-walk b4d1ef316a0b00f5e0616ad0a067b841d05d703c # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8393226262721141060.sh
+ rm -rf 

[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6028695581914896336.sh
+ rm -rf 

[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3091410057836607230.sh
+ virtualenv 

New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6542649170006586961.sh
+ 

 install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./env/.perfkit_env/lib/python2.7/site-packages (40.2.0)
Requirement already up-to-date: pip in 
./env/.perfkit_env/lib/python2.7/site-packages (18.0)
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins2626197463676622897.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git 

Cloning into 
'
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1404098169246779399.sh
+ 

 install -r 

Collecting absl-py (from -r 

 (line 14))
Collecting jinja2>=2.7 (from -r 

 (line 15))
  Using cached 
https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl
Requirement already satisfied: setuptools in 
./env/.perfkit_env/lib/python2.7/site-packages (from -r 

 (line 16)) (40.2.0)
Collecting colorlog[windows]==2.6.0 (from -r 

 (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r 

 (line 18))
Collecting futures>=3.0.3 (from -r 

 (line 19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r 


Build failed in Jenkins: beam_PostCommit_Python_PVR_Flink_Gradle #25

2018-09-15 Thread Apache Jenkins Server
See 


--
[...truncated 6.26 MB...]
[grpc-default-executor-1] INFO sdk_worker.__init__ - Creating insecure control 
channel.
[grpc-default-executor-1] INFO sdk_worker.__init__ - Control channel 
established.
[grpc-default-executor-1] INFO sdk_worker.__init__ - Initializing SDKHarness 
with 12 workers.
[grpc-default-executor-1] INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - 
Beam Fn Control client connected with id 1
[grpc-default-executor-1] INFO sdk_worker.run - Got work 1
[grpc-default-executor-0] INFO sdk_worker.run - Got work 6
[grpc-default-executor-0] INFO sdk_worker.run - Got work 5
[grpc-default-executor-0] INFO sdk_worker.run - Got work 4
[grpc-default-executor-0] INFO sdk_worker.run - Got work 3
[grpc-default-executor-0] INFO sdk_worker.run - Got work 2
[grpc-default-executor-0] INFO sdk_worker.run - Got work 8
[grpc-default-executor-0] INFO sdk_worker.run - Got work 7
[grpc-default-executor-0] INFO sdk_worker.create_state_handler - Creating 
channel for localhost:45705
[grpc-default-executor-1] INFO data_plane.create_data_channel - Creating 
channel for localhost:44673
[grpc-default-executor-1] INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client 
connected.
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - 
Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1) (14901b5dc6f31df8416411349226b761) switched from 
RUNNING to FINISHED.
[Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1) (14901b5dc6f31df8416411349226b761).
[Source: Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task Source: Collection Source 
-> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1) (14901b5dc6f31df8416411349226b761) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task Source: 
Collection Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem 14901b5dc6f31df8416411349226b761.
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Collection 
Source -> 
19Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1) (14901b5dc6f31df8416411349226b761) switched from 
RUNNING to FINISHED.
[grpc-default-executor-1] INFO sdk_worker.run - Got work 9
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish 

[Source: Collection Source -> 
31assert_that/Create/Read/Impulse.None/jenkins-docker-apache.bintray.io/beam/python:latest:0
 -> ToKeyedWorkItem (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - 
Source: 

Build failed in Jenkins: beam_PreCommit_Website_Cron #64

2018-09-15 Thread Apache Jenkins Server
See 


--
[...truncated 8.48 KB...]
> Task :buildSrc:spotlessGroovy
file or directory 
'
 not found
file or directory 
'
 not found
file or directory 
'
 not found
Caching disabled for task ':buildSrc:spotlessGroovy': Caching has not been 
enabled for the task
Task ':buildSrc:spotlessGroovy' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovy'.
file or directory 
'
 not found
:spotlessGroovy (Thread[Daemon worker,5,main]) completed. Took 1.424 secs.
:spotlessGroovyCheck (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:spotlessGroovyCheck
Skipping task ':buildSrc:spotlessGroovyCheck' as it has no actions.
:spotlessGroovyCheck (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:spotlessGroovyGradle (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:spotlessGroovyGradle
Caching disabled for task ':buildSrc:spotlessGroovyGradle': Caching has not 
been enabled for the task
Task ':buildSrc:spotlessGroovyGradle' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovyGradle'.
:spotlessGroovyGradle (Thread[Daemon worker,5,main]) completed. Took 0.033 secs.
:spotlessGroovyGradleCheck (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:spotlessGroovyGradleCheck
Skipping task ':buildSrc:spotlessGroovyGradleCheck' as it has no actions.
:spotlessGroovyGradleCheck (Thread[Daemon worker,5,main]) completed. Took 0.0 
secs.
:spotlessCheck (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:spotlessCheck
Skipping task ':buildSrc:spotlessCheck' as it has no actions.
:spotlessCheck (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:compileTestJava (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:compileTestJava NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:compileTestJava' as it has no source files and no 
previous output files.
:compileTestJava (Thread[Daemon worker,5,main]) completed. Took 0.002 secs.
:compileTestGroovy (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:compileTestGroovy NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:compileTestGroovy' as it has no source files and no 
previous output files.
:compileTestGroovy (Thread[Daemon worker,5,main]) completed. Took 0.002 secs.
:processTestResources (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:processTestResources NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:processTestResources' as it has no source files and no 
previous output files.
:processTestResources (Thread[Daemon worker,5,main]) completed. Took 0.002 secs.
:testClasses (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:testClasses UP-TO-DATE
Skipping task ':buildSrc:testClasses' as it has no actions.
:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:test (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:test NO-SOURCE
Skipping task ':buildSrc:test' as it has no source files and no previous output 
files.
:test (Thread[Daemon worker,5,main]) completed. Took 0.004 secs.
:check (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:check
Skipping task ':buildSrc:check' as it has no actions.
:check (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:build (Thread[Daemon worker,5,main]) started.

> Task :buildSrc:build
Skipping task ':buildSrc:build' as it has no actions.
:build (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) remove files older 
than Sun Sep 09 00:00:24 UTC 2018.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing 717 cache 
entries (41 MB reclaimed).
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.228 
secs.
Settings evaluated using settings file 
'
Using local directory build cache for the root build (location = 
/home/jenkins/.gradle/caches/build-cache-1, removeUnusedEntriesAfter = 7 days).
Projects loaded. Root project using build file 
'

[jira] [Work logged] (BEAM-5378) Ensure all Go SDK examples run successfully

2018-09-15 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5378?focusedWorklogId=144638&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-144638
 ]

ASF GitHub Bot logged work on BEAM-5378:


Author: ASF GitHub Bot
Created on: 15/Sep/18 20:59
Start Date: 15/Sep/18 20:59
Worklog Time Spent: 10m 
  Work Description: herohde commented on issue #6395: [BEAM-5378] Update go 
wordcap example to work on Dataflow runner
URL: https://github.com/apache/beam/pull/6395#issuecomment-421633711
 
 
   Note that the wordcount examples are written to closely fit this guide:
   
   https://beam.apache.org/get-started/wordcount-example/
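   As a rough illustration of the pattern that guide walks through, and of the
   direction discussed in #6395 (reading the wordcap input from a public gs://
   sample so the same pipeline can run on the direct runner and on Dataflow),
   here is a condensed, hypothetical sketch. The function name extractAndUpper
   and the transform chain are illustrative only and are not the actual
   wordcount or wordcap source; the import paths assume the 2018-era Go SDK
   layout.

{code:go}
package main

import (
    "context"
    "flag"
    "log"
    "strings"

    "github.com/apache/beam/sdks/go/pkg/beam"
    "github.com/apache/beam/sdks/go/pkg/beam/io/textio"
    "github.com/apache/beam/sdks/go/pkg/beam/x/beamx"
    "github.com/apache/beam/sdks/go/pkg/beam/x/debug"
)

// input defaults to a public GCS sample file so the binary can run unchanged
// on the direct runner and on Dataflow, as proposed for wordcap in PR #6395.
var input = flag.String("input",
    "gs://apache-beam-samples/shakespeare/kinglear.txt", "File(s) to read.")

// extractAndUpper splits each line into words and upper-cases them.
func extractAndUpper(line string, emit func(string)) {
    for _, w := range strings.Fields(line) {
        emit(strings.ToUpper(w))
    }
}

func init() {
    // Register the top-level DoFn so non-direct runners can serialize a
    // reference to it and resolve it again on the workers.
    beam.RegisterFunction(extractAndUpper)
}

func main() {
    flag.Parse()
    beam.Init()

    p := beam.NewPipeline()
    s := p.Root()

    lines := textio.Read(s, *input)                // read the input file(s)
    words := beam.ParDo(s, extractAndUpper, lines) // split and upper-case
    debug.Print(s, words)                          // print a sample of the output

    if err := beamx.Run(context.Background(), p); err != nil {
        log.Fatalf("Failed to execute job: %v", err)
    }
}
{code}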
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 144638)
Time Spent: 2h  (was: 1h 50m)

> Ensure all Go SDK examples run successfully
> ---
>
> Key: BEAM-5378
> URL: https://issues.apache.org/jira/browse/BEAM-5378
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: Not applicable
>Reporter: Tomas Roos
>Priority: Major
>  Time Spent: 2h
>  Remaining Estimate: 0h
>
> I've been spending a day or so running through the examples available for the
> Go SDK in order to see what works on which runner (direct, dataflow) and what
> doesn't, and here are the results.
> Below are all the available examples for the Go SDK. For me, as a new developer
> on Apache Beam and Dataflow, it would be of tremendous value to have all
> examples running, because many of them have legitimate use cases behind them.
> {code:java}
> ├── complete
> │   └── autocomplete
> │   └── autocomplete.go
> ├── contains
> │   └── contains.go
> ├── cookbook
> │   ├── combine
> │   │   └── combine.go
> │   ├── filter
> │   │   └── filter.go
> │   ├── join
> │   │   └── join.go
> │   ├── max
> │   │   └── max.go
> │   └── tornadoes
> │   └── tornadoes.go
> ├── debugging_wordcount
> │   └── debugging_wordcount.go
> ├── forest
> │   └── forest.go
> ├── grades
> │   └── grades.go
> ├── minimal_wordcount
> │   └── minimal_wordcount.go
> ├── multiout
> │   └── multiout.go
> ├── pingpong
> │   └── pingpong.go
> ├── streaming_wordcap
> │   └── wordcap.go
> ├── windowed_wordcount
> │   └── windowed_wordcount.go
> ├── wordcap
> │   └── wordcap.go
> ├── wordcount
> │   └── wordcount.go
> └── yatzy
> └── yatzy.go
> {code}
> All examples that are supposed to be runnable by the direct runner (not
> depending on GCP platform services) are runnable.
> On the other hand, these are the tests that need to be updated because they are
> not runnable on the Dataflow platform, for various reasons.
> I tried to figure them out, and all I can do is pinpoint at least where each one
> fails, since my knowledge of the Beam / Dataflow internals is so far limited.
> .
> ├── complete
> │   └── autocomplete
> │   └── autocomplete.go
> Runs successfully if the input is swapped to one of the Shakespeare data files
> from gs://.
> But running it then yields an error from the top.Largest func (discussed in
> another issue: top.Largest needs a serializable combiner / accumulator).
> ➜  autocomplete git:(master) ✗ ./autocomplete --project fair-app-213019 
> --runner dataflow --staging_location=gs://fair-app-213019/staging-test2 
> --worker_harness_container_image=apache-docker-beam-snapshots-docker.bintray.io/beam/go:20180515
>  
> 2018/09/11 15:35:26 Running autocomplete
> Unable to encode combiner for lifting: failed to encode custom coder: bad 
> underlying type: bad field type: bad element: unencodable type: interface 
> {}2018/09/11 15:35:26 Using running binary as worker binary: './autocomplete'
> 2018/09/11 15:35:26 Staging worker binary: ./autocomplete
> ├── contains
> │   └── contains.go
> Fails when running debug.Head for some mysterious reason; it might have to do
> with the parameters passed into the x, y iterator. Frankly, I don't know and
> could not figure it out.
> But after removing the debug.Head call, everything works as expected and succeeds.
> ├── cookbook
> │   ├── combine
> │   │   └── combine.go
> Fails because extractFn, which is a struct, is not registered through
> beam.RegisterType (is this a must or not?).
> Registering it works as a workaround, at least (see the sketch after this message).
> ➜  combine git:(master) ✗ ./combine 
> --output=fair-app-213019:combineoutput.test --project=fair-app-213019 
> --runner=dataflow --staging_location=gs://203019-staging/ 
> --worker_harness_container_image=apache-docker-beam-snapshots-docker.bintray.io/beam/go:20180515
>  
> 2018/09/11 15:40:50 Running c
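
A minimal, hypothetical sketch of the registration workaround referenced above:
beam.RegisterType for a struct DoFn and beam.RegisterFunction for a named
comparator such as the one a top.Largest combiner needs. The extractFn body,
its MinWordLength field, the lessThan comparator, and the small direct-runner
pipeline are invented for illustration and are not the actual combine.go or
autocomplete.go code; the import paths assume the 2018-era Go SDK layout.

{code:go}
package main

import (
    "context"
    "flag"
    "log"
    "reflect"
    "strings"

    "github.com/apache/beam/sdks/go/pkg/beam"
    "github.com/apache/beam/sdks/go/pkg/beam/x/beamx"
    "github.com/apache/beam/sdks/go/pkg/beam/x/debug"
)

// extractFn stands in for the struct DoFn from combine.go: on a portable
// runner such as Dataflow, the struct type must be registered so the harness
// can serialize it and rebuild it (with its exported fields) on the workers.
type extractFn struct {
    MinWordLength int
}

func (f *extractFn) ProcessElement(line string, emit func(string)) {
    for _, w := range strings.Fields(line) {
        if len(w) >= f.MinWordLength {
            emit(w)
        }
    }
}

// lessThan is a named, top-level comparator. Registering it is the analogous
// fix for combiners, e.g. the one top.Largest builds from its "less" func.
func lessThan(x, y string) bool { return len(x) < len(y) }

func init() {
    beam.RegisterType(reflect.TypeOf((*extractFn)(nil)).Elem())
    beam.RegisterFunction(lessThan)
}

func main() {
    flag.Parse()
    beam.Init()

    p := beam.NewPipeline()
    s := p.Root()

    lines := beam.Create(s, "the quick brown fox", "jumps over the lazy dog")
    words := beam.ParDo(s, &extractFn{MinWordLength: 4}, lines)
    debug.Print(s, words)

    if err := beamx.Run(context.Background(), p); err != nil {
        log.Fatalf("pipeline failed: %v", err)
    }
}
{code}

Doing the registration in init() keeps it next to the definitions and ensures it
runs before the pipeline is constructed, whichever runner is selected.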

Build failed in Jenkins: beam_PerformanceTests_Python #1441

2018-09-15 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam15 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b4d1ef316a0b00f5e0616ad0a067b841d05d703c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b4d1ef316a0b00f5e0616ad0a067b841d05d703c
Commit message: "Merge pull request #6317: [BEAM-4461]  Add mapping between 
FieldType and Java types."
 > git rev-list --no-walk b4d1ef316a0b00f5e0616ad0a067b841d05d703c # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7901553068981721935.sh
+ rm -rf 

[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1092787986495895284.sh
+ rm -rf 

[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3300922827179570076.sh
+ virtualenv 

New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins8384060270464927121.sh
+ 

 install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./env/.perfkit_env/lib/python2.7/site-packages (40.2.0)
Requirement already up-to-date: pip in 
./env/.perfkit_env/lib/python2.7/site-packages (18.0)
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7850769059533544078.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git 

Cloning into 
'
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins4747666853452279997.sh
+ 

 install -r 

Collecting absl-py (from -r 

 (line 14))
Collecting jinja2>=2.7 (from -r 

 (line 15))
  Using cached 
https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl
Requirement already satisfied: setuptools in 
./env/.perfkit_env/lib/python2.7/site-packages (from -r 

 (line 16)) (40.2.0)
Collecting colorlog[windows]==2.6.0 (from -r 

 (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r 

 (line 18))
Collecting futures>=3.0.3 (from -r 

 (line 19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r 


Build failed in Jenkins: beam_PreCommit_Website_Cron #63

2018-09-15 Thread Apache Jenkins Server
See 


--
[...truncated 7.79 KB...]

> Task :buildSrc:assemble
Skipping task ':buildSrc:assemble' as it has no actions.
:assemble (Thread[Task worker for ':buildSrc',5,main]) completed. Took 0.0 secs.
:spotlessGroovy (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:spotlessGroovy
file or directory 
'
 not found
file or directory 
'
 not found
file or directory 
'
 not found
Caching disabled for task ':buildSrc:spotlessGroovy': Caching has not been 
enabled for the task
Task ':buildSrc:spotlessGroovy' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovy'.
file or directory 
'
 not found
:spotlessGroovy (Thread[Task worker for ':buildSrc',5,main]) completed. Took 
1.498 secs.
:spotlessGroovyCheck (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:spotlessGroovyCheck
Skipping task ':buildSrc:spotlessGroovyCheck' as it has no actions.
:spotlessGroovyCheck (Thread[Task worker for ':buildSrc',5,main]) completed. 
Took 0.0 secs.
:spotlessGroovyGradle (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:spotlessGroovyGradle
Caching disabled for task ':buildSrc:spotlessGroovyGradle': Caching has not 
been enabled for the task
Task ':buildSrc:spotlessGroovyGradle' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovyGradle'.
:spotlessGroovyGradle (Thread[Task worker for ':buildSrc',5,main]) completed. 
Took 0.03 secs.
:spotlessGroovyGradleCheck (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:spotlessGroovyGradleCheck
Skipping task ':buildSrc:spotlessGroovyGradleCheck' as it has no actions.
:spotlessGroovyGradleCheck (Thread[Task worker for ':buildSrc',5,main]) 
completed. Took 0.0 secs.
:spotlessCheck (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:spotlessCheck
Skipping task ':buildSrc:spotlessCheck' as it has no actions.
:spotlessCheck (Thread[Task worker for ':buildSrc',5,main]) completed. Took 0.0 
secs.
:compileTestJava (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:compileTestJava NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:compileTestJava' as it has no source files and no 
previous output files.
:compileTestJava (Thread[Task worker for ':buildSrc',5,main]) completed. Took 
0.003 secs.
:compileTestGroovy (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:compileTestGroovy NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:compileTestGroovy' as it has no source files and no 
previous output files.
:compileTestGroovy (Thread[Task worker for ':buildSrc',5,main]) completed. Took 
0.002 secs.
:processTestResources (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:processTestResources NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:processTestResources' as it has no source files and no 
previous output files.
:processTestResources (Thread[Task worker for ':buildSrc',5,main]) completed. 
Took 0.001 secs.
:testClasses (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:testClasses UP-TO-DATE
Skipping task ':buildSrc:testClasses' as it has no actions.
:testClasses (Thread[Task worker for ':buildSrc',5,main]) completed. Took 0.0 
secs.
:test (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:test NO-SOURCE
Skipping task ':buildSrc:test' as it has no source files and no previous output 
files.
:test (Thread[Task worker for ':buildSrc',5,main]) completed. Took 0.004 secs.
:check (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:check
Skipping task ':buildSrc:check' as it has no actions.
:check (Thread[Task worker for ':buildSrc',5,main]) completed. Took 0.0 secs.
:build (Thread[Task worker for ':buildSrc',5,main]) started.

> Task :buildSrc:build
Skipping task ':buildSrc:build' as it has no actions.
:build (Thread[Task worker for ':buildSrc',5,main]) completed. Took 0.0 secs.
Settings evaluated using settings file 
'

[jira] [Work logged] (BEAM-5378) Ensure all Go SDK examples run successfully

2018-09-15 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5378?focusedWorklogId=144604&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-144604
 ]

ASF GitHub Bot logged work on BEAM-5378:


Author: ASF GitHub Bot
Created on: 15/Sep/18 17:59
Start Date: 15/Sep/18 17:59
Worklog Time Spent: 10m 
  Work Description: aaltay commented on a change in pull request #6395: 
[BEAM-5378] Update go wordcap example to work on Dataflow runner
URL: https://github.com/apache/beam/pull/6395#discussion_r217891409
 
 

 ##
 File path: sdks/go/examples/wordcap/wordcap.go
 ##
 @@ -31,7 +30,7 @@ import (
 )
 
 var (
-   input = flag.String("input", 
os.ExpandEnv("$GOPATH/src/github.com/apache/beam/sdks/go/data/haiku/old_pond.txt"),
 "Files to read.")
+   input = flag.String("input", 
"gs://apache-beam-samples/shakespeare/kinglear.txt", "File(s) to read.")
 
 Review comment:
   Agreed. I will start by deleting this one.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 144604)
Time Spent: 1h 50m  (was: 1h 40m)

> Ensure all Go SDK examples run successfully
> ---
>
> Key: BEAM-5378
> URL: https://issues.apache.org/jira/browse/BEAM-5378
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: Not applicable
>Reporter: Tomas Roos
>Priority: Major
>  Time Spent: 1h 50m
>  Remaining Estimate: 0h
>

[jira] [Work logged] (BEAM-5378) Ensure all Go SDK examples run successfully

2018-09-15 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5378?focusedWorklogId=144603&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-144603
 ]

ASF GitHub Bot logged work on BEAM-5378:


Author: ASF GitHub Bot
Created on: 15/Sep/18 17:14
Start Date: 15/Sep/18 17:14
Worklog Time Spent: 10m 
  Work Description: lostluck commented on issue #6395: [BEAM-5378] Update 
go wordcap example to work on Dataflow runner
URL: https://github.com/apache/beam/pull/6395#issuecomment-421601714
 
 
   Agreed. I'd rather delete them as examples if they're not good at that.
   Move/change them to be integration tests if they serve a useful purpose
   that way.
   The examples should ideally show best practices, since things will end up
   being copy-pasted from them all the time.
   I'd like to see a "getting started with Apache Beam" in Go at some point
   which can make
   better/minimal choices as part of a sequence of pipelines. Right now,
   minimal word count isn't clearly a learning tool.
   
   On Sat, Sep 15, 2018, 10:04 AM Henning Rohde 
   wrote:
   
   > *@herohde* commented on this pull request.
   > --
   >
   > In sdks/go/examples/wordcap/wordcap.go
   > :
   >
   > > @@ -31,7 +30,7 @@ import (
   >  )
   >
   >  var (
   > -  input = flag.String("input", 
os.ExpandEnv("$GOPATH/src/github.com/apache/beam/sdks/go/data/haiku/old_pond.txt"),
 "Files to read.")
   > +  input = flag.String("input", 
"gs://apache-beam-samples/shakespeare/kinglear.txt", "File(s) to read.")
   >
   > To be clear: I'm fine with these examples being deleted.
   >


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 144603)
Time Spent: 1h 40m  (was: 1.5h)

> Ensure all Go SDK examples run successfully
> ---
>
> Key: BEAM-5378
> URL: https://issues.apache.org/jira/browse/BEAM-5378
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: Not applicable
>Reporter: Tomas Roos
>Priority: Major
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>

[jira] [Work logged] (BEAM-5378) Ensure all Go SDK examples run successfully

2018-09-15 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5378?focusedWorklogId=144601&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-144601
 ]

ASF GitHub Bot logged work on BEAM-5378:


Author: ASF GitHub Bot
Created on: 15/Sep/18 17:04
Start Date: 15/Sep/18 17:04
Worklog Time Spent: 10m 
  Work Description: herohde commented on a change in pull request #6395: 
[BEAM-5378] Update go wordcap example to work on Dataflow runner
URL: https://github.com/apache/beam/pull/6395#discussion_r217890074
 
 

 ##
 File path: sdks/go/examples/wordcap/wordcap.go
 ##
 @@ -31,7 +30,7 @@ import (
 )
 
 var (
-   input = flag.String("input", 
os.ExpandEnv("$GOPATH/src/github.com/apache/beam/sdks/go/data/haiku/old_pond.txt"),
 "Files to read.")
+   input = flag.String("input", 
"gs://apache-beam-samples/shakespeare/kinglear.txt", "File(s) to read.")
 
 Review comment:
   To be clear: I'm fine with these examples being deleted.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 144601)
Time Spent: 1.5h  (was: 1h 20m)

> Ensure all Go SDK examples run successfully
> ---
>
> Key: BEAM-5378
> URL: https://issues.apache.org/jira/browse/BEAM-5378
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: Not applicable
>Reporter: Tomas Roos
>Priority: Major
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> I've been spending a day or so running through the example available for the 
> Go SDK in order to see what works and on what runner (direct, dataflow), and 
> what doesn't and here's the results.
> All available examples for the go sdk. For me as a new developer on apache 
> beam and dataflow it would be a tremendous value to have all examples running 
> because many of them have legitimate use-cases behind them. 
> {code:java}
> ├── complete
> │   └── autocomplete
> │   └── autocomplete.go
> ├── contains
> │   └── contains.go
> ├── cookbook
> │   ├── combine
> │   │   └── combine.go
> │   ├── filter
> │   │   └── filter.go
> │   ├── join
> │   │   └── join.go
> │   ├── max
> │   │   └── max.go
> │   └── tornadoes
> │   └── tornadoes.go
> ├── debugging_wordcount
> │   └── debugging_wordcount.go
> ├── forest
> │   └── forest.go
> ├── grades
> │   └── grades.go
> ├── minimal_wordcount
> │   └── minimal_wordcount.go
> ├── multiout
> │   └── multiout.go
> ├── pingpong
> │   └── pingpong.go
> ├── streaming_wordcap
> │   └── wordcap.go
> ├── windowed_wordcount
> │   └── windowed_wordcount.go
> ├── wordcap
> │   └── wordcap.go
> ├── wordcount
> │   └── wordcount.go
> └── yatzy
> └── yatzy.go
> {code}
> All examples that are supposed to be runnable by the direct driver (not 
> depending on gcp platform services) are runnable.
> On the otherhand these are the tests that needs to be updated because its not 
> runnable on the dataflow platform for various reasons.
> I tried to figure them out and all I can do is to pin point at least where it 
> fails since my knowledge so far in the beam / dataflow internals is limited.
> .
> ├── complete
> │   └── autocomplete
> │   └── autocomplete.go
> Runs successfully if swapping the input to one of the shakespear data files 
> from gs://
> But when running this it yields a error from the top.Largest func (discussed 
> in another issue that top.Largest needs to have a serializeable combinator / 
> accumulator)
> ➜  autocomplete git:(master) ✗ ./autocomplete --project fair-app-213019 
> --runner dataflow --staging_location=gs://fair-app-213019/staging-test2 
> --worker_harness_container_image=apache-docker-beam-snapshots-docker.bintray.io/beam/go:20180515
>  
> 2018/09/11 15:35:26 Running autocomplete
> Unable to encode combiner for lifting: failed to encode custom coder: bad 
> underlying type: bad field type: bad element: unencodable type: interface 
> {}2018/09/11 15:35:26 Using running binary as worker binary: './autocomplete'
> 2018/09/11 15:35:26 Staging worker binary: ./autocomplete
> ├── contains
> │   └── contains.go
> Fails when running debug.Head for some mysterious reason; it might have to do 
> with the parameters passed into the x, y iterator. Frankly, I don't know and 
> could not figure it out.
> But after removing the debug.Head call, everything works as expected and succeeds.
> ├── cookbook
> │   ├── combine
> │   │   └── combine.go
> Fails because extractFn, which is a struct, is not registered through 
> beam.RegisterType (is this a must or not? See the second sketch after this 
> message).
> It works as a wor
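
For reference on the "Unable to encode combiner for lifting" error quoted above, here is a minimal sketch of a combiner whose accumulator is a concrete type. It is not from the original report: the longest function, the sample data, and the pipeline wiring are illustrative assumptions; only beam.Combine, beam.RegisterFunction, and the x/debug and x/beamx helpers are existing Go SDK APIs. With a concrete accumulator the SDK can infer a coder, whereas an interface{} accumulator (which the error message above points at inside top.Largest) cannot be encoded for combiner lifting.

{code:go}
package main

import (
	"context"
	"flag"
	"log"

	"github.com/apache/beam/sdks/go/pkg/beam"
	"github.com/apache/beam/sdks/go/pkg/beam/x/beamx"
	"github.com/apache/beam/sdks/go/pkg/beam/x/debug"
)

// longest is a merge-only combiner with a plain string accumulator, so the
// SDK can derive a concrete coder for it. This is the property that an
// interface{}-based accumulator lacks.
func longest(a, b string) string {
	if len(a) >= len(b) {
		return a
	}
	return b
}

func init() {
	// Register the combiner so it can be resolved on remote workers.
	beam.RegisterFunction(longest)
}

func main() {
	flag.Parse()
	beam.Init()

	p := beam.NewPipeline()
	s := p.Root()

	words := beam.Create(s, "to", "be", "or", "not", "to", "be")
	top := beam.Combine(s, longest, words) // global combine down to one element
	debug.Print(s, top)

	if err := beamx.Run(context.Background(), p); err != nil {
		log.Fatalf("pipeline failed: %v", err)
	}
}
{code}

Run with the direct runner this prints the single longest word; the point is only the shape of the combiner, not a replacement for top.Largest.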
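
On the beam.RegisterType question above ("is this a must or not?"): some of the shipped Go examples register their struct DoFn types this way, and the registration is what lets a remote runner such as Dataflow reconstruct the DoFn and its fields on the worker, so leaving it out typically only surfaces once the pipeline leaves the direct runner. Below is a minimal sketch; extractFn and its MinLength field are hypothetical stand-ins for the cookbook/combine DoFn, not the actual code.

{code:go}
package main

import (
	"context"
	"flag"
	"log"
	"reflect"
	"strings"

	"github.com/apache/beam/sdks/go/pkg/beam"
	"github.com/apache/beam/sdks/go/pkg/beam/x/beamx"
	"github.com/apache/beam/sdks/go/pkg/beam/x/debug"
)

// extractFn is a hypothetical struct DoFn: the exported field is what forces
// the struct (rather than plain func) form and therefore the registration.
type extractFn struct {
	MinLength int `json:"min_length"`
}

// ProcessElement emits the words of a line that are at least MinLength long.
func (f *extractFn) ProcessElement(line string, emit func(string)) {
	for _, w := range strings.Fields(line) {
		if len(w) >= f.MinLength {
			emit(w)
		}
	}
}

func init() {
	// Register the concrete type so remote runners can decode the DoFn.
	beam.RegisterType(reflect.TypeOf((*extractFn)(nil)).Elem())
}

func main() {
	flag.Parse()
	beam.Init()

	p := beam.NewPipeline()
	s := p.Root()

	lines := beam.Create(s, "the quick brown fox", "jumps over the lazy dog")
	words := beam.ParDo(s, &extractFn{MinLength: 4}, lines)
	debug.Print(s, words)

	if err := beamx.Run(context.Background(), p); err != nil {
		log.Fatalf("pipeline failed: %v", err)
	}
}
{code}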

[jira] [Work logged] (BEAM-5378) Ensure all Go SDK examples run successfully

2018-09-15 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5378?focusedWorklogId=144600&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-144600
 ]

ASF GitHub Bot logged work on BEAM-5378:


Author: ASF GitHub Bot
Created on: 15/Sep/18 17:03
Start Date: 15/Sep/18 17:03
Worklog Time Spent: 10m 
  Work Description: herohde commented on a change in pull request #6395: 
[BEAM-5378] Update go wordcap example to work on Dataflow runner
URL: https://github.com/apache/beam/pull/6395#discussion_r217890062
 
 

 ##
 File path: sdks/go/examples/wordcap/wordcap.go
 ##
 @@ -31,7 +30,7 @@ import (
 )
 
 var (
-   input = flag.String("input", 
os.ExpandEnv("$GOPATH/src/github.com/apache/beam/sdks/go/data/haiku/old_pond.txt"),
 "Files to read.")
+   input = flag.String("input", 
"gs://apache-beam-samples/shakespeare/kinglear.txt", "File(s) to read.")
 
 Review comment:
   Sort of. They might be useful for new runners (ULR, Spark, ...), but 
integration tests can serve the same purpose.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 144600)
Time Spent: 1h 20m  (was: 1h 10m)

> Ensure all Go SDK examples run successfully
> ---
>
> Key: BEAM-5378
> URL: https://issues.apache.org/jira/browse/BEAM-5378
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: Not applicable
>Reporter: Tomas Roos
>Priority: Major
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>

[jira] [Work logged] (BEAM-4461) Create a library of useful transforms that use schemas

2018-09-15 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-4461?focusedWorklogId=144596&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-144596
 ]

ASF GitHub Bot logged work on BEAM-4461:


Author: ASF GitHub Bot
Created on: 15/Sep/18 16:41
Start Date: 15/Sep/18 16:41
Worklog Time Spent: 10m 
  Work Description: reuvenlax commented on issue #6316: [BEAM-4461] Add 
Unnest transform.
URL: https://github.com/apache/beam/pull/6316#issuecomment-421597073
 
 
   @akedin The problem seems to be that SQL tries to add the same field twice 
to a schema when joining. I can remove this check, but it seems to me like a 
bug in SQL.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 144596)
Time Spent: 9h 40m  (was: 9.5h)

> Create a library of useful transforms that use schemas
> --
>
> Key: BEAM-4461
> URL: https://issues.apache.org/jira/browse/BEAM-4461
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-java-core
>Reporter: Reuven Lax
>Assignee: Reuven Lax
>Priority: Major
>  Time Spent: 9h 40m
>  Remaining Estimate: 0h
>
> e.g. JoinBy(fields). Project, Filter, etc.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-5378) Ensure all Go SDK examples run successfully

2018-09-15 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5378?focusedWorklogId=144595&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-144595
 ]

ASF GitHub Bot logged work on BEAM-5378:


Author: ASF GitHub Bot
Created on: 15/Sep/18 16:27
Start Date: 15/Sep/18 16:27
Worklog Time Spent: 10m 
  Work Description: aaltay commented on a change in pull request #6395: 
[BEAM-5378] Update go wordcap example to work on Dataflow runner
URL: https://github.com/apache/beam/pull/6395#discussion_r217889158
 
 

 ##
 File path: sdks/go/examples/wordcap/wordcap.go
 ##
 @@ -31,7 +30,7 @@ import (
 )
 
 var (
-   input = flag.String("input", 
os.ExpandEnv("$GOPATH/src/github.com/apache/beam/sdks/go/data/haiku/old_pond.txt"),
 "Files to read.")
+   input = flag.String("input", 
"gs://apache-beam-samples/shakespeare/kinglear.txt", "File(s) to read.")
 
 Review comment:
   Can we remove those examples instead of moving them out of normal examples? 
Do they still serve a purpose?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 144595)
Time Spent: 1h 10m  (was: 1h)

> Ensure all Go SDK examples run successfully
> ---
>
> Key: BEAM-5378
> URL: https://issues.apache.org/jira/browse/BEAM-5378
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-go
>Affects Versions: Not applicable
>Reporter: Tomas Roos
>Priority: Major
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>

Build failed in Jenkins: beam_PerformanceTests_Python #1440

2018-09-15 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam15 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b4d1ef316a0b00f5e0616ad0a067b841d05d703c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b4d1ef316a0b00f5e0616ad0a067b841d05d703c
Commit message: "Merge pull request #6317: [BEAM-4461]  Add mapping between 
FieldType and Java types."
 > git rev-list --no-walk b4d1ef316a0b00f5e0616ad0a067b841d05d703c # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins9002884079140434289.sh
+ rm -rf 

[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3968368254152861212.sh
+ rm -rf 

[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins7384559299700319335.sh
+ virtualenv 

New python executable in 

Also creating executable in 

Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins6187846504486077746.sh
+ 

 install --upgrade setuptools pip
Requirement already up-to-date: setuptools in 
./env/.perfkit_env/lib/python2.7/site-packages (40.2.0)
Requirement already up-to-date: pip in 
./env/.perfkit_env/lib/python2.7/site-packages (18.0)
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins3409182361447168932.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git 

Cloning into 
'
[beam_PerformanceTests_Python] $ /bin/bash -xe 
/tmp/jenkins1994549218345607923.sh
+ 

 install -r 

Collecting absl-py (from -r 

 (line 14))
Collecting jinja2>=2.7 (from -r 

 (line 15))
  Using cached 
https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl
Requirement already satisfied: setuptools in 
./env/.perfkit_env/lib/python2.7/site-packages (from -r 

 (line 16)) (40.2.0)
Collecting colorlog[windows]==2.6.0 (from -r 

 (line 17))
  Using cached 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r 

 (line 18))
Collecting futures>=3.0.3 (from -r 

 (line 19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r 


Jenkins build is back to normal : beam_PostCommit_Python_PVR_Flink_Gradle #23

2018-09-15 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PreCommit_Website_Cron #62

2018-09-15 Thread Apache Jenkins Server
See 


--
[...truncated 7.85 KB...]

> Task :buildSrc:assemble
Skipping task ':buildSrc:assemble' as it has no actions.
:assemble (Thread[Task worker for ':buildSrc' Thread 5,5,main]) completed. Took 
0.0 secs.
:spotlessGroovy (Thread[Task worker for ':buildSrc' Thread 5,5,main]) started.

> Task :buildSrc:spotlessGroovy
file or directory 
'
 not found
file or directory 
'
 not found
file or directory 
'
 not found
Caching disabled for task ':buildSrc:spotlessGroovy': Caching has not been 
enabled for the task
Task ':buildSrc:spotlessGroovy' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovy'.
file or directory 
'
 not found
:spotlessGroovy (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
completed. Took 1.375 secs.
:spotlessGroovyCheck (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
started.

> Task :buildSrc:spotlessGroovyCheck
Skipping task ':buildSrc:spotlessGroovyCheck' as it has no actions.
:spotlessGroovyCheck (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
completed. Took 0.0 secs.
:spotlessGroovyGradle (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
started.

> Task :buildSrc:spotlessGroovyGradle
Caching disabled for task ':buildSrc:spotlessGroovyGradle': Caching has not 
been enabled for the task
Task ':buildSrc:spotlessGroovyGradle' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task 
':buildSrc:spotlessGroovyGradle'.
:spotlessGroovyGradle (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
completed. Took 0.032 secs.
:spotlessGroovyGradleCheck (Thread[Task worker for ':buildSrc' Thread 
5,5,main]) started.

> Task :buildSrc:spotlessGroovyGradleCheck
Skipping task ':buildSrc:spotlessGroovyGradleCheck' as it has no actions.
:spotlessGroovyGradleCheck (Thread[Task worker for ':buildSrc' Thread 
5,5,main]) completed. Took 0.0 secs.
:spotlessCheck (Thread[Task worker for ':buildSrc' Thread 5,5,main]) started.

> Task :buildSrc:spotlessCheck
Skipping task ':buildSrc:spotlessCheck' as it has no actions.
:spotlessCheck (Thread[Task worker for ':buildSrc' Thread 5,5,main]) completed. 
Took 0.0 secs.
:compileTestJava (Thread[Task worker for ':buildSrc' Thread 5,5,main]) started.

> Task :buildSrc:compileTestJava NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:compileTestJava' as it has no source files and no 
previous output files.
:compileTestJava (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
completed. Took 0.003 secs.
:compileTestGroovy (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
started.

> Task :buildSrc:compileTestGroovy NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:compileTestGroovy' as it has no source files and no 
previous output files.
:compileTestGroovy (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
completed. Took 0.003 secs.
:processTestResources (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
started.

> Task :buildSrc:processTestResources NO-SOURCE
file or directory 
'
 not found
Skipping task ':buildSrc:processTestResources' as it has no source files and no 
previous output files.
:processTestResources (Thread[Task worker for ':buildSrc' Thread 5,5,main]) 
completed. Took 0.003 secs.
:testClasses (Thread[Task worker for ':buildSrc' Thread 5,5,main]) started.

> Task :buildSrc:testClasses UP-TO-DATE
Skipping task ':buildSrc:testClasses' as it has no actions.
:testClasses (Thread[Task worker for ':buildSrc' Thread 5,5,main]) completed. 
Took 0.0 secs.
:test (Thread[Task worker for ':buildSrc' Thread 3,5,main]) started.

> Task :buildSrc:test NO-SOURCE
Skipping task ':buildSrc:test' as it has no source files and no previous output 
files.
:test (Thread[Task worker for ':buildSrc' Thread 3,5,main]) completed. Took 
0.005 secs.
:check (Thread[Task worker for ':buildSrc' Thread 3,5,main]) started.

> Task :buildSrc:check
Skipping task ':buildSrc:check' as it has no actions.
:check (Thread[Task worker for ':buildSrc' Thread 3,5,main]) completed. Took 
0.0 secs.
:build (Thread[Task worker for ':buildSrc' Thread 3,5,main]) started.

> Task :buildSrc:build
Skipping task ':buildSrc: