Build failed in Jenkins: beam_PreCommit_Java_Cron #448

2018-10-10 Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 44.82 MB...]
:beam-sdks-java-io-hadoop-common:spotlessJavaCheck (Thread[Task worker for ':' 
Thread 3,5,main]) started.

> Task :beam-sdks-java-io-hadoop-common:spotlessJavaCheck
Skipping task ':beam-sdks-java-io-hadoop-common:spotlessJavaCheck' as it has no 
actions.
:beam-sdks-java-io-hadoop-common:spotlessJavaCheck (Thread[Task worker for ':' 
Thread 3,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-io-hadoop-common:spotlessCheck (Thread[Task worker for ':' 
Thread 3,5,main]) started.

> Task :beam-sdks-java-io-hadoop-common:spotlessCheck
Skipping task ':beam-sdks-java-io-hadoop-common:spotlessCheck' as it has no 
actions.
:beam-sdks-java-io-hadoop-common:spotlessCheck (Thread[Task worker for ':' 
Thread 3,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-io-hadoop-common:test (Thread[Task worker for ':' Thread 
3,5,main]) started.
Gradle Test Executor 138 started executing tests.
Gradle Test Executor 138 finished executing tests.

> Task :beam-sdks-java-io-hadoop-common:test
Build cache key for task ':beam-sdks-java-io-hadoop-common:test' is 
3614d695523fbf02cbf6ba2da1a03b1d
Task ':beam-sdks-java-io-hadoop-common:test' is not up-to-date because:
  No history is available.
Starting process 'Gradle Test Executor 138'. Working directory: 

 Command: /usr/local/asfpackages/java/jdk1.8.0_172/bin/java 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/4.10.2/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test 
Executor 138'
Successfully started process 'Gradle Test Executor 138'

org.apache.beam.sdk.io.hadoop.WritableCoderTest > 
testAutomaticRegistrationOfCoderProvider STANDARD_ERROR
log4j:WARN No appenders could be found for logger 
(org.apache.beam.sdk.coders.CoderRegistry).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for 
more info.
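
The log4j warnings above mean no appender is configured for the test JVM, so CoderRegistry's log output is dropped. A minimal sketch of a log4j.properties that would silence them (placed on the test classpath; the level and pattern here are illustrative assumptions, not what the Beam build actually uses):

```properties
# Minimal log4j 1.2 configuration: route the root logger to the console.
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n
```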
Finished generating test XML results (0.0 secs) into: 

Generating HTML test report...
Finished generating test html results (0.002 secs) into: 

Packing task ':beam-sdks-java-io-hadoop-common:test'
:beam-sdks-java-io-hadoop-common:test (Thread[Task worker for ':' Thread 
3,5,main]) completed. Took 1.177 secs.
:beam-sdks-java-io-hadoop-common:validateShadedJarDoesntLeakNonOrgApacheBeamClasses
 (Thread[Task worker for ':' Thread 3,5,main]) started.

> Task 
> :beam-sdks-java-io-hadoop-common:validateShadedJarDoesntLeakNonOrgApacheBeamClasses
Caching disabled for task 
':beam-sdks-java-io-hadoop-common:validateShadedJarDoesntLeakNonOrgApacheBeamClasses':
 Caching has not been enabled for the task
Task 
':beam-sdks-java-io-hadoop-common:validateShadedJarDoesntLeakNonOrgApacheBeamClasses'
 is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-io-hadoop-common:validateShadedJarDoesntLeakNonOrgApacheBeamClasses
 (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.002 secs.
:beam-sdks-java-io-hadoop-common:check (Thread[Task worker for ':' Thread 
3,5,main]) started.

> Task :beam-sdks-java-io-hadoop-common:check
Skipping task ':beam-sdks-java-io-hadoop-common:check' as it has no actions.
:beam-sdks-java-io-hadoop-common:check (Thread[Task worker for ':' Thread 
3,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-io-hadoop-common:build (Thread[Task worker for ':' Thread 
9,5,main]) started.

> Task :beam-sdks-java-io-hadoop-common:build
Skipping task ':beam-sdks-java-io-hadoop-common:build' as it has no actions.
:beam-sdks-java-io-hadoop-common:build (Thread[Task worker for ':' Thread 
9,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-io-hadoop-common:buildDependents (Thread[Task worker for ':' 
Thread 9,5,main]) started.

> Task :beam-sdks-java-io-hadoop-common:buildDependents
Caching disabled for task ':beam-sdks-java-io-hadoop-common:buildDependents': 
Caching has not been enabled for the task
Task ':beam-sdks-java-io-hadoop-common:buildDependents' is not up-to-d

Build failed in Jenkins: beam_PostCommit_Python_Verify #6240

2018-10-10 Apache Jenkins Server
See 


Changes:

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 1.26 MB...]
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
:276:
 DeprecationWarning: Please use assertEqual instead.
  self.assertEquals(len(files), 1)
ok
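
The DeprecationWarning above is about unittest naming: assertEquals is a deprecated alias of assertEqual. A minimal hypothetical test showing the preferred spelling (the file list here is a stand-in, not the real test fixture):

```python
import unittest


class FileCountTest(unittest.TestCase):
    """Hypothetical test illustrating the fix for the warning above."""

    def test_match_count(self):
        files = ["part-00000"]  # stand-in for the matched file list
        # was: self.assertEquals(len(files), 1)  -> DeprecationWarning
        self.assertEqual(len(files), 1)


if __name__ == "__main__":
    unittest.main()
```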
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_delete_table_fails_dataset_not_exist 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_fails_service_error 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_fails_table_not_exist 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 
(apache_beam.io.gcp

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1338

2018-10-10 Apache Jenkins Server
See 


Changes:

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 83.32 KB...]
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.11.0.tar.gz
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.11.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
  Could not find a version that satisfies the requirement pbr>=0.11 (from 
mock->-r postcommit_requirements.txt (line 2)) (from versions: )
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
No matching distribution found for pbr>=0.11 (from mock->-r 
postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ERROR
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded 
/tmp/dataflow-requirements-cache/setuptools-40.4.3.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.11.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
:54:
 DeprecationWarning: options is deprecated since First stable release. 
References to .options will not be supported
  print('Found: %s.' % self.build_console_url(pipeline.options))
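
The deprecation warning above comes from reading options back off an already-constructed pipeline via pipeline.options. The usual workaround is to keep your own reference to the options object used to build the pipeline. A hedged sketch (the class and the 'project' attribute are hypothetical illustrations, not Beam API):

```python
class ConsoleUrlReporter:
    """Hypothetical reporter that retains the options it was constructed with,
    instead of reading them back through the deprecated pipeline.options."""

    def __init__(self, options):
        self._options = options  # our own reference, kept at construction time

    def build_console_url(self, options):
        # Stand-in for the real URL builder; 'project' is an assumed attribute.
        return "console/%s" % getattr(options, "project", "unknown")

    def report(self):
        print("Found: %s." % self.build_console_url(self._options))
```

The print statement shown in the log then works without touching pipeline.options.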


Build failed in Jenkins: beam_PerformanceTests_AvroIOIT #1096

2018-10-10 Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 224.28 KB...]
at 
org.gradle.execution.taskgraph.LocalTaskInfoExecutor.execute(LocalTaskInfoExecutor.java:42)
at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:277)
at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:262)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:135)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:130)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.execute(DefaultTaskPlanExecutor.java:200)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.executeWithWork(DefaultTaskPlanExecutor.java:191)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.run(DefaultTaskPlanExecutor.java:130)
at 
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.gradle.api.internal.tasks.compile.CompilationFailedException: 
Compilation failed with exit code 1; see the compiler error output for details.
at 
net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:74)
at 
net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:23)
at 
org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.delegateAndHandleErrors(NormalizingJavaCompiler.java:100)
at 
org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:52)
at 
org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:38)
at 
org.gradle.api.internal.tasks.compile.CleaningJavaCompilerSupport.execute(CleaningJavaCompilerSupport.java:39)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:110)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:106)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:59)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:43)
at 
org.gradle.api.tasks.compile.JavaCompile.performCompilation(JavaCompile.java:153)
at 
org.gradle.api.tasks.compile.JavaCompile.compile(JavaCompile.java:121)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
at 
org.gradle.api.internal.project.taskfactory.IncrementalTaskAction.doExecute(IncrementalTaskAction.java:50)
at 
org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
at 
org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:131)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:292)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:174)
at 
org.gr

Build failed in Jenkins: beam_PerformanceTests_ParquetIOIT_HDFS #482

2018-10-10 Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 240.56 KB...]
at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:277)
at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:262)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:135)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:130)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.execute(DefaultTaskPlanExecutor.java:200)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.executeWithWork(DefaultTaskPlanExecutor.java:191)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.run(DefaultTaskPlanExecutor.java:130)
at 
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.gradle.api.internal.tasks.compile.CompilationFailedException: 
Compilation failed with exit code 1; see the compiler error output for details.
at 
net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:74)
at 
net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:23)
at 
org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.delegateAndHandleErrors(NormalizingJavaCompiler.java:100)
at 
org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:52)
at 
org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:38)
at 
org.gradle.api.internal.tasks.compile.CleaningJavaCompilerSupport.execute(CleaningJavaCompilerSupport.java:39)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:110)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:106)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:59)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:43)
at 
org.gradle.api.tasks.compile.JavaCompile.performCompilation(JavaCompile.java:153)
at 
org.gradle.api.tasks.compile.JavaCompile.compile(JavaCompile.java:121)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
at 
org.gradle.api.internal.project.taskfactory.IncrementalTaskAction.doExecute(IncrementalTaskAction.java:50)
at 
org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
at 
org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:131)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:292)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:174)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:90)

Build failed in Jenkins: beam_PerformanceTests_Compressed_TextIOIT_HDFS #765

2018-10-10 Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 244.06 KB...]
at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:277)
at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:262)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:135)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:130)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.execute(DefaultTaskPlanExecutor.java:200)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.executeWithWork(DefaultTaskPlanExecutor.java:191)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.run(DefaultTaskPlanExecutor.java:130)
at 
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.gradle.api.internal.tasks.compile.CompilationFailedException: 
Compilation failed with exit code 1; see the compiler error output for details.
at 
net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:74)
at 
net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:23)
at 
org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.delegateAndHandleErrors(NormalizingJavaCompiler.java:100)
at 
org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:52)
at 
org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:38)
at 
org.gradle.api.internal.tasks.compile.CleaningJavaCompilerSupport.execute(CleaningJavaCompilerSupport.java:39)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:110)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:106)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:59)
at 
org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:43)
at 
org.gradle.api.tasks.compile.JavaCompile.performCompilation(JavaCompile.java:153)
at 
org.gradle.api.tasks.compile.JavaCompile.compile(JavaCompile.java:121)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
at 
org.gradle.api.internal.project.taskfactory.IncrementalTaskAction.doExecute(IncrementalTaskAction.java:50)
at 
org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
at 
org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:131)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:292)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:174)
at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #764

2018-10-10 Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 240.60 KB...]
at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:277)
at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:262)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:135)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:130)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.execute(DefaultTaskPlanExecutor.java:200)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.executeWithWork(DefaultTaskPlanExecutor.java:191)
at 
org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.run(DefaultTaskPlanExecutor.java:130)
at 
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.gradle.api.internal.tasks.compile.CompilationFailedException: Compilation failed with exit code 1; see the compiler error output for details.
at net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:74)
at net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:23)
at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.delegateAndHandleErrors(NormalizingJavaCompiler.java:100)
at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:52)
at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:38)
at org.gradle.api.internal.tasks.compile.CleaningJavaCompilerSupport.execute(CleaningJavaCompilerSupport.java:39)
at org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:110)
at org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:106)
at org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:59)
at org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:43)
at org.gradle.api.tasks.compile.JavaCompile.performCompilation(JavaCompile.java:153)
at org.gradle.api.tasks.compile.JavaCompile.compile(JavaCompile.java:121)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
at org.gradle.api.internal.project.taskfactory.IncrementalTaskAction.doExecute(IncrementalTaskAction.java:50)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:131)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:292)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:174)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:90)
   

Build failed in Jenkins: beam_PerformanceTests_TextIOIT #1117

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 233.50 KB...]
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.run(EventFiringTaskExecuter.java:51)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:292)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:174)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:90)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
at org.gradle.execution.taskgraph.LocalTaskInfoExecutor.execute(LocalTaskInfoExecutor.java:42)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:277)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:262)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:135)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:130)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.execute(DefaultTaskPlanExecutor.java:200)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.executeWithWork(DefaultTaskPlanExecutor.java:191)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.run(DefaultTaskPlanExecutor.java:130)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor.process(DefaultTaskPlanExecutor.java:74)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph.execute(DefaultTaskExecutionGraph.java:143)
at org.gradle.execution.SelectedTaskExecutionAction.execute(SelectedTaskExecutionAction.java:40)
at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:40)
at org.gradle.execution.DefaultBuildExecuter.access$000(DefaultBuildExecuter.java:24)
at org.gradle.execution.DefaultBuildExecuter$1.proceed(DefaultBuildExecuter.java:46)
at org.gradle.execution.DryRunBuildExecutionAction.execute(DryRunBuildExecutionAction.java:49)
at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:40)
at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:33)
at org.gradle.initialization.DefaultGradleLauncher$ExecuteTasks.run(DefaultGradleLauncher.java:355)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:292)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:174)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:90)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
at org.gradle.initialization.DefaultGradleLauncher.runTasks(DefaultGradleLauncher.java:219)
at org.gradle.initialization.DefaultGradleLauncher.doBuildStages(DefaultGradleLauncher.java:149)
at org.gradle.initialization.DefaultGradleLauncher.executeTasks(DefaultGradleLauncher.java:124)
at org.gradle.internal.invocation.GradleBuildController$1.call(GradleBuildController.java:77)
at org.gradle.internal.invocation.GradleBuildController$1.call(GradleBuildController.java:74)
at org.gradle.internal.work.DefaultWorkerLeaseService.withLocks(DefaultWorkerLeaseService.java:154)
at org.gradle.internal.work.StopShieldingWorkerLeaseService.withLocks(StopShieldingWorkerLeaseService.java:38)
at org.gradle.internal.invocation.GradleBuildController.doBuild(GradleBuildController.java:96)
at org.gradle

[jira] [Work logged] (BEAM-5698) Migrate Dataflow tests to use a staged dataflow worker jar

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5698?focusedWorklogId=153416&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153416
 ]

ASF GitHub Bot logged work on BEAM-5698:


Author: ASF GitHub Bot
Created on: 11/Oct/18 06:09
Start Date: 11/Oct/18 06:09
Worklog Time Spent: 10m 
  Work Description: HuangLED commented on a change in pull request #6626: 
[BEAM-5698] Migrate Go integration test to use Dataflow worker jar
URL: https://github.com/apache/beam/pull/6626#discussion_r224324481
 
 

 ##
 File path: sdks/go/test/run_integration_tests.sh
 ##
 @@ -79,7 +82,8 @@ echo ">>> RUNNING DATAFLOW INTEGRATION TESTS"
 --environment_config=$CONTAINER:$TAG \
 --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
 --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
---worker_binary=./sdks/go/test/build/bin/linux-amd64/worker
+--worker_binary=./sdks/go/test/build/bin/linux-amd64/worker \
+--dataflow_worker_jar=$DATAFLOW_WORKER_JAR
 
 Review comment:
   I assume this integration test is meant to exercise the portable-runner code path (apparently Go does not even have a legacy world).
   
   If that is the case, why does triggering a job via the portable runner not require a docker image on the command line, unlike the HOW-TO at 
https://g3doc.corp.google.com/cloud/dataflow/portability/g3doc/howto.md?cl=head
   
   My hypothesis was that, under the hood, what is inside the JAR is essentially equivalent to what is in the docker image. But if that is the case, why don't we stick with the docker image?
   
   What am I missing?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 153416)
Time Spent: 1.5h  (was: 1h 20m)

> Migrate Dataflow tests to use a staged dataflow worker jar
> --
>
> Key: BEAM-5698
> URL: https://issues.apache.org/jira/browse/BEAM-5698
> Project: Beam
>  Issue Type: Task
>  Components: runner-dataflow
>Reporter: Henning Rohde
>Priority: Major
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> Needs to be done for all Dataflow testing at HEAD for all SDKs, except legacy 
> Python batch. For java legacy jobs, we should not specify a worker harness 
> container image.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PerformanceTests_JDBC #1198

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 193.54 KB...]
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:292)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:174)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:90)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
at org.gradle.execution.taskgraph.LocalTaskInfoExecutor.execute(LocalTaskInfoExecutor.java:42)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:277)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:262)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:135)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:130)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.execute(DefaultTaskPlanExecutor.java:200)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.executeWithWork(DefaultTaskPlanExecutor.java:191)
at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.run(DefaultTaskPlanExecutor.java:130)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.internal.tasks.compile.CompilationFailedException: Compilation failed with exit code 1; see the compiler error output for details.
at net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:74)
at net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:23)
at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.delegateAndHandleErrors(NormalizingJavaCompiler.java:100)
at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:52)
at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:38)
at org.gradle.api.internal.tasks.compile.CleaningJavaCompilerSupport.execute(CleaningJavaCompilerSupport.java:39)
at org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:110)
at org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:106)
at org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:59)
at org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:43)
at org.gradle.api.tasks.compile.JavaCompile.performCompilation(JavaCompile.java:153)
at org.gradle.api.tasks.compile.JavaCompile.compile(JavaCompile.java:121)
at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
at org.gradle.api.internal.project.taskfactory.IncrementalTaskAction.doExecute(IncrementalTaskAction.java:50)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:131)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
at org.gradle.internal.operations.DefaultBuildOperationExecu

Build failed in Jenkins: beam_PostCommit_Go_GradleBuild #1251

2018-10-10 Thread Apache Jenkins Server
See 


--
[...truncated 634.34 KB...]
5: Impulse [] -> [Out: []uint8 -> {11: []uint8/bytes GLO}]
6: ParDo [In(Main): []uint8 <- {11: []uint8/bytes GLO} In(Iter): T <- {10: int/int[varintz] GLO} In(Iter): T <- {2: int/int[varintz] GLO}] -> [Out: T -> {12: int/int[varintz] GLO} Out: T -> {13: int/int[varintz] GLO} Out: T -> {14: int/int[varintz] GLO}]
7: ParDo [In(Main): X <- {12: int/int[varintz] GLO}] -> []
8: ParDo [In(Main): X <- {14: int/int[varintz] GLO}] -> []
2018/10/11 06:08:17 Plan[plan]:
12: Impulse[0]
13: Impulse[0]
1: ParDo[passert.failFn] Out:[]
2: Discard
3: ParDo[passert.failFn] Out:[]
4: ParDo[passert.diffFn] Out:[1 2 3]
5: wait[2] Out:4
6: buffer[6]. wait:5 Out:4
7: buffer[7]. wait:5 Out:4
8: Flatten[7]. Out:buffer[6]. wait:5 Out:4
9: ParDo[beam.partitionFn] Out:[8 8 8 8 8 8 8]
10: Multiplex. Out:[9 7]
11: ParDo[beam.createFn] Out:[10]
2018/10/11 06:08:17 wait[5] unblocked w/ 1 [false]
2018/10/11 06:08:17 wait[5] done
--- PASS: TestPartitionFlattenIdentity (0.00s)
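The plan above wires beam.createFn through beam.partitionFn, a Flatten, and passert.diffFn. The identity the test verifies — partitioning a collection and flattening the parts yields the same elements back — can be sketched in plain Java (an illustrative stand-in, not Beam SDK code; the method names are invented for this sketch):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class PartitionFlattenSketch {
  // Split xs into n buckets using a partition function, as beam.Partition does.
  static List<List<Integer>> partition(List<Integer> xs, int n) {
    List<List<Integer>> buckets = new ArrayList<>();
    for (int i = 0; i < n; i++) buckets.add(new ArrayList<>());
    for (int x : xs) buckets.get(Math.floorMod(x, n)).add(x);
    return buckets;
  }

  // Concatenate the buckets back together, as beam.Flatten does.
  static List<Integer> flatten(List<List<Integer>> buckets) {
    List<Integer> out = new ArrayList<>();
    for (List<Integer> b : buckets) out.addAll(b);
    return out;
  }

  public static void main(String[] args) {
    List<Integer> xs = Arrays.asList(1, 2, 3, 4, 5, 6, 7);
    List<Integer> roundTrip = flatten(partition(xs, 3));
    // The identity: the same elements survive the round trip, though
    // their order may change (hence a diff, not an equality, assertion).
    List<Integer> a = new ArrayList<>(xs);
    List<Integer> b = new ArrayList<>(roundTrip);
    a.sort(null);
    b.sort(null);
    System.out.println(a.equals(b)); // prints "true"
  }
}
```

Because partitioning may reorder elements across buckets, the test compares multisets rather than sequences, which is why the plan uses passert.diffFn.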
=== RUN   Example_metricsDeclaredAnywhere
--- PASS: Example_metricsDeclaredAnywhere (0.00s)
=== RUN   Example_metricsReusable
--- PASS: Example_metricsReusable (0.00s)
PASS
coverage: 44.8% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam 0.016s  coverage: 44.8% of statements
=== RUN   TestOptions
--- PASS: TestOptions (0.00s)
=== RUN   TestKey
--- PASS: TestKey (0.00s)
=== RUN   TestRegister
--- PASS: TestRegister (0.00s)
PASS
coverage: 47.1% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/runtime 0.003s  coverage: 47.1% of statements
=== RUN   TestMergeMaps
--- PASS: TestMergeMaps (0.00s)
=== RUN   TestShallowClone
--- PASS: TestShallowClone (0.00s)
=== RUN   TestShallowCloneNil
--- PASS: TestShallowCloneNil (0.00s)
PASS
coverage: 6.4% of statements
ok  github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx  0.003s  coverage: 6.4% of statements

> Task :beam-sdks-go:test
Test for github.com/apache/beam/sdks/go/pkg/beam finished, 7 completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/runtime:
Test for github.com/apache/beam/sdks/go/pkg/beam/core/runtime finished, 3 completed, 0 failed
Result of package github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx:
Test for github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx finished, 3 completed, 0 failed
Generating HTML test report...
Finished generating test html results (0.133 secs) into: 

:beam-sdks-go:test (Thread[Task worker for ':' Thread 4,5,main]) completed. 
Took 23.895 secs.
:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 4,5,main]) 
started.

> Task :beam-sdks-go-container:prepare
Caching disabled for task ':beam-sdks-go-container:prepare': Caching has not 
been enabled for the task
Task ':beam-sdks-go-container:prepare' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Use project GOPATH: 

:beam-sdks-go-container:prepare (Thread[Task worker for ':' Thread 4,5,main]) 
completed. Took 0.001 secs.
:beam-sdks-go-container:resolveBuildDependencies (Thread[Task worker for ':' 
Thread 4,5,main]) started.

> Task :beam-sdks-go-container:resolveBuildDependencies UP-TO-DATE
Build cache key for task ':beam-sdks-go-container:resolveBuildDependencies' is 
ce1e671d98152c7df02ca39192fdea06
Caching disabled for task ':beam-sdks-go-container:resolveBuildDependencies': 
Caching has not been enabled for the task
Skipping task ':beam-sdks-go-container:resolveBuildDependencies' as it is 
up-to-date.
:beam-sdks-go-container:resolveBuildDependencies (Thread[Task worker for ':' 
Thread 4,5,main]) completed. Took 0.032 secs.
:beam-sdks-go-container:installDependencies (Thread[Task worker for ':' Thread 
4,5,main]) started.

> Task :beam-sdks-go-container:installDependencies
Caching disabled for task ':beam-sdks-go-container:installDependencies': 
Caching has not been enabled for the task
Task ':beam-sdks-go-container:installDependencies' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-go-container:installDependencies (Thread[Task worker for ':' Thread 
4,5,main]) completed. Took 0.677 secs.
:beam-sdks-go-container:buildLinuxAmd64 (Thread[Task worker for ':' Thread 
4,5,main]) started.

> Task :beam-sdks-go-container:buildLinuxAmd64 UP-TO-DATE
Build cache key for task ':beam-sdks-go-container:buildLinuxAmd64' is 
3a55f1d12988073a3063e22d7fc1f92a
Caching disabled for task ':beam-sdks-go-container:buildLinuxAmd64': Caching 
has not been enabled for the task
Skipping task ':beam-sdks-go-container:buildLinuxAmd64' as it is up-to-date.
:beam-sdks-go-container:buildLinux

[jira] [Updated] (BEAM-5714) RedisIO emit error of EXEC without MULTI

2018-10-10 Thread K.K. POON (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5714?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

K.K. POON updated BEAM-5714:

Description: 
RedisIO has EXEC without MULTI error after SET a batch of records.

 

By looking at the source code, I guess a `pipeline.multi();` call is missing 
after the exec() of the last batch.

[https://github.com/apache/beam/blob/master/sdks/java/io/redis/src/main/java/org/apache/beam/sdk/io/redis/RedisIO.java#L555]

  was:
RedisIO has EXEC without MULTI error after set a batch of records.

 

By looking at the source code, I guess there is missing `pipeline.multi();` 
after exec() the last batch.

[https://github.com/apache/beam/blob/master/sdks/java/io/redis/src/main/java/org/apache/beam/sdk/io/redis/RedisIO.java#L555]


> RedisIO emit error of EXEC without MULTI
> 
>
> Key: BEAM-5714
> URL: https://issues.apache.org/jira/browse/BEAM-5714
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-redis
>Affects Versions: 2.7.0
>Reporter: K.K. POON
>Assignee: Jean-Baptiste Onofré
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> RedisIO has EXEC without MULTI error after SET a batch of records.
>  
> By looking at the source code, I guess a `pipeline.multi();` call is missing 
> after the exec() of the last batch.
> [https://github.com/apache/beam/blob/master/sdks/java/io/redis/src/main/java/org/apache/beam/sdk/io/redis/RedisIO.java#L555]
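The suspected pattern behind BEAM-5714 — flushing a full batch with exec() but not reopening a transaction with multi() before queuing the next batch — can be shown with a minimal self-contained sketch. FakePipeline below is an illustrative stand-in for the Jedis Pipeline API (not Beam or Jedis code) that enforces the MULTI/EXEC pairing Redis requires:

```java
import java.util.ArrayList;
import java.util.List;

public class RedisBatchSketch {
  // Stand-in for a Jedis pipeline that enforces MULTI/EXEC pairing.
  static class FakePipeline {
    boolean inMulti = false;
    int queued = 0;
    List<String> log = new ArrayList<>();

    void multi() { inMulti = true; log.add("MULTI"); }

    void set(String key, String value) {
      if (!inMulti) throw new IllegalStateException("SET outside MULTI");
      queued++;
      log.add("SET " + key);
    }

    void exec() {
      // This is the error Redis reports when MULTI was never reopened.
      if (!inMulti) throw new IllegalStateException("ERR EXEC without MULTI");
      inMulti = false;
      queued = 0;
      log.add("EXEC");
    }
  }

  // Writes keys in batches of batchSize, reopening MULTI after each EXEC.
  // Dropping the marked multi() call reproduces "EXEC without MULTI" on
  // the next batch -- the fix the JIRA description suggests.
  static List<String> write(List<String> keys, int batchSize) {
    FakePipeline p = new FakePipeline();
    p.multi();
    for (String k : keys) {
      p.set(k, "value");
      if (p.queued >= batchSize) {
        p.exec();
        p.multi(); // the reopening step suspected to be missing in RedisIO
      }
    }
    p.exec(); // flush the final (possibly partial) batch
    return p.log;
  }

  public static void main(String[] args) {
    System.out.println(write(java.util.Arrays.asList("k0", "k1", "k2"), 2));
  }
}
```

With three keys and a batch size of two, the command log is MULTI, SET k0, SET k1, EXEC, MULTI, SET k2, EXEC; removing the inner multi() makes the second batch's commands fail.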



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #325

2018-10-10 Thread Apache Jenkins Server
See 


--
[...truncated 256.08 KB...]
Collecting funcsigs>=1; python_version < "3.3" (from mock==2.0.0->-r 
/tmp/base_image_requirements.txt (line 36))
  Downloading 
https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock==2.0.0->-r /tmp/base_image_requirements.txt 
(line 36))
  Downloading 
https://files.pythonhosted.org/packages/01/0a/1e81639e7ed6aa51554ab05827984d07885d6873e612a97268ab3d80c73f/pbr-4.3.0-py2.py3-none-any.whl
 (106kB)
Collecting pyasn1>=0.1.7 (from oauth2client==3.0.0->-r 
/tmp/base_image_requirements.txt (line 37))
  Downloading 
https://files.pythonhosted.org/packages/d1/a1/7790cc85db38daa874f6a2e6308131b9953feb1367f2ae2d1123bb93a9f5/pyasn1-0.4.4-py2.py3-none-any.whl
 (72kB)
Collecting pyasn1-modules>=0.0.5 (from oauth2client==3.0.0->-r 
/tmp/base_image_requirements.txt (line 37))
  Downloading 
https://files.pythonhosted.org/packages/19/02/fa63f7ba30a0d7b925ca29d034510fc1ffde53264b71b4155022ddf3ab5d/pyasn1_modules-0.2.2-py2.py3-none-any.whl
 (62kB)
Collecting rsa>=3.1.4 (from oauth2client==3.0.0->-r 
/tmp/base_image_requirements.txt (line 37))
  Downloading 
https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools==0.5.20->-r 
/tmp/base_image_requirements.txt (line 48))
  Downloading 
https://files.pythonhosted.org/packages/14/3a/096c7ad18e102d4f219f5dd15951f9728ca5092a3385d2e8f79a7c1e1017/fasteners-0.14.1-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.12dev,>=0.11.1 (from 
google-cloud-pubsub==0.35.4->-r /tmp/base_image_requirements.txt (line 50))
  Downloading 
https://files.pythonhosted.org/packages/9b/28/f26f67381cb23e81271b8d66c00a846ad9d25a909ae1ae1df8222fad2744/grpc-google-iam-v1-0.11.4.tar.gz
Collecting google-api-core[grpc]<2.0.0dev,>=0.1.3 (from 
google-cloud-pubsub==0.35.4->-r /tmp/base_image_requirements.txt (line 50))
  Downloading 
https://files.pythonhosted.org/packages/a5/be/de30100034c391f4c56e2543f1507eb1b30b3030bd9a6764dd6cfe7a954e/google_api_core-1.4.1-py2.py3-none-any.whl
 (53kB)
Collecting google-cloud-core<0.26dev,>=0.25.0 (from 
google-cloud-bigquery==0.25.0->-r /tmp/base_image_requirements.txt (line 51))
  Downloading 
https://files.pythonhosted.org/packages/ef/dd/00e90bd1f6788f06ca5ea83a0ec8dd76350b38303bb8f09d2bf692eb1294/google_cloud_core-0.25.0-py2.py3-none-any.whl
 (52kB)
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from 
proto-google-cloud-datastore-v1==0.90.4->-r /tmp/base_image_requirements.txt 
(line 52))
  Downloading 
https://files.pythonhosted.org/packages/00/03/d25bed04ec8d930bcfa488ba81a2ecbf7eb36ae3ffd7e8f5be0d036a89c9/googleapis-common-protos-1.5.3.tar.gz
Collecting python-dateutil>=2.5.0 (from pandas==0.23.4->-r 
/tmp/base_image_requirements.txt (line 61))
  Downloading 
https://files.pythonhosted.org/packages/cf/f5/af2b09c957ace60dcfac112b669c45c8c97e32f94aa8b56da4c6d1682825/python_dateutil-2.7.3-py2.py3-none-any.whl
 (211kB)
Collecting astor>=0.6.0 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/35/6b/11530768cac581a12952a2aad00e1526b89d242d0b9f59534ef6e6a1752f/astor-0.7.1-py2.py3-none-any.whl
Collecting gast>=0.2.0 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/5c/78/ff794fcae2ce8aa6323e789d1f8b3b7765f601e7702726f430e814822b96/gast-0.2.0.tar.gz
Collecting keras-applications>=1.0.5 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/3f/c4/2ff40221029f7098d58f8d7fb99b97e8100f3293f9856f0fb5834bef100b/Keras_Applications-1.0.6-py2.py3-none-any.whl
 (44kB)
Collecting keras-preprocessing>=1.0.3 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/fc/94/74e0fa783d3fc07e41715973435dd051ca89c550881b3454233c39c73e69/Keras_Preprocessing-1.0.5-py2.py3-none-any.whl
Collecting absl-py>=0.1.6 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/16/db/cce5331638138c178dd1d5fb69f3f55eb3787a12efd9177177ae203e847f/absl-py-0.5.0.tar.gz
 (90kB)
Collecting backports.weakref>=1.0rc1 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/88/ec/f598b633c3d5ffe267aaada57d961c94fdfa183c5c3ebda2b6d151943db6/backports.weakref-1.0.post1-py2.py3-none-any.whl
Requirement already satisfied: wheel in /usr/local/lib/python2.7/site-packages 
(from tensorflow==1.11.0->-r /tmp/base_image_requirements.txt (line 65)) 
(0.32.1)
Collecting tensorboa

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Samza_Gradle #877

2018-10-10 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle #1241

2018-10-10 Thread Apache Jenkins Server
See 


--
[...truncated 60.72 KB...]
Skipping task ':beam-model-job-management:classes' as it has no actions.
:beam-model-job-management:classes (Thread[Task worker for ':' Thread 
26,5,main]) completed. Took 0.0 secs.

> Task :beam-model-pipeline:shadowJar
Build cache key for task ':beam-model-pipeline:shadowJar' is 
63b267f240be79a047f99b9e61ceb4be
Caching disabled for task ':beam-model-pipeline:shadowJar': Caching has not 
been enabled for the task
Task ':beam-model-pipeline:shadowJar' is not up-to-date because:
  No history is available.

> Task :beam-sdks-java-core:shadowJar
Build cache key for task ':beam-sdks-java-core:shadowJar' is 
57217c4b9f05ab74db24c0268e8a333f
Caching disabled for task ':beam-sdks-java-core:shadowJar': Caching has not 
been enabled for the task
Task ':beam-sdks-java-core:shadowJar' is not up-to-date because:
  No history is available.
***
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 6.857s [6857ms]
Average Time/Jar: 1.14283s [1142.83ms]
***
:beam-sdks-java-core:shadowJar (Thread[Task worker for ':' Thread 3,5,main]) 
completed. Took 10.316 secs.
:beam-sdks-java-extensions-google-cloud-platform-core:compileJava (Thread[Task 
worker for ':' Thread 3,5,main]) started.
:beam-sdks-java-extensions-protobuf:extractIncludeProto (Thread[Task worker for 
':' Thread 14,5,main]) started.
:beam-sdks-java-core:generateTestAvroProtocol (Thread[Task worker for ':' 
Thread 79,5,main]) started.

> Task :beam-sdks-java-core:generateTestAvroProtocol NO-SOURCE
Skipping task ':beam-sdks-java-core:generateTestAvroProtocol' as it has no 
source files and no previous output files.
:beam-sdks-java-core:generateTestAvroProtocol (Thread[Task worker for ':' 
Thread 79,5,main]) completed. Took 0.002 secs.
:beam-sdks-java-core:generateTestAvroJava (Thread[Task worker for ':' Thread 
79,5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:extractIncludeProto
Build cache key for task 
':beam-sdks-java-extensions-protobuf:extractIncludeProto' is 
fd62c226ef2be1384d6275fbcc2d2184
Caching disabled for task 
':beam-sdks-java-extensions-protobuf:extractIncludeProto': Caching has not been 
enabled for the task
Task ':beam-sdks-java-extensions-protobuf:extractIncludeProto' is not 
up-to-date because:
  No history is available.
:beam-sdks-java-extensions-protobuf:extractIncludeProto (Thread[Task worker for 
':' Thread 14,5,main]) completed. Took 0.304 secs.
:beam-sdks-java-extensions-protobuf:generateProto (Thread[Task worker for ':' 
Thread 14,5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:generateProto NO-SOURCE
file or directory 
'
 not found
Skipping task ':beam-sdks-java-extensions-protobuf:generateProto' as it has no 
source files and no previous output files.
:beam-sdks-java-extensions-protobuf:generateProto (Thread[Task worker for ':' 
Thread 14,5,main]) completed. Took 0.001 secs.
:beam-sdks-java-extensions-protobuf:compileJava (Thread[Task worker for ':' 
Thread 55,5,main]) started.

> Task :beam-sdks-java-core:generateTestAvroJava
Build cache key for task ':beam-sdks-java-core:generateTestAvroJava' is 
e485090fed57ed9406a491ee86e365cc
Caching disabled for task ':beam-sdks-java-core:generateTestAvroJava': Caching 
has not been enabled for the task
Task ':beam-sdks-java-core:generateTestAvroJava' is not up-to-date because:
  No history is available.
Found 1 files
Processed src/test/avro/org/apache/beam/sdk/io/user.avsc
:beam-sdks-java-core:generateTestAvroJava (Thread[Task worker for ':' Thread 
79,5,main]) completed. Took 0.616 secs.
:beam-sdks-java-core:compileTestJava (Thread[Task worker for ':' Thread 
79,5,main]) started.

> Task :beam-model-pipeline:shadowJar
***
GRADLE SHADOW STATS

Total Jars: 36 (includes project)
Total Time: 9.833s [9833ms]
Average Time/Jar: 0.273138889s [273.138889ms]
***
:beam-model-pipeline:shadowJar (Thread[Task worker for ':' Thread 21,5,main]) 
completed. Took 11.883 secs.
:beam-model-job-management:shadowJar (Thread[Task worker for ':' Thread 
21,5,main]) started.

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/4.10.2/fileHashes/fileHashes.bin
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/4.10.2/fileHashes/resourceHashesCache.bin
Build cache key for task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is 
f3d927ba4b26afffd38f9b2b8c667975
Task ':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is not 
up-to-date because:
  No history is available.
Custom actions are attached to task 
':beam-sdks-java-extensions-google-cloud-pla

Build failed in Jenkins: beam_PreCommit_Website_Stage_GCS_Cron #27

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 528.24 KB...]
NotFoundException: 404 The destination bucket 
gs://apache-beam-website-pull-requests does not exist or the write to the 
destination must be restarted
NotFoundException: 404 The destination bucket 
gs://apache-beam-website-pull-requests does not exist or the write to the 
destination must be restarted
Copying 
file://
 [Content-Type=text/html]...
Copying 
file://
 [Content-Type=text/html]...
Copying 
file://
 [Content-Type=text/html]...
Copying 
file://
 [Content-Type=text/html]...
Copying 
file://
 [Content-Type=text/html]...
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
Copying 
file://
 [Content-Type=text/html]...
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
Copying 
file://
 [Content-Type=application/font-sfnt]...
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
Copying 
file://
 [Content-Type=application/vnd.ms-fontobject]...
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
Copying 
file://
 [Content-Type=application/font-woff]...
Copying 
file://
 [Content-Type=application/octet-stream]...
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
Copying 
file://
 [Content-Type=image/svg+xml]...
\ [0/326 files][ 25.8 MiB/ 27.0 MiB]  95% Done  
Copying 
file://
 [Content-Type=text/html]...
\ [0/326 files][ 25.9 MiB/ 27.0 MiB]  95% Done  
NotFoundException: 404 The destination bucket 
gs://apache-beam-website-pull-requests does not exist or the write to the 
destination must be restarted
Copying 
file://
 [Content-Type=text/html]...
\ [0/326 files][ 25.9 MiB/ 27.0 MiB]  96% Done  
NotFoundException: 404 The destination bucket 
gs://apache-beam-website-pull-requests does not exist or the write to the 
destination must be restarted
NotFoundExceptio

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #324

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 257.68 KB...]
Collecting funcsigs>=1; python_version < "3.3" (from mock==2.0.0->-r 
/tmp/base_image_requirements.txt (line 36))
  Downloading 
https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock==2.0.0->-r /tmp/base_image_requirements.txt 
(line 36))
  Downloading 
https://files.pythonhosted.org/packages/01/0a/1e81639e7ed6aa51554ab05827984d07885d6873e612a97268ab3d80c73f/pbr-4.3.0-py2.py3-none-any.whl
 (106kB)
Collecting pyasn1>=0.1.7 (from oauth2client==3.0.0->-r 
/tmp/base_image_requirements.txt (line 37))
  Downloading 
https://files.pythonhosted.org/packages/d1/a1/7790cc85db38daa874f6a2e6308131b9953feb1367f2ae2d1123bb93a9f5/pyasn1-0.4.4-py2.py3-none-any.whl
 (72kB)
Collecting pyasn1-modules>=0.0.5 (from oauth2client==3.0.0->-r 
/tmp/base_image_requirements.txt (line 37))
  Downloading 
https://files.pythonhosted.org/packages/19/02/fa63f7ba30a0d7b925ca29d034510fc1ffde53264b71b4155022ddf3ab5d/pyasn1_modules-0.2.2-py2.py3-none-any.whl
 (62kB)
Collecting rsa>=3.1.4 (from oauth2client==3.0.0->-r 
/tmp/base_image_requirements.txt (line 37))
  Downloading 
https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools==0.5.20->-r 
/tmp/base_image_requirements.txt (line 48))
  Downloading 
https://files.pythonhosted.org/packages/14/3a/096c7ad18e102d4f219f5dd15951f9728ca5092a3385d2e8f79a7c1e1017/fasteners-0.14.1-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.12dev,>=0.11.1 (from 
google-cloud-pubsub==0.35.4->-r /tmp/base_image_requirements.txt (line 50))
  Downloading 
https://files.pythonhosted.org/packages/9b/28/f26f67381cb23e81271b8d66c00a846ad9d25a909ae1ae1df8222fad2744/grpc-google-iam-v1-0.11.4.tar.gz
Collecting google-api-core[grpc]<2.0.0dev,>=0.1.3 (from 
google-cloud-pubsub==0.35.4->-r /tmp/base_image_requirements.txt (line 50))
  Downloading 
https://files.pythonhosted.org/packages/a5/be/de30100034c391f4c56e2543f1507eb1b30b3030bd9a6764dd6cfe7a954e/google_api_core-1.4.1-py2.py3-none-any.whl
 (53kB)
Collecting google-cloud-core<0.26dev,>=0.25.0 (from 
google-cloud-bigquery==0.25.0->-r /tmp/base_image_requirements.txt (line 51))
  Downloading 
https://files.pythonhosted.org/packages/ef/dd/00e90bd1f6788f06ca5ea83a0ec8dd76350b38303bb8f09d2bf692eb1294/google_cloud_core-0.25.0-py2.py3-none-any.whl
 (52kB)
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from 
proto-google-cloud-datastore-v1==0.90.4->-r /tmp/base_image_requirements.txt 
(line 52))
  Downloading 
https://files.pythonhosted.org/packages/00/03/d25bed04ec8d930bcfa488ba81a2ecbf7eb36ae3ffd7e8f5be0d036a89c9/googleapis-common-protos-1.5.3.tar.gz
Collecting python-dateutil>=2.5.0 (from pandas==0.23.4->-r 
/tmp/base_image_requirements.txt (line 61))
  Downloading 
https://files.pythonhosted.org/packages/cf/f5/af2b09c957ace60dcfac112b669c45c8c97e32f94aa8b56da4c6d1682825/python_dateutil-2.7.3-py2.py3-none-any.whl
 (211kB)
Collecting astor>=0.6.0 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/35/6b/11530768cac581a12952a2aad00e1526b89d242d0b9f59534ef6e6a1752f/astor-0.7.1-py2.py3-none-any.whl
Collecting gast>=0.2.0 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/5c/78/ff794fcae2ce8aa6323e789d1f8b3b7765f601e7702726f430e814822b96/gast-0.2.0.tar.gz
Collecting backports.weakref>=1.0rc1 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/88/ec/f598b633c3d5ffe267aaada57d961c94fdfa183c5c3ebda2b6d151943db6/backports.weakref-1.0.post1-py2.py3-none-any.whl
Collecting keras-applications>=1.0.5 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/3f/c4/2ff40221029f7098d58f8d7fb99b97e8100f3293f9856f0fb5834bef100b/Keras_Applications-1.0.6-py2.py3-none-any.whl
 (44kB)
Collecting keras-preprocessing>=1.0.3 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/fc/94/74e0fa783d3fc07e41715973435dd051ca89c550881b3454233c39c73e69/Keras_Preprocessing-1.0.5-py2.py3-none-any.whl
Collecting absl-py>=0.1.6 (from tensorflow==1.11.0->-r 
/tmp/base_image_requirements.txt (line 65))
  Downloading 
https://files.pythonhosted.org/packages/16/db/cce5331638138c178dd1d5fb69f3f55eb3787a12efd9177177ae203e847f/absl-py-0.5.0.tar.gz
 (90kB)
Collecting termcolor>=1.1.0 (from tensorflow==1.11.0->-r 
/tmp/base_image_require

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle #1240

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 59.23 KB...]

> Task :beam-sdks-java-core:classes
Skipping task ':beam-sdks-java-core:classes' as it has no actions.
:beam-sdks-java-core:classes (Thread[Task worker for ':' Thread 6,5,main]) 
completed. Took 0.0 secs.
:beam-sdks-java-core:shadowJar (Thread[Task worker for ':' Thread 6,5,main]) 
started.

> Task :beam-model-pipeline:shadowJar
Build cache key for task ':beam-model-pipeline:shadowJar' is 
63b267f240be79a047f99b9e61ceb4be
Caching disabled for task ':beam-model-pipeline:shadowJar': Caching has not 
been enabled for the task
Task ':beam-model-pipeline:shadowJar' is not up-to-date because:
  No history is available.

> Task :beam-sdks-java-core:shadowJar
Build cache key for task ':beam-sdks-java-core:shadowJar' is 
46722401596192df1dc876397bac9469
Caching disabled for task ':beam-sdks-java-core:shadowJar': Caching has not 
been enabled for the task
Task ':beam-sdks-java-core:shadowJar' is not up-to-date because:
  No history is available.
***
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 6.128s [6128ms]
Average Time/Jar: 1.02133s [1021.33ms]
***
:beam-sdks-java-core:shadowJar (Thread[Task worker for ':' Thread 6,5,main]) 
completed. Took 8.87 secs.
:beam-sdks-java-extensions-google-cloud-platform-core:compileJava (Thread[Task 
worker for ':' Thread 6,5,main]) started.
:beam-sdks-java-core:generateTestAvroProtocol (Thread[Task worker for ':' 
Thread 95,5,main]) started.
:beam-sdks-java-extensions-protobuf:extractIncludeProto (Thread[Task worker for 
':' Thread 4,5,main]) started.

> Task :beam-sdks-java-core:generateTestAvroProtocol NO-SOURCE
Skipping task ':beam-sdks-java-core:generateTestAvroProtocol' as it has no 
source files and no previous output files.
:beam-sdks-java-core:generateTestAvroProtocol (Thread[Task worker for ':' 
Thread 95,5,main]) completed. Took 0.003 secs.
:beam-sdks-java-core:generateTestAvroJava (Thread[Task worker for ':' Thread 
95,5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:extractIncludeProto
Build cache key for task 
':beam-sdks-java-extensions-protobuf:extractIncludeProto' is 
edee50e5587240849be1305aaacdd5bc
Caching disabled for task 
':beam-sdks-java-extensions-protobuf:extractIncludeProto': Caching has not been 
enabled for the task
Task ':beam-sdks-java-extensions-protobuf:extractIncludeProto' is not 
up-to-date because:
  No history is available.
:beam-sdks-java-extensions-protobuf:extractIncludeProto (Thread[Task worker for 
':' Thread 4,5,main]) completed. Took 0.272 secs.
:beam-sdks-java-extensions-protobuf:generateProto (Thread[Task worker for ':' 
Thread 4,5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:generateProto NO-SOURCE
file or directory 
'
 not found
Skipping task ':beam-sdks-java-extensions-protobuf:generateProto' as it has no 
source files and no previous output files.
:beam-sdks-java-extensions-protobuf:generateProto (Thread[Task worker for ':' 
Thread 4,5,main]) completed. Took 0.001 secs.
:beam-sdks-java-extensions-protobuf:compileJava (Thread[Task worker for ':' 
Thread 4,5,main]) started.

> Task :beam-sdks-java-core:generateTestAvroJava
Build cache key for task ':beam-sdks-java-core:generateTestAvroJava' is 
e485090fed57ed9406a491ee86e365cc
Caching disabled for task ':beam-sdks-java-core:generateTestAvroJava': Caching 
has not been enabled for the task
Task ':beam-sdks-java-core:generateTestAvroJava' is not up-to-date because:
  No history is available.
Found 1 files
Processed src/test/avro/org/apache/beam/sdk/io/user.avsc
:beam-sdks-java-core:generateTestAvroJava (Thread[Task worker for ':' Thread 
95,5,main]) completed. Took 0.634 secs.
:beam-sdks-java-core:compileTestJava (Thread[Task worker for ':' Thread 
95,5,main]) started.

> Task :beam-model-pipeline:shadowJar
***
GRADLE SHADOW STATS

Total Jars: 36 (includes project)
Total Time: 8.747s [8747ms]
Average Time/Jar: 0.24297s [242.97ms]
***
:beam-model-pipeline:shadowJar (Thread[Task worker for ':' Thread 27,5,main]) 
completed. Took 10.51 secs.
:beam-model-job-management:shadowJar (Thread[Task worker for ':' Thread 
27,5,main]) started.

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava
Build cache key for task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is 
1c22a284a50bcac8123ab9ab74bedce7
Task ':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is not 
up-to-date because:
  No history is available.
Custom actions are attached to task 
':beam-sdks-java-extensions-google-clo

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Samza_Gradle #876

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Make it possible to run unskip Py3 tests by setting an environment

--
[...truncated 77.02 KB...]
Average Time/Jar: 0.22683s [226.83ms]
***
:beam-model-pipeline:shadowJar (Thread[Task worker for ':',5,main]) completed. 
Took 9.823 secs.
:beam-model-fn-execution:shadowJar (Thread[Task worker for ':',5,main]) started.
:beam-model-job-management:shadowJar (Thread[Task worker for ':' Thread 
7,5,main]) started.

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava
Build cache key for task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is 
bb7fa6c45b12493717b7172ff5e0d189
Task ':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is not 
up-to-date because:
  No history is available.
Custom actions are attached to task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava'.
All input files are considered out-of-date for incremental task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
Compiling with error-prone compiler
:542:
 warning: [EqualsGetClass] Overriding Object#equals in a non-final class by 
using getClass rather than instanceof breaks substitutability of subclasses.
  public boolean equals(Object o) {
 ^
(see https://errorprone.info/bugpattern/EqualsGetClass)
  Did you mean 'if (!(o instanceof GcsPath)) {'?
error: warnings found and -Werror specified
Note: 

 uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
1 error
1 warning
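The build failure above comes from the error-prone `EqualsGetClass` check: overriding `Object#equals` with a `getClass()` comparison breaks substitutability for subclasses, and `-Werror` turns the warning into a compile error. A minimal sketch of the `instanceof`-based form that error-prone suggests — the class and field names here are illustrative, not Beam's actual `GcsPath` code:

```java
import java.util.Objects;

// Illustrative class, not Beam's GcsPath: equals() accepts any instance
// of the type (including subclasses), preserving substitutability.
public class Path {
    private final String value;

    public Path(String value) {
        this.value = value;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof Path)) {   // instanceof instead of getClass() comparison
            return false;
        }
        Path other = (Path) o;
        return Objects.equals(value, other.value);
    }

    @Override
    public int hashCode() {
        return Objects.hash(value);   // keep hashCode consistent with equals
    }
}
```

With this form, a subclass instance can still compare equal to a base-class instance holding the same value, which is what the bug pattern is about.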

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava FAILED
:beam-sdks-java-extensions-google-cloud-platform-core:compileJava (Thread[Task 
worker for ':' Thread 9,5,main]) completed. Took 9.092 secs.

> Task :beam-vendor-sdks-java-extensions-protobuf:compileJava
file or directory 
'
 not found
Build cache key for task 
':beam-vendor-sdks-java-extensions-protobuf:compileJava' is 
1b38d2d9c0b04c36a73d3ad0a3b867da
Task ':beam-vendor-sdks-java-extensions-protobuf:compileJava' is not up-to-date 
because:
  No history is available.
Custom actions are attached to task 
':beam-vendor-sdks-java-extensions-protobuf:compileJava'.
All input files are considered out-of-date for incremental task 
':beam-vendor-sdks-java-extensions-protobuf:compileJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
file or directory 
'
 not found
Compiling with error-prone compiler
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :beam-sdks-java-core:compileTestJava FAILED
Build cache key for task ':beam-sdks-java-core:compileTestJava' is 
30cc2373eb5e6eeb8184669871d79fa2
Task ':beam-sdks-java-core:compileTestJava' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':beam-sdks-java-core:compileTestJava'.
All input files are considered out-of-date for incremental task 
':beam-sdks-java-core:compileTestJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
Compiling with error-prone compiler
:105:
 warning: [EqualsGetClass] Overriding Object#equals in a non-final class by 
using getClass rather than instanceof breaks substitutability of subclasses.
public boolean equals(Object o) {
   ^
(see https://errorprone.info/bugpattern/EqualsGetClass)
  Did you mean 'if (!(o instanceof Pojo)) {'?
:288:
 warning: [UnnecessaryParentheses] Unnecessary use of groupin

[jira] [Work logged] (BEAM-1251) Python 3 Support

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-1251?focusedWorklogId=153408&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153408
 ]

ASF GitHub Bot logged work on BEAM-1251:


Author: ASF GitHub Bot
Created on: 11/Oct/18 05:40
Start Date: 11/Oct/18 05:40
Worklog Time Spent: 10m 
  Work Description: manuzhang closed pull request #6650: [BEAM-1251] Make 
it possible to unskip Py3 tests by setting an environment variable.
URL: https://github.com/apache/beam/pull/6650
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/sdks/python/apache_beam/pipeline_test.py 
b/sdks/python/apache_beam/pipeline_test.py
index 6397820e6e9..41f749d2f79 100644
--- a/sdks/python/apache_beam/pipeline_test.py
+++ b/sdks/python/apache_beam/pipeline_test.py
@@ -21,6 +21,7 @@
 
 import copy
 import logging
+import os
 import platform
 import sys
 import unittest
@@ -391,8 +392,9 @@ def process(self, element):
 assert_that(pcoll, equal_to([11, 12]))
 pipeline.run()
 
-  @unittest.skipIf(sys.version_info[0] == 3, 'This test still needs to be '
- 'fixed on Python 3')
+  @unittest.skipIf(sys.version_info[0] == 3 and
+   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
+   'This test still needs to be fixed on Python 3.')
   def test_side_input_no_tag(self):
 class TestDoFn(DoFn):
   def process(self, element, prefix, suffix):
@@ -408,8 +410,9 @@ def process(self, element, prefix, suffix):
 assert_that(result, equal_to(['zyx-%s-xyz' % x for x in words_list]))
 pipeline.run()
 
-  @unittest.skipIf(sys.version_info[0] == 3, 'This test still needs to be '
- 'fixed on Python 3')
+  @unittest.skipIf(sys.version_info[0] == 3 and
+   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
+   'This test still needs to be fixed on Python 3.')
   def test_side_input_tagged(self):
 class TestDoFn(DoFn):
   def process(self, element, prefix, suffix=DoFn.SideInputParam):
diff --git 
a/sdks/python/apache_beam/runners/interactive/interactive_runner_test.py 
b/sdks/python/apache_beam/runners/interactive/interactive_runner_test.py
index b21135c8717..767e06e527f 100644
--- a/sdks/python/apache_beam/runners/interactive/interactive_runner_test.py
+++ b/sdks/python/apache_beam/runners/interactive/interactive_runner_test.py
@@ -24,6 +24,7 @@
 from __future__ import division
 from __future__ import print_function
 
+import os
 import sys
 import unittest
 
@@ -43,8 +44,9 @@ def printer(elem):
 
 class InteractiveRunnerTest(unittest.TestCase):
 
-  @unittest.skipIf(sys.version_info[0] == 3, 'This test still needs to be '
- 'fixed on Python 3')
+  @unittest.skipIf(sys.version_info[0] == 3 and
+   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
+   'This test still needs to be fixed on Python 3.')
   def test_basic(self):
 p = beam.Pipeline(
 runner=interactive_runner.InteractiveRunner(
@@ -60,8 +62,9 @@ def test_basic(self):
 _ = pc0 | 'Print3' >> beam.Map(print_with_message('Run3'))
 p.run().wait_until_finish()
 
-  @unittest.skipIf(sys.version_info[0] == 3, 'This test still needs to be '
- 'fixed on Python 3')
+  @unittest.skipIf(sys.version_info[0] == 3 and
+   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
+   'This test still needs to be fixed on Python 3.')
   def test_wordcount(self):
 
 class WordExtractingDoFn(beam.DoFn):
diff --git 
a/sdks/python/apache_beam/runners/interactive/pipeline_analyzer_test.py 
b/sdks/python/apache_beam/runners/interactive/pipeline_analyzer_test.py
index f2d82cd3288..caefbe04ba6 100644
--- a/sdks/python/apache_beam/runners/interactive/pipeline_analyzer_test.py
+++ b/sdks/python/apache_beam/runners/interactive/pipeline_analyzer_test.py
@@ -24,6 +24,7 @@
 from __future__ import division
 from __future__ import print_function
 
+import os
 import sys
 import unittest
 
@@ -87,8 +88,9 @@ def assertTransformEqual(self, pipeline_proto1, transform_id1,
 self.assertSetEqual(set(transform_proto1.outputs),
 set(transform_proto2.outputs))
 
-  @unittest.skipIf(sys.version_info[0] == 3, 'This test still needs to be '
- 'fixed on Python 3')
+  @unittest.skipIf(sys.version_info[0] == 3 and
+   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
+   'This test still needs to be fixed on Python 3.')
   def test_basic(self):
 p = beam.Pipeli

[beam] 01/01: Merge pull request #6650: [BEAM-1251] Make it possible to unskip Py3 tests by setting an environment variable.

2018-10-10 Thread manuzhang
This is an automated email from the ASF dual-hosted git repository.

manuzhang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 3389b47f507ce0977579c673108ab553773b0aba
Merge: 9084dbb 4cc48f8
Author: Manu Zhang 
AuthorDate: Thu Oct 11 13:40:47 2018 +0800

Merge pull request #6650: [BEAM-1251] Make it possible to unskip Py3 tests 
by setting an environment variable.

 sdks/python/apache_beam/pipeline_test.py   | 11 ---
 .../runners/interactive/interactive_runner_test.py | 11 ---
 .../runners/interactive/pipeline_analyzer_test.py  | 16 ++
 .../runners/portability/fn_api_runner_test.py  | 35 +-
 .../apache_beam/runners/worker/opcounters_test.py  |  6 ++--
 .../typehints/native_type_compatibility_test.py| 11 ---
 .../typehints/trivial_inference_test.py|  6 ++--
 .../apache_beam/typehints/typed_pipeline_test.py   | 21 -
 8 files changed, 73 insertions(+), 44 deletions(-)
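The merge summarized above replaces unconditional Python 3 skips with skips gated on an environment variable. A minimal standalone sketch of that pattern — the `RUN_SKIPPED_PY3_TESTS` variable matches the diff, but the helper decorator name is my own, not part of the Beam change:

```python
import os
import sys
import unittest


def skip_on_py3_unless_enabled(reason='This test still needs to be fixed on Python 3.'):
    """Skip on Python 3 unless RUN_SKIPPED_PY3_TESTS=1 is set.

    Hypothetical helper wrapping the skipIf condition used throughout the diff.
    """
    return unittest.skipIf(
        sys.version_info[0] == 3 and
        os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
        reason)


class ExampleTest(unittest.TestCase):

    @skip_on_py3_unless_enabled()
    def test_pending_py3_fix(self):
        # Runs on Python 2, or on Python 3 when RUN_SKIPPED_PY3_TESTS=1.
        self.assertEqual(1 + 1, 2)
```

Exporting `RUN_SKIPPED_PY3_TESTS=1` before running the suite un-skips all tests gated this way, which is how the change lets contributors exercise not-yet-fixed Python 3 tests locally.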



[beam] branch master updated (9084dbb -> 3389b47)

2018-10-10 Thread manuzhang
This is an automated email from the ASF dual-hosted git repository.

manuzhang pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 9084dbb  Merge pull request #6628: [BEAM-5626] Unskip 
hadoopfilesystem_test which is already passing in Python 3.
 add 4cc48f8  Make it possible to run unskip Py3 tests by setting an 
environment variable.
 new 3389b47  Merge pull request #6650: [BEAM-1251] Make it possible to 
unskip Py3 tests by setting an environment variable.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 sdks/python/apache_beam/pipeline_test.py   | 11 ---
 .../runners/interactive/interactive_runner_test.py | 11 ---
 .../runners/interactive/pipeline_analyzer_test.py  | 16 ++
 .../runners/portability/fn_api_runner_test.py  | 35 +-
 .../apache_beam/runners/worker/opcounters_test.py  |  6 ++--
 .../typehints/native_type_compatibility_test.py| 11 ---
 .../typehints/trivial_inference_test.py|  6 ++--
 .../apache_beam/typehints/typed_pipeline_test.py   | 21 -
 8 files changed, 73 insertions(+), 44 deletions(-)



[jira] [Work logged] (BEAM-1251) Python 3 Support

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-1251?focusedWorklogId=153407&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153407
 ]

ASF GitHub Bot logged work on BEAM-1251:


Author: ASF GitHub Bot
Created on: 11/Oct/18 05:40
Start Date: 11/Oct/18 05:40
Worklog Time Spent: 10m 
  Work Description: manuzhang commented on issue #6650: [BEAM-1251] Make it 
possible to unskip Py3 tests by setting an environment variable.
URL: https://github.com/apache/beam/pull/6650#issuecomment-428827423
 
 
   LGTM


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 153407)
Time Spent: 21h 50m  (was: 21h 40m)

> Python 3 Support
> 
>
> Key: BEAM-1251
> URL: https://issues.apache.org/jira/browse/BEAM-1251
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Eyad Sibai
>Assignee: Robbe
>Priority: Major
>  Time Spent: 21h 50m
>  Remaining Estimate: 0h
>
> I have been trying to use google datalab with python3. As I see, there are 
> several packages that google datalab depends on which do not support python3 
> yet. This is one of them.
> https://github.com/GoogleCloudPlatform/DataflowPythonSDK/issues/6



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-5663) Add tox suites for various Python 3 versions

2018-10-10 Thread Manu Zhang (JIRA)


[ 
https://issues.apache.org/jira/browse/BEAM-5663?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16645989#comment-16645989
 ] 

Manu Zhang commented on BEAM-5663:
--

Does anyone know how to set up the Python version on Jenkins?

> Add tox suites for various Python 3 versions
> 
>
> Key: BEAM-5663
> URL: https://issues.apache.org/jira/browse/BEAM-5663
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Manu Zhang
>Priority: Major
>
> Currently, Python 3.5.2 is set up for Jenkins tests, but we've seen test 
> failures across various Python 3 versions. It would be valuable to add tox 
> suites for Python 3.4, 3.5, 3.6, and 3.7.
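
A minimal `tox.ini` sketch of what such per-version suites might look like — the environment names and test command are assumptions for illustration, not Beam's actual configuration:

```ini
[tox]
envlist = py34,py35,py36,py37

[testenv]
; Each pyXY env runs the suite under that interpreter if it is installed.
deps = -rrequirements.txt
commands = python setup.py test
```

On Jenkins this would additionally require each interpreter (3.4 through 3.7) to be available on the build agents, which is the open question in the comment above.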





Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle #1239

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

--
[...truncated 59.99 KB...]
:beam-sdks-java-core:shadowJar (Thread[Task worker for ':' Thread 2,5,main]) 
started.

> Task :beam-model-pipeline:shadowJar
Build cache key for task ':beam-model-pipeline:shadowJar' is 
63b267f240be79a047f99b9e61ceb4be
Caching disabled for task ':beam-model-pipeline:shadowJar': Caching has not 
been enabled for the task
Task ':beam-model-pipeline:shadowJar' is not up-to-date because:
  No history is available.

> Task :beam-sdks-java-core:shadowJar
Build cache key for task ':beam-sdks-java-core:shadowJar' is 
1a67ef62147c98550a79890c9a7e3f28
Caching disabled for task ':beam-sdks-java-core:shadowJar': Caching has not 
been enabled for the task
Task ':beam-sdks-java-core:shadowJar' is not up-to-date because:
  No history is available.
***
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 6.165s [6165ms]
Average Time/Jar: 1.0275s [1027.5ms]
***
:beam-sdks-java-core:shadowJar (Thread[Task worker for ':' Thread 2,5,main]) 
completed. Took 8.91 secs.
:beam-sdks-java-extensions-google-cloud-platform-core:compileJava (Thread[Task 
worker for ':' Thread 2,5,main]) started.
:beam-sdks-java-extensions-protobuf:extractIncludeProto (Thread[Task worker for 
':' Thread 59,5,main]) started.
:beam-sdks-java-core:generateTestAvroProtocol (Thread[Task worker for ':' 
Thread 41,5,main]) started.

> Task :beam-sdks-java-core:generateTestAvroProtocol NO-SOURCE
Skipping task ':beam-sdks-java-core:generateTestAvroProtocol' as it has no 
source files and no previous output files.
:beam-sdks-java-core:generateTestAvroProtocol (Thread[Task worker for ':' 
Thread 41,5,main]) completed. Took 0.005 secs.
:beam-sdks-java-core:generateTestAvroJava (Thread[Task worker for ':' Thread 
41,5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:extractIncludeProto
Build cache key for task 
':beam-sdks-java-extensions-protobuf:extractIncludeProto' is 
013af3af620281b2a1a4aec0fc7fd816
Caching disabled for task 
':beam-sdks-java-extensions-protobuf:extractIncludeProto': Caching has not been 
enabled for the task
Task ':beam-sdks-java-extensions-protobuf:extractIncludeProto' is not 
up-to-date because:
  No history is available.
:beam-sdks-java-extensions-protobuf:extractIncludeProto (Thread[Task worker for 
':' Thread 59,5,main]) completed. Took 0.27 secs.
:beam-sdks-java-extensions-protobuf:generateProto (Thread[Task worker for ':' 
Thread 59,5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:generateProto NO-SOURCE
file or directory 
'
 not found
Skipping task ':beam-sdks-java-extensions-protobuf:generateProto' as it has no 
source files and no previous output files.
:beam-sdks-java-extensions-protobuf:generateProto (Thread[Task worker for ':' 
Thread 59,5,main]) completed. Took 0.001 secs.
:beam-sdks-java-extensions-protobuf:compileJava (Thread[Task worker for ':' 
Thread 26,5,main]) started.

> Task :beam-sdks-java-core:generateTestAvroJava
Build cache key for task ':beam-sdks-java-core:generateTestAvroJava' is 
e485090fed57ed9406a491ee86e365cc
Caching disabled for task ':beam-sdks-java-core:generateTestAvroJava': Caching 
has not been enabled for the task
Task ':beam-sdks-java-core:generateTestAvroJava' is not up-to-date because:
  No history is available.
Found 1 files
Processed src/test/avro/org/apache/beam/sdk/io/user.avsc
:beam-sdks-java-core:generateTestAvroJava (Thread[Task worker for ':' Thread 
41,5,main]) completed. Took 0.67 secs.
:beam-sdks-java-core:compileTestJava (Thread[Task worker for ':' Thread 
41,5,main]) started.

> Task :beam-model-pipeline:shadowJar
***
GRADLE SHADOW STATS

Total Jars: 36 (includes project)
Total Time: 8.862s [8862ms]
Average Time/Jar: 0.24617s [246.17ms]
***
:beam-model-pipeline:shadowJar (Thread[Task worker for ':' Thread 104,5,main]) 
completed. Took 10.634 secs.
:beam-model-job-management:shadowJar (Thread[Task worker for ':' Thread 
104,5,main]) started.

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava
Build cache key for task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is 
8e92f81232d5aaa444d57cbb9ca59ba4
Task ':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is not 
up-to-date because:
  No history is available.
Custom actions are attached to task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava'.
All input files are considered out-of-date for incremental ta

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle #1238

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

--
[...truncated 19.18 MB...]
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create123/Read(CreateSource) as step s10
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding OutputSideInputs as step s11
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/Window.Into()/Window.Assign as step 
s12
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous) as step 
s13
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map 
as step s14
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign as step 
s15
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey as step 
s16
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/Values/Values/Map as 
step s17
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/RewindowActuals/Window.Assign as step 
s18
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/KeyForDummy/AddKeys/Map as step s19
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/RemoveActualsTriggering/Flatten.PCollections as step 
s20
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/Create.Values/Read(CreateSource) as 
step s21
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/WindowIntoDummy/Window.Assign as step 
s22
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/RemoveDummyTriggering/Flatten.PCollections as step s23
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/FlattenDummyAndContents as step s24
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/NeverTrigger/Flatten.PCollections as 
step s25
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GroupDummyAndContents as step s26
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/Values/Values/Map as step s27
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/ParDo(Concat) as step s28
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GetPane/Map as step s29
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/RunChecks as step s30
Oct 11, 2018 4:58:42 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/VerifyAssertions/ParDo(DefaultConclude) as step s31
Oct 11, 2018 4:58:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-validates-runner-tests//viewtest0testsingletonsideinput-jenkins-1011045836-6195c330/output/results/staging/
Oct 11, 2018 4:58:42 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <70577 bytes, hash 2xTMrYn-QRHxcVqHFaWRJg> to 
gs://temp-storage-for-validates-runner-tests//viewtest0testsingletonsideinput-jenkins-1011045836-6195c330/output/results/staging/pipeline-2xTMrYn-QRHxcVqHFaWRJg.pb

org.apac

[jira] [Work logged] (BEAM-5467) Python Flink ValidatesRunner job fixes

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5467?focusedWorklogId=153405&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153405
 ]

ASF GitHub Bot logged work on BEAM-5467:


Author: ASF GitHub Bot
Created on: 11/Oct/18 04:56
Start Date: 11/Oct/18 04:56
Worklog Time Spent: 10m 
  Work Description: tweise commented on issue #6532: [BEAM-5467] Use 
process SDKHarness to run flink PVR tests.
URL: https://github.com/apache/beam/pull/6532#issuecomment-428820643
 
 
   @angoenka @mxm the relevant JIRA for follow-up work / custom environments is 
https://issues.apache.org/jira/browse/BEAM-4819 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 153405)
Time Spent: 9h 10m  (was: 9h)

> Python Flink ValidatesRunner job fixes
> --
>
> Key: BEAM-5467
> URL: https://issues.apache.org/jira/browse/BEAM-5467
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-flink
>Reporter: Thomas Weise
>Assignee: Thomas Weise
>Priority: Minor
>  Labels: portability-flink
>  Time Spent: 9h 10m
>  Remaining Estimate: 0h
>
> Add status to README
> Rename script and job for consistency
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (BEAM-4819) Make portable Flink runner JobBundleFactory configurable

2018-10-10 Thread Thomas Weise (JIRA)


[ 
https://issues.apache.org/jira/browse/BEAM-4819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16645975#comment-16645975
 ] 

Thomas Weise commented on BEAM-4819:


A good solution might be AutoService-annotated Environment providers that can 
be referenced in pipeline options with a custom environment type.
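As a rough sketch of the proposed design (the interface, class names, the 
"PROCESS" type string, and the registry helper below are illustrative 
stand-ins, not actual Beam APIs):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

// Illustrative sketch only: none of these names are real Beam APIs.
interface EnvironmentProvider {
    String environmentType();                    // matched against a pipeline option
    String createEnvironmentSpec(String config); // stand-in for building an Environment
}

// Under the proposal, this class would carry
// @com.google.auto.service.AutoService(EnvironmentProvider.class), which
// generates the META-INF/services entry that java.util.ServiceLoader reads.
class ProcessEnvironmentProvider implements EnvironmentProvider {
    @Override
    public String environmentType() {
        return "PROCESS";
    }

    @Override
    public String createEnvironmentSpec(String config) {
        return "process:" + config;
    }
}

public class EnvironmentProviders {
    // Resolve a provider by environment type. A real implementation would
    // iterate java.util.ServiceLoader.load(EnvironmentProvider.class)
    // instead of an explicit registry list.
    static Optional<EnvironmentProvider> forType(String type, List<EnvironmentProvider> registry) {
        return registry.stream()
            .filter(p -> p.environmentType().equals(type))
            .findFirst();
    }

    public static void main(String[] args) {
        List<EnvironmentProvider> registry =
            Arrays.asList(new ProcessEnvironmentProvider());
        System.out.println(
            forType("PROCESS", registry).get().createEnvironmentSpec("/opt/harness"));
    }
}
```

The point of the AutoService annotation is that users could register custom 
environment providers on the classpath without the runner knowing about them 
at compile time.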

 

> Make portable Flink runner JobBundleFactory configurable
> 
>
> Key: BEAM-4819
> URL: https://issues.apache.org/jira/browse/BEAM-4819
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-flink
>Reporter: Thomas Weise
>Priority: Major
>  Labels: portability, portability-flink
>
> BEAM-4791 introduces factory override for testing, expand that to allow users 
> to configure a different factory via service loader to adopt alternative 
> execution environments.





[jira] [Commented] (BEAM-5663) Add tox suites for various Python 3 versions

2018-10-10 Thread Valentyn Tymofieiev (JIRA)


[ 
https://issues.apache.org/jira/browse/BEAM-5663?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16645955#comment-16645955
 ] 

Valentyn Tymofieiev commented on BEAM-5663:
---

Another interesting question is whether we can easily make these suites run in 
parallel so that we don't slow down precommits too much.
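The per-version suites discussed here could be declared as separate tox 
environments; the env list and test command below are a hypothetical sketch, 
not Beam's actual configuration:

```ini
; Hypothetical tox.ini sketch, not Beam's actual configuration.
[tox]
envlist = py27,py34,py35,py36,py37

[testenv]
commands = python -m pytest apache_beam
```

On the parallelism concern, detox (or the parallel mode added in later tox 
releases) can run these environments concurrently instead of serially.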


> Add tox suites for various Python 3 versions
> 
>
> Key: BEAM-5663
> URL: https://issues.apache.org/jira/browse/BEAM-5663
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Manu Zhang
>Priority: Major
>
> Currently, Python 3.5.2 is set up for Jenkins tests, but we've seen test 
> failures across various Python 3 versions. It would be valuable to add tox 
> suites for Python 3.4, 3.5, 3.6 and 3.7





Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1662

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[yifanzou] [BEAM-5700] remove the extra licenses from python bigquery IT

[github] [BEAM-5681] Fix website tasks when pull-request ID is specified

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

--
[...truncated 50.68 MB...]
at io.grpc.Status.asRuntimeException(Status.java:526)
at 
io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:468)
at 
io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at 
io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at 
io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at 
com.google.cloud.spanner.spi.v1.SpannerErrorInterceptor$1$1.onClose(SpannerErrorInterceptor.java:100)
at 
io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at 
io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at 
io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at 
com.google.cloud.spanner.spi.v1.WatchdogInterceptor$MonitoredCall$1.onClose(WatchdogInterceptor.java:190)
at 
io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at 
io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at 
io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at 
io.grpc.internal.CensusStatsModule$StatsClientInterceptor$1$1.onClose(CensusStatsModule.java:684)
at 
io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at 
io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at 
io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:403)
at 
io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:459)
at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:63)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:546)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$600(ClientCallImpl.java:467)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:584)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at 
io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
... 3 more

Oct 11, 2018 4:13:23 AM 
org.apache.beam.sdk.io.gcp.spanner.SpannerIO$WriteToSpannerFn processElement
WARNING: Failed to submit the mutation group
com.google.cloud.spanner.SpannerException: FAILED_PRECONDITION: 
io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Value must not be NULL in 
table users.
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerExceptionPreformatted(SpannerExceptionFactory.java:119)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:43)
at 
com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:80)
at 
com.google.cloud.spanner.spi.v1.GrpcSpannerRpc.get(GrpcSpannerRpc.java:456)
at 
com.google.cloud.spanner.spi.v1.GrpcSpannerRpc.commit(GrpcSpannerRpc.java:404)
at 
com.google.cloud.spanner.SpannerImpl$SessionImpl$2.call(SpannerImpl.java:797)
at 
com.google.cloud.spanner.SpannerImpl$SessionImpl$2.call(SpannerImpl.java:794)
at 
com.google.cloud.spanner.SpannerImpl.runWithRetries(SpannerImpl.java:227)
at 
com.google.cloud.spanner.SpannerImpl$SessionImpl.writeAtLeastOnce(SpannerImpl.java:793)
at 
com.google.cloud.spanner.SessionPool$PooledSession.writeAtLeastOnce(SessionPool.java:319)
at 
com.google.cloud.spanner.DatabaseClientImpl.writeAtLeastOnce(DatabaseClientImpl.java:60)
at 
org.apache.beam.sdk.io.gcp.spanner.SpannerIO$WriteToSpannerFn.processElement(SpannerIO.java:1108)
at 
org.apache.beam.sdk.io.gcp.spanner.SpannerIO$WriteToSpannerFn$DoFnInvoker.invokeProcessElement(Unknown
 Source)
at 
org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
at 
org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
at 
org.apache.beam.repackaged.beam_runners_dir

[jira] [Work logged] (BEAM-5623) Several IO tests hang indefinitely during execution on Python 3.

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5623?focusedWorklogId=153403&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153403
 ]

ASF GitHub Bot logged work on BEAM-5623:


Author: ASF GitHub Bot
Created on: 11/Oct/18 04:12
Start Date: 11/Oct/18 04:12
Worklog Time Spent: 10m 
  Work Description: tvalentyn edited a comment on issue #6648: [BEAM-5623] 
Skip tests that halt test suite execution on Python 3
URL: https://github.com/apache/beam/pull/6648#issuecomment-428773705
 
 
   cc: @aaltay @Fematich @Juta @manuzhang @splovyt




Issue Time Tracking
---

Worklog Id: (was: 153403)
Time Spent: 40m  (was: 0.5h)

> Several IO tests hang indefinitely during execution on Python 3.
> 
>
> Key: BEAM-5623
> URL: https://issues.apache.org/jira/browse/BEAM-5623
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Valentyn Tymofieiev
>Priority: Major
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> test_read_empty_single_file_no_eol_gzip 
> (apache_beam.io.textio_test.TextSourceTest) 
> Also several test cases in tfrecordio_test, for example:
> test_process_auto (apache_beam.io.tfrecordio_test.TestReadAllFromTFRecord)





[jira] [Work logged] (BEAM-5626) Several IO tests fail in Python 3 with RuntimeError('dictionary changed size during iteration',)}

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5626?focusedWorklogId=153402&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153402
 ]

ASF GitHub Bot logged work on BEAM-5626:


Author: ASF GitHub Bot
Created on: 11/Oct/18 04:10
Start Date: 11/Oct/18 04:10
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on issue #6628: [BEAM-5626] Unskip 
hadoopfilesystem_test which is already passing in Python 3.
URL: https://github.com/apache/beam/pull/6628#issuecomment-428814157
 
 
   Thanks for review & merge, @manuzhang!




Issue Time Tracking
---

Worklog Id: (was: 153402)
Time Spent: 5h 20m  (was: 5h 10m)

> Several IO tests fail in Python 3 with RuntimeError('dictionary changed size 
> during iteration',)}
> -
>
> Key: BEAM-5626
> URL: https://issues.apache.org/jira/browse/BEAM-5626
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Valentyn Tymofieiev
>Assignee: Ruoyun Huang
>Priority: Major
> Fix For: 2.8.0
>
>  Time Spent: 5h 20m
>  Remaining Estimate: 0h
>
>  ERROR: test_delete_dir 
> (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest)
> --
> Traceback (most recent call last):
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem_test.py",
>  line 506, in test_delete_dir
>  self.fs.delete([url_t1])
>File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem.py",
>  line 370, in delete
>  raise BeamIOError("Delete operation failed", exceptions)
>  apache_beam.io.filesystem.BeamIOError: Delete operation failed with 
> exceptions {'hdfs://test_dir/new_dir1': RuntimeError('dictionary changed size 
> during iteration',   )}
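The RuntimeError above is generic Python 3 behavior, reproducible outside 
Beam; this sketch is illustrative and is not Beam's actual hadoopfilesystem 
code:

```python
def delete_all(entries):
    # Worked on Python 2, where keys() returned a list copy; on Python 3
    # keys() is a live view, so deleting during iteration raises
    # RuntimeError('dictionary changed size during iteration').
    for name in entries.keys():
        del entries[name]

def delete_all_fixed(entries):
    # Safe on both versions: iterate over a snapshot of the keys.
    for name in list(entries):
        del entries[name]
```

The usual fix, as above, is to materialize the keys with `list(...)` before 
mutating the dict.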





[jira] [Work logged] (BEAM-1251) Python 3 Support

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-1251?focusedWorklogId=153401&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153401
 ]

ASF GitHub Bot logged work on BEAM-1251:


Author: ASF GitHub Bot
Created on: 11/Oct/18 04:06
Start Date: 11/Oct/18 04:06
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on issue #6650: [BEAM-1251] Make it 
possible to unskip Py3 tests by setting an environment variable.
URL: https://github.com/apache/beam/pull/6650#issuecomment-428813596
 
 
   cc: @manuzhang @Juta @Fematich @splovyt  




Issue Time Tracking
---

Worklog Id: (was: 153401)
Time Spent: 21h 40m  (was: 21.5h)

> Python 3 Support
> 
>
> Key: BEAM-1251
> URL: https://issues.apache.org/jira/browse/BEAM-1251
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-py-core
>Reporter: Eyad Sibai
>Assignee: Robbe
>Priority: Major
>  Time Spent: 21h 40m
>  Remaining Estimate: 0h
>
> I have been trying to use google datalab with Python 3. As far as I can 
> tell, there are several packages that google datalab depends on that do not 
> support Python 3 yet. This is one of them.
> https://github.com/GoogleCloudPlatform/DataflowPythonSDK/issues/6
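The pattern in the PR title above ("unskip Py3 tests by setting an environment 
variable") can be sketched like this; the variable name RUN_SKIPPED_PY3_TESTS 
and the test itself are hypothetical, not the names used in the actual Beam 
change:

```python
import os
import sys
import unittest

# Hypothetical opt-in switch: when set to '1', known-failing Python 3
# tests run instead of being skipped.
RUN_SKIPPED_PY3_TESTS = os.environ.get('RUN_SKIPPED_PY3_TESTS') == '1'

class TextIOTest(unittest.TestCase):

    @unittest.skipIf(sys.version_info[0] == 3 and not RUN_SKIPPED_PY3_TESTS,
                     'Known Python 3 failure; set RUN_SKIPPED_PY3_TESTS=1 to run.')
    def test_read(self):
        self.assertEqual('a\nb'.splitlines(), ['a', 'b'])
```

This lets contributors exercise the skipped tests locally while keeping the 
default CI runs green.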





[jira] [Assigned] (BEAM-5156) Apache Beam on dataflow runner can't find Tensorflow for workers

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-5156:
-

Assignee: Ankur Goenka  (was: Robert Bradshaw)

> Apache Beam on dataflow runner can't find Tensorflow for workers
> 
>
> Key: BEAM-5156
> URL: https://issues.apache.org/jira/browse/BEAM-5156
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
> Environment: google cloud compute instance running linux
>Reporter: Robert Bradshaw
>Assignee: Ankur Goenka
>Priority: Major
> Fix For: 2.5.0, 2.6.0
>
>
> Adding a serialized TensorFlow model to an Apache Beam pipeline with the 
> Python SDK, but it cannot find any version of TensorFlow when run on the 
> Dataflow runner, although this is not a problem locally. Tried various 
> versions of TensorFlow from 1.6 to 1.10. I thought it might be a conflicting 
> package somewhere, so I removed all other packages and tried to install just 
> TensorFlow, and hit the same problem.
> Could not find a version that satisfies the requirement tensorflow==1.6.0 
> (from -r reqtest.txt (line 59)) (from versions: )No matching distribution 
> found for tensorflow==1.6.0 (from -r reqtest.txt (line 59))





[jira] [Updated] (BEAM-5156) Apache Beam on dataflow runner can't find Tensorflow for workers

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-5156:
--
Reporter: Thomas Johns  (was: Robert Bradshaw)

> Apache Beam on dataflow runner can't find Tensorflow for workers
> 
>
> Key: BEAM-5156
> URL: https://issues.apache.org/jira/browse/BEAM-5156
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
> Environment: google cloud compute instance running linux
>Reporter: Thomas Johns
>Assignee: Ankur Goenka
>Priority: Major
> Fix For: 2.5.0, 2.6.0
>
>
> Adding a serialized TensorFlow model to an Apache Beam pipeline with the 
> Python SDK, but it cannot find any version of TensorFlow when run on the 
> Dataflow runner, although this is not a problem locally. Tried various 
> versions of TensorFlow from 1.6 to 1.10. I thought it might be a conflicting 
> package somewhere, so I removed all other packages and tried to install just 
> TensorFlow, and hit the same problem.
> Could not find a version that satisfies the requirement tensorflow==1.6.0 
> (from -r reqtest.txt (line 59)) (from versions: )No matching distribution 
> found for tensorflow==1.6.0 (from -r reqtest.txt (line 59))





[jira] [Assigned] (BEAM-5156) Apache Beam on dataflow runner can't find Tensorflow for workers

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-5156:
-

Assignee: Robert Bradshaw  (was: Kenneth Knowles)

> Apache Beam on dataflow runner can't find Tensorflow for workers
> 
>
> Key: BEAM-5156
> URL: https://issues.apache.org/jira/browse/BEAM-5156
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
> Environment: google cloud compute instance running linux
>Reporter: Robert Bradshaw
>Assignee: Robert Bradshaw
>Priority: Major
> Fix For: 2.5.0, 2.6.0
>
>
> Adding a serialized TensorFlow model to an Apache Beam pipeline with the 
> Python SDK, but it cannot find any version of TensorFlow when run on the 
> Dataflow runner, although this is not a problem locally. Tried various 
> versions of TensorFlow from 1.6 to 1.10. I thought it might be a conflicting 
> package somewhere, so I removed all other packages and tried to install just 
> TensorFlow, and hit the same problem.
> Could not find a version that satisfies the requirement tensorflow==1.6.0 
> (from -r reqtest.txt (line 59)) (from versions: )No matching distribution 
> found for tensorflow==1.6.0 (from -r reqtest.txt (line 59))





[jira] [Updated] (BEAM-5162) Document Metrics API for users

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5162?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-5162:
--
Priority: Major  (was: Minor)

> Document Metrics API for users
> --
>
> Key: BEAM-5162
> URL: https://issues.apache.org/jira/browse/BEAM-5162
> Project: Beam
>  Issue Type: Task
>  Components: sdk-java-core, website
>Reporter: Maximilian Michels
>Assignee: Kenneth Knowles
>Priority: Major
>
> The Metrics API is currently only documented in Beam's JavaDocs. 
> Complementary user documentation with examples would be desirable.





[jira] [Assigned] (BEAM-5336) Support for a Dask runner

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5336?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-5336:
-

Assignee: (was: Kenneth Knowles)

> Support for a Dask runner
> -
>
> Key: BEAM-5336
> URL: https://issues.apache.org/jira/browse/BEAM-5336
> Project: Beam
>  Issue Type: Wish
>  Components: runner-ideas
> Environment: Python
>Reporter: Georvic Tur
>Priority: Trivial
>
> Is support for a Dask runner currently under consideration?





[jira] [Commented] (BEAM-5336) Support for a Dask runner

2018-10-10 Thread Kenneth Knowles (JIRA)


[ 
https://issues.apache.org/jira/browse/BEAM-5336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16645941#comment-16645941
 ] 

Kenneth Knowles commented on BEAM-5336:
---

Contributions welcome :)

> Support for a Dask runner
> -
>
> Key: BEAM-5336
> URL: https://issues.apache.org/jira/browse/BEAM-5336
> Project: Beam
>  Issue Type: Wish
>  Components: runner-ideas
> Environment: Python
>Reporter: Georvic Tur
>Assignee: Kenneth Knowles
>Priority: Trivial
>
> Is support for a Dask runner currently under consideration?





[jira] [Commented] (BEAM-5162) Document Metrics API for users

2018-10-10 Thread Kenneth Knowles (JIRA)


[ 
https://issues.apache.org/jira/browse/BEAM-5162?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16645940#comment-16645940
 ] 

Kenneth Knowles commented on BEAM-5162:
---

I'm not sure of a good owner, but I bumped the priority.

> Document Metrics API for users
> --
>
> Key: BEAM-5162
> URL: https://issues.apache.org/jira/browse/BEAM-5162
> Project: Beam
>  Issue Type: Task
>  Components: sdk-java-core, website
>Reporter: Maximilian Michels
>Priority: Major
>
> The Metrics API is currently only documented in Beam's JavaDocs. 
> Complementary user documentation with examples would be desirable.





[jira] [Assigned] (BEAM-5162) Document Metrics API for users

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5162?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-5162:
-

Assignee: (was: Kenneth Knowles)

> Document Metrics API for users
> --
>
> Key: BEAM-5162
> URL: https://issues.apache.org/jira/browse/BEAM-5162
> Project: Beam
>  Issue Type: Task
>  Components: sdk-java-core, website
>Reporter: Maximilian Michels
>Priority: Major
>
> The Metrics API is currently only documented in Beam's JavaDocs. 
> Complementary user documentation with examples would be desirable.





[jira] [Updated] (BEAM-5098) Combine.Globally::asSingletonView clears side inputs

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5098?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-5098:
--
Priority: Critical  (was: Major)

> Combine.Globally::asSingletonView clears side inputs
> 
>
> Key: BEAM-5098
> URL: https://issues.apache.org/jira/browse/BEAM-5098
> Project: Beam
>  Issue Type: Bug
>  Components: beam-model
>Affects Versions: 2.5.0
>Reporter: Mike Pedersen
>Assignee: Kenneth Knowles
>Priority: Critical
>
> It seems like calling .asSingletonView on Combine.Globally clears all side 
> inputs. Take this code for example:
>  
> {code:java}
> public class Main {
>     public static void main(String[] args) {
>         PipelineOptions options = PipelineOptionsFactory.create();
>         Pipeline p = Pipeline.create(options);
>         PCollection<Integer> a = p.apply(Create.of(1, 2, 3));
>         PCollectionView<Integer> b =
>             p.apply(Create.of(10)).apply(View.asSingleton());
>         a.apply(Combine.globally(
>             new CombineWithContext.CombineFnWithContext<Integer, Integer, Integer>() {
>                 @Override
>                 public Integer createAccumulator(CombineWithContext.Context c) {
>                     return c.sideInput(b);
>                 }
> 
>                 @Override
>                 public Integer addInput(Integer accumulator, Integer input,
>                         CombineWithContext.Context c) {
>                     return accumulator + input;
>                 }
> 
>                 @Override
>                 public Integer mergeAccumulators(Iterable<Integer> accumulators,
>                         CombineWithContext.Context c) {
>                     int sum = 0;
>                     for (int i : accumulators) {
>                         sum += i;
>                     }
>                     return sum;
>                 }
> 
>                 @Override
>                 public Integer extractOutput(Integer accumulator,
>                         CombineWithContext.Context c) {
>                     return accumulator;
>                 }
> 
>                 @Override
>                 public Integer defaultValue() {
>                     return 0;
>                 }
>             }).withSideInputs(b).asSingletonView());
>         p.run().waitUntilFinish();
>     }
> }{code}
> This fails with the following exception:
> {code:java}
> Exception in thread "main" 
> org.apache.beam.sdk.Pipeline$PipelineExecutionException: 
> java.lang.IllegalArgumentException: calling sideInput() with unknown view
>     at 
> org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:349)
>     at 
> org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:319)
>     at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:210)
>     at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:66)
>     at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
>     at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
>     at Main.main(Main.java:287)
> Caused by: java.lang.IllegalArgumentException: calling sideInput() with 
> unknown view
>     at 
> org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.sideInput(SimpleDoFnRunner.java:212)
>     at 
> org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:69)
>     at 
> org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner$DoFnProcessContext.sideInput(SimpleDoFnRunner.java:489)
>     at 
> org.apache.beam.sdk.transforms.Combine$GroupedValues$1$1.sideInput(Combine.java:2137)
>     at Main$1.createAccumulator(Main.java:258)
>     at Main$1.createAccumulator(Main.java:255)
>     at 
> org.apache.beam.sdk.transforms.CombineWithContext$CombineFnWithContext.apply(CombineWithContext.java:120)
>     at 
> org.apache.beam.sdk.transforms.Combine$GroupedValues$1.processElement(Combine.java:2129){code}
> But if you change
> {code:java}
> .withSideInputs(b).asSingletonView()){code}
> to
> {code:java}
> .withSideInputs(b)).apply(View.asSingleton()){code}
> then it works just fine.
>  





[jira] [Updated] (BEAM-5156) Apache Beam on dataflow runner can't find Tensorflow for workers

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-5156:
--
Reporter: Robert Bradshaw  (was: Thomas Johns)

> Apache Beam on dataflow runner can't find Tensorflow for workers
> 
>
> Key: BEAM-5156
> URL: https://issues.apache.org/jira/browse/BEAM-5156
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
> Environment: google cloud compute instance running linux
>Reporter: Robert Bradshaw
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.5.0, 2.6.0
>
>
> Adding a serialized TensorFlow model to an Apache Beam pipeline with the 
> Python SDK, but it cannot find any version of TensorFlow when run on the 
> Dataflow runner, although this is not a problem locally. Tried various 
> versions of TensorFlow from 1.6 to 1.10. I thought it might be a conflicting 
> package somewhere, so I removed all other packages and tried to install just 
> TensorFlow, and hit the same problem.
> Could not find a version that satisfies the requirement tensorflow==1.6.0 
> (from -r reqtest.txt (line 59)) (from versions: )No matching distribution 
> found for tensorflow==1.6.0 (from -r reqtest.txt (line 59))





[jira] [Updated] (BEAM-5156) Apache Beam on dataflow runner can't find Tensorflow for workers

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles updated BEAM-5156:
--
Component/s: (was: beam-model)
 sdk-py-core

> Apache Beam on dataflow runner can't find Tensorflow for workers
> 
>
> Key: BEAM-5156
> URL: https://issues.apache.org/jira/browse/BEAM-5156
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-py-core
> Environment: google cloud compute instance running linux
>Reporter: Robert Bradshaw
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: 2.5.0, 2.6.0
>
>
> Adding a serialized TensorFlow model to an Apache Beam pipeline with the 
> Python SDK, but it cannot find any version of TensorFlow when run on the 
> Dataflow runner, although this is not a problem locally. Tried various 
> versions of TensorFlow from 1.6 to 1.10. I thought it might be a conflicting 
> package somewhere, so I removed all other packages and tried to install just 
> TensorFlow, and hit the same problem.
> Could not find a version that satisfies the requirement tensorflow==1.6.0 
> (from -r reqtest.txt (line 59)) (from versions: )No matching distribution 
> found for tensorflow==1.6.0 (from -r reqtest.txt (line 59))





[jira] [Resolved] (BEAM-4845) Nexmark suites do not compile

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-4845?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles resolved BEAM-4845.
---
   Resolution: Fixed
Fix Version/s: Not applicable

> Nexmark suites do not compile
> -
>
> Key: BEAM-4845
> URL: https://issues.apache.org/jira/browse/BEAM-4845
> Project: Beam
>  Issue Type: Bug
>  Components: examples-nexmark
>Reporter: Lukasz Gajowy
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: Not applicable
>
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> This is due to this PR: https://github.com/apache/beam/pull/5341. Some 
> interfaces/classes had their visibility changed, and this caused Nexmark to 
> fail at the compilation phase. 
> https://builds.apache.org/view/A-D/view/Beam/job/beam_PostCommit_Java_Nexmark_Direct/112/console





[jira] [Closed] (BEAM-2567) Port triggers design doc to a contributor technical reference

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-2567?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles closed BEAM-2567.
-
   Resolution: Won't Fix
Fix Version/s: Not applicable

Sink triggers are the way to go - there is no value in proselytizing this.

> Port triggers design doc to a contributor technical reference
> -
>
> Key: BEAM-2567
> URL: https://issues.apache.org/jira/browse/BEAM-2567
> Project: Beam
>  Issue Type: Improvement
>  Components: website
>Reporter: Kenneth Knowles
>Assignee: Kenneth Knowles
>Priority: Major
> Fix For: Not applicable
>
>
> There is a fairly old doc at https://s.apache.org/beam-triggers that could be 
> a useful reference doc for contributors. Since we don't catalog these docs 
> anywhere, it should be surfaced in a useful form.





[jira] [Work logged] (BEAM-3746) Count.globally should override getIncompatibleGlobalWindowErrorMessage to tell the user the usage that is currently only in javadoc

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-3746?focusedWorklogId=153380&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153380
 ]

ASF GitHub Bot logged work on BEAM-3746:


Author: ASF GitHub Bot
Created on: 11/Oct/18 03:46
Start Date: 11/Oct/18 03:46
Worklog Time Spent: 10m 
  Work Description: kennknowles commented on issue #6632: [BEAM-3746] 
Change incompatible message from referencing the output collection to 
referencing the input collection
URL: https://github.com/apache/beam/pull/6632#issuecomment-428810727
 
 
   run java precommit




Issue Time Tracking
---

Worklog Id: (was: 153380)
Time Spent: 2h 50m  (was: 2h 40m)

> Count.globally should override getIncompatibleGlobalWindowErrorMessage to 
> tell the user the usage that is currently only in javadoc
> ---
>
> Key: BEAM-3746
> URL: https://issues.apache.org/jira/browse/BEAM-3746
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Assignee: Sam Rohde
>Priority: Major
>  Labels: beginner, newbie, starter
>  Time Spent: 2h 50m
>  Remaining Estimate: 0h
>
> https://beam.apache.org/documentation/sdks/javadoc/2.3.0/org/apache/beam/sdk/transforms/Count.html#globally--
> "Note: if the input collection uses a windowing strategy other than 
> GlobalWindows, use Combine.globally(Count.combineFn()).withoutDefaults() 
> instead."
> But the actual crash a user gets is:
> "java.lang.IllegalStateException: Default values are not supported in 
> Combine.globally() if the output PCollection is not windowed by 
> GlobalWindows. Instead, use Combine.globally().withoutDefaults() to output an 
> empty PCollection if the input PCollection is empty, or 
> Combine.globally().asSingletonView() to get the default output of the 
> CombineFn if the input PCollection is empty."
> There is a method that exists solely to make this actually useful, so we 
> should use it!





Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #1799

2018-10-10 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-5176) FailOnWarnings behave differently between CLI and Intellij build

2018-10-10 Thread Kenneth Knowles (JIRA)


[ 
https://issues.apache.org/jira/browse/BEAM-5176?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16645909#comment-16645909
 ] 

Kenneth Knowles commented on BEAM-5176:
---

This is blocking me today so I will take it for a bit and report if I make any 
progress or if I have to give up.

> FailOnWarnings behave differently between CLI and Intellij build 
> -
>
> Key: BEAM-5176
> URL: https://issues.apache.org/jira/browse/BEAM-5176
> Project: Beam
>  Issue Type: Sub-task
>  Components: build-system
>Reporter: Etienne Chauchot
>Assignee: Kenneth Knowles
>Priority: Major
>
>  On the command line the build passes, but it fails in the IDE because of 
> warnings. To make it pass I had to set failOnWarnings to false in ApplyJavaNature





[jira] [Assigned] (BEAM-5176) FailOnWarnings behave differently between CLI and Intellij build

2018-10-10 Thread Kenneth Knowles (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5176?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kenneth Knowles reassigned BEAM-5176:
-

Assignee: Kenneth Knowles  (was: Luke Cwik)

> FailOnWarnings behave differently between CLI and Intellij build 
> -
>
> Key: BEAM-5176
> URL: https://issues.apache.org/jira/browse/BEAM-5176
> Project: Beam
>  Issue Type: Sub-task
>  Components: build-system
>Reporter: Etienne Chauchot
>Assignee: Kenneth Knowles
>Priority: Major
>
>  On the command line the build passes, but it fails in the IDE because of 
> warnings. To make it pass I had to set failOnWarnings to false in ApplyJavaNature





[jira] [Updated] (BEAM-5714) RedisIO emit error of EXEC without MULTI

2018-10-10 Thread K.K. POON (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5714?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

K.K. POON updated BEAM-5714:

Description: 
RedisIO has EXEC without MULTI error after set a batch of records.

 

By looking at the source code, I guess there is missing `pipeline.multi();` 
after exec() the last batch.

[https://github.com/apache/beam/blob/master/sdks/java/io/redis/src/main/java/org/apache/beam/sdk/io/redis/RedisIO.java#L555]

  was:
RedisIO has EXEC without MULTI error after set a batch of records.

 

After investigate the source code, I guess there is missing `pipeline.multi();` 
after exec() the last batch.

https://github.com/apache/beam/blob/master/sdks/java/io/redis/src/main/java/org/apache/beam/sdk/io/redis/RedisIO.java#L555


> RedisIO emit error of EXEC without MULTI
> 
>
> Key: BEAM-5714
> URL: https://issues.apache.org/jira/browse/BEAM-5714
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-redis
>Affects Versions: 2.7.0
>Reporter: K.K. POON
>Assignee: Jean-Baptiste Onofré
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> RedisIO raises an EXEC without MULTI error after writing a batch of records.
>  
> By looking at the source code, I suspect a `pipeline.multi();` call is 
> missing after the exec() of the last batch.
> [https://github.com/apache/beam/blob/master/sdks/java/io/redis/src/main/java/org/apache/beam/sdk/io/redis/RedisIO.java#L555]





Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1337

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

--
Started by GitHub push by manuzhang
[EnvInject] - Loading node environment variables.
Building remotely on beam15 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9084dbb0c01fc242966a3f87465c69db72ca62d3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9084dbb0c01fc242966a3f87465c69db72ca62d3
Commit message: "Merge pull request #6628: [BEAM-5626] Unskip 
hadoopfilesystem_test which is already passing in Python 3."
 > git rev-list --no-walk a8a495f0b0927de5b133e1f8eaa865b8f22a3d34 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 
 
--info --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g :beam-sdks-python:validatesRunnerBatchTests 
:beam-sdks-python:validatesRunnerStreamingTests
Initialized native services in: /home/jenkins/.gradle/native
To honour the JVM settings for this build a new JVM will be forked. Please 
consider using the daemon: 
https://docs.gradle.org/4.10.2/userguide/gradle_daemon.html.
Starting process 'Gradle build daemon'. Working directory: 
/home/jenkins/.gradle/daemon/4.10.2 Command: 
/usr/local/asfpackages/java/jdk1.8.0_172/bin/java -Xmx4g -Dfile.encoding=UTF-8 
-Duser.country=US -Duser.language=en -Duser.variant -cp 
/home/jenkins/.gradle/wrapper/dists/gradle-4.10.2-bin/cghg6c4gf4vkiutgsab8yrnwv/gradle-4.10.2/lib/gradle-launcher-4.10.2.jar
 org.gradle.launcher.daemon.bootstrap.GradleDaemon 4.10.2
An attempt to start the daemon took 0.02 secs.

FAILURE: Build failed with an exception.

* What went wrong:
unable to create new native thread

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #323

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[valentyn] Unskip hadoopfilesystem_test which is already passing in Python 3.

--
Started by GitHub push by manuzhang
[EnvInject] - Loading node environment variables.
Building remotely on beam15 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* 
 > +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9084dbb0c01fc242966a3f87465c69db72ca62d3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9084dbb0c01fc242966a3f87465c69db72ca62d3
Commit message: "Merge pull request #6628: [BEAM-5626] Unskip 
hadoopfilesystem_test which is already passing in Python 3."
 > git rev-list --no-walk a8a495f0b0927de5b133e1f8eaa865b8f22a3d34 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ 
 
--info --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx4g :beam-sdks-python:flinkCompatibilityMatrixBatch 
:beam-sdks-python:flinkCompatibilityMatrixStreaming
Initialized native services in: /home/jenkins/.gradle/native
To honour the JVM settings for this build a new JVM will be forked. Please 
consider using the daemon: 
https://docs.gradle.org/4.10.2/userguide/gradle_daemon.html.
Starting process 'Gradle build daemon'. Working directory: 
/home/jenkins/.gradle/daemon/4.10.2 Command: 
/usr/local/asfpackages/java/jdk1.8.0_172/bin/java -Xmx4g -Dfile.encoding=UTF-8 
-Duser.country=US -Duser.language=en -Duser.variant -cp 
/home/jenkins/.gradle/wrapper/dists/gradle-4.10.2-bin/cghg6c4gf4vkiutgsab8yrnwv/gradle-4.10.2/lib/gradle-launcher-4.10.2.jar
 org.gradle.launcher.daemon.bootstrap.GradleDaemon 4.10.2
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.771 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Unable to start the daemon process.
This problem might be caused by incorrect configuration of the daemon.
For example, an unrecognized jvm option is used.
Please refer to the user guide chapter on the daemon at 
https://docs.gradle.org/4.10.2/userguide/gradle_daemon.html
Please read the following process output to find out more:
---


* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


[jira] [Work logged] (BEAM-5626) Several IO tests fail in Python 3 with RuntimeError('dictionary changed size during iteration',)}

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5626?focusedWorklogId=153376&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153376
 ]

ASF GitHub Bot logged work on BEAM-5626:


Author: ASF GitHub Bot
Created on: 11/Oct/18 03:23
Start Date: 11/Oct/18 03:23
Worklog Time Spent: 10m 
  Work Description: manuzhang closed pull request #6628: [BEAM-5626] Unskip 
hadoopfilesystem_test which is already passing in Python 3.
URL: https://github.com/apache/beam/pull/6628
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:


diff --git a/sdks/python/apache_beam/io/hadoopfilesystem_test.py 
b/sdks/python/apache_beam/io/hadoopfilesystem_test.py
index 3dd94b7ed8c..ae2b810f5fc 100644
--- a/sdks/python/apache_beam/io/hadoopfilesystem_test.py
+++ b/sdks/python/apache_beam/io/hadoopfilesystem_test.py
@@ -322,10 +322,6 @@ def test_create_success(self):
 expected_file = FakeFile(url, 'wb')
 self.assertEqual(self._fake_hdfs.files[url], expected_file)
 
-  @unittest.skipIf(sys.version_info[0] == 3 and
-   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
-   'This test still needs to be fixed on Python 3'
-   'TODO: BEAM-5627')
   def test_create_write_read_compressed(self):
 url = self.fs.join(self.tmpdir, 'new_file.gz')
 
@@ -363,10 +359,6 @@ def _cmpfiles(self, url1, url2):
 data2 = f2.read()
 return data1 == data2
 
-  @unittest.skipIf(sys.version_info[0] == 3 and
-   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
-   'This test still needs to be fixed on Python 3'
-   'TODO: BEAM-5627')
   def test_copy_file(self):
 url1 = self.fs.join(self.tmpdir, 'new_file1')
 url2 = self.fs.join(self.tmpdir, 'new_file2')
@@ -377,10 +369,6 @@ def test_copy_file(self):
 self.assertTrue(self._cmpfiles(url1, url2))
 self.assertTrue(self._cmpfiles(url1, url3))
 
-  @unittest.skipIf(sys.version_info[0] == 3 and
-   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
-   'This test still needs to be fixed on Python 3'
-   'TODO: BEAM-5627')
   def test_copy_file_overwrite_error(self):
 url1 = self.fs.join(self.tmpdir, 'new_file1')
 url2 = self.fs.join(self.tmpdir, 'new_file2')
@@ -392,10 +380,6 @@ def test_copy_file_overwrite_error(self):
 BeamIOError, r'already exists.*%s' % posixpath.basename(url2)):
   self.fs.copy([url1], [url2])
 
-  @unittest.skipIf(sys.version_info[0] == 3 and
-   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
-   'This test still needs to be fixed on Python 3'
-   'TODO: BEAM-5627')
   def test_copy_file_error(self):
 url1 = self.fs.join(self.tmpdir, 'new_file1')
 url2 = self.fs.join(self.tmpdir, 'new_file2')
@@ -409,10 +393,6 @@ def test_copy_file_error(self):
   self.fs.copy([url1, url3], [url2, url4])
 self.assertTrue(self._cmpfiles(url3, url4))
 
-  @unittest.skipIf(sys.version_info[0] == 3 and
-   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
-   'This test still needs to be fixed on Python 3'
-   'TODO: BEAM-5627')
   def test_copy_directory(self):
 url_t1 = self.fs.join(self.tmpdir, 't1')
 url_t1_inner = self.fs.join(self.tmpdir, 't1/inner')
@@ -430,10 +410,6 @@ def test_copy_directory(self):
 self.fs.copy([url_t1], [url_t2])
 self.assertTrue(self._cmpfiles(url1, url2))
 
-  @unittest.skipIf(sys.version_info[0] == 3 and
-   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
-   'This test still needs to be fixed on Python 3'
-   'TODO: BEAM-5627')
   def test_copy_directory_overwrite_error(self):
 url_t1 = self.fs.join(self.tmpdir, 't1')
 url_t1_inner = self.fs.join(self.tmpdir, 't1/inner')
@@ -458,10 +434,6 @@ def test_copy_directory_overwrite_error(self):
 with self.assertRaisesRegexp(BeamIOError, r'already exists'):
   self.fs.copy([url_t1], [url_t2])
 
-  @unittest.skipIf(sys.version_info[0] == 3 and
-   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
-   'This test still needs to be fixed on Python 3'
-   'TODO: BEAM-5627')
   def test_rename_file(self):
 url1 = self.fs.join(self.tmpdir, 'f1')
 url2 = self.fs.join(self.tmpdir, 'f2')
@@ -490,10 +462,6 @@ def test_rename_file_error(self):
 self.assertFalse(self.fs.exists(url3))
 self.assertTrue(self.fs.exists(url4))
 
-  @unittest.skipIf(sys.version_info[0] == 3 and
-   os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
-   'This test still 
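The decorators deleted in the diff above all follow one pattern: skip a test on Python 3 unless an environment variable opts in. A minimal, self-contained sketch of that pattern (the test class and method here are illustrative, not Beam's):

```python
import os
import sys
import unittest

# Skip on Python 3 unless RUN_SKIPPED_PY3_TESTS=1 opts in -- the same guard
# the diff above removes once a test passes on Python 3.
skip_unless_opted_in = unittest.skipIf(
    sys.version_info[0] == 3 and
    os.environ.get('RUN_SKIPPED_PY3_TESTS') != '1',
    'This test still needs to be fixed on Python 3. TODO: BEAM-5627')


class ExampleTest(unittest.TestCase):
  @skip_unless_opted_in
  def test_roundtrip(self):
    self.assertEqual(int('42'), 42)
```

Deleting the decorator, as the PR does, simply promotes the test to always-on once it is known to pass.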

[beam] 01/01: Merge pull request #6628: [BEAM-5626] Unskip hadoopfilesystem_test which is already passing in Python 3.

2018-10-10 Thread mauzhang
This is an automated email from the ASF dual-hosted git repository.

mauzhang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 9084dbb0c01fc242966a3f87465c69db72ca62d3
Merge: a8a495f ea2276b
Author: Manu Zhang 
AuthorDate: Thu Oct 11 11:23:36 2018 +0800

Merge pull request #6628: [BEAM-5626] Unskip hadoopfilesystem_test which is 
already passing in Python 3.

 .../python/apache_beam/io/hadoopfilesystem_test.py | 40 --
 sdks/python/tox.ini|  2 +-
 2 files changed, 1 insertion(+), 41 deletions(-)



[beam] branch master updated (a8a495f -> 9084dbb)

2018-10-10 Thread mauzhang
This is an automated email from the ASF dual-hosted git repository.

mauzhang pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from a8a495f  Merge pull request #6606: Add additional code owners for 
runners-core
 add ea2276b  Unskip hadoopfilesystem_test which is already passing in 
Python 3.
 new 9084dbb  Merge pull request #6628: [BEAM-5626] Unskip 
hadoopfilesystem_test which is already passing in Python 3.

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../python/apache_beam/io/hadoopfilesystem_test.py | 40 --
 sdks/python/tox.ini|  2 +-
 2 files changed, 1 insertion(+), 41 deletions(-)



[jira] [Work logged] (BEAM-5626) Several IO tests fail in Python 3 with RuntimeError('dictionary changed size during iteration',)}

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5626?focusedWorklogId=153375&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153375
 ]

ASF GitHub Bot logged work on BEAM-5626:


Author: ASF GitHub Bot
Created on: 11/Oct/18 03:22
Start Date: 11/Oct/18 03:22
Worklog Time Spent: 10m 
  Work Description: manuzhang commented on issue #6628: [BEAM-5626] Unskip 
hadoopfilesystem_test which is already passing in Python 3.
URL: https://github.com/apache/beam/pull/6628#issuecomment-428807062
 
 
   LGTM




Issue Time Tracking
---

Worklog Id: (was: 153375)
Time Spent: 5h  (was: 4h 50m)

> Several IO tests fail in Python 3 with RuntimeError('dictionary changed size 
> during iteration',)}
> -
>
> Key: BEAM-5626
> URL: https://issues.apache.org/jira/browse/BEAM-5626
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Valentyn Tymofieiev
>Assignee: Ruoyun Huang
>Priority: Major
> Fix For: 2.8.0
>
>  Time Spent: 5h
>  Remaining Estimate: 0h
>
>  ERROR: test_delete_dir 
> (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest)
> --
> Traceback (most recent call last):
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem_test.py",
>  line 506, in test_delete_dir
>  self.fs.delete([url_t1])
>File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem.py",
>  line 370, in delete
>  raise BeamIOError("Delete operation failed", exceptions)
>  apache_beam.io.filesystem.BeamIOError: Delete operation failed with 
> exceptions {'hdfs://test_dir/new_dir1': RuntimeError('dictionary changed size 
> during iteration',   )}
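This RuntimeError is Python 3's standard failure mode for mutating a dict while iterating over a live view of it; on Python 2 the same code often worked because keys() returned a list copy. A minimal reproduction and the usual fix:

```python
def delete_all_buggy(d):
  # Mutating the dict while iterating its live view raises
  # RuntimeError('dictionary changed size during iteration') on Python 3.
  for key in d:
    del d[key]


def delete_all_fixed(d):
  # Snapshot the keys first, then mutate freely.
  for key in list(d):
    del d[key]
```

The fix for code hitting this error is the same idea in any form: iterate over a snapshot of the container rather than the container itself while deleting entries from it.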





Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1336

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

--
[...truncated 15.01 MB...]
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:717)
at 
java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:957)
at 
java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1378)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl.execute(ManagedExecutorImpl.java:38)
at 
org.gradle.launcher.daemon.server.DefaultDaemonConnection.(DefaultDaemonConnection.java:57)
at 
org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler$ConnectionWorker.receiveAndHandleCommand(DefaultIncomingConnectionHandler.java:129)
at 
org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler$ConnectionWorker.run(DefaultIncomingConnectionHandler.java:121)
at 
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
Failed to execute 
org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler$ConnectionWorker@5c0a1cd3.
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:717)
at 
java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:957)
at 
java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1378)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl.execute(ManagedExecutorImpl.java:38)
at 
org.gradle.launcher.daemon.server.DefaultDaemonConnection.(DefaultDaemonConnection.java:57)
at 
org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler$ConnectionWorker.receiveAndHandleCommand(DefaultIncomingConnectionHandler.java:129)
at 
org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler$ConnectionWorker.run(DefaultIncomingConnectionHandler.java:121)
at 
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
Failed to execute 
org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler$ConnectionWorker@d2799e.
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:717)
at 
java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:957)
at 
java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1378)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl.execute(ManagedExecutorImpl.java:38)
at 
org.gradle.launcher.daemon.server.DefaultDaemonConnection.(DefaultDaemonConnection.java:57)
at 
org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler$ConnectionWorker.receiveAndHandleCommand(DefaultIncomingConnectionHandler.java:129)
at 
org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler$ConnectionWorker.run(DefaultIncomingConnectionHandler.java:121)
at 
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at 
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
Failed to execute 
org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler$ConnectionWorker@47578a30.
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:717)
at 
java

[jira] [Commented] (BEAM-5714) RedisIO emit error of EXEC without MULTI

2018-10-10 Thread K.K. POON (JIRA)


[ 
https://issues.apache.org/jira/browse/BEAM-5714?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16645888#comment-16645888
 ] 

K.K. POON commented on BEAM-5714:
-

A pull request on GitHub has been opened:

https://github.com/apache/beam/pull/6651

> RedisIO emit error of EXEC without MULTI
> 
>
> Key: BEAM-5714
> URL: https://issues.apache.org/jira/browse/BEAM-5714
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-redis
>Affects Versions: 2.7.0
>Reporter: K.K. POON
>Assignee: Jean-Baptiste Onofré
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> RedisIO raises an EXEC without MULTI error after writing a batch of records.
>  
> After investigating the source code, I suspect a `pipeline.multi();` call is 
> missing after the exec() of the last batch.
> https://github.com/apache/beam/blob/master/sdks/java/io/redis/src/main/java/org/apache/beam/sdk/io/redis/RedisIO.java#L555





[jira] [Work logged] (BEAM-5714) RedisIO emit error of EXEC without MULTI

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5714?focusedWorklogId=153374&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153374
 ]

ASF GitHub Bot logged work on BEAM-5714:


Author: ASF GitHub Bot
Created on: 11/Oct/18 03:06
Start Date: 11/Oct/18 03:06
Worklog Time Spent: 10m 
  Work Description: kkpoon opened a new pull request #6651: [BEAM-5714] Fix 
RedisIO EXEC without MULTI error
URL: https://github.com/apache/beam/pull/6651
 
 
   ## Problem
   
   RedisIO raises an EXEC without MULTI error after writing a batch of records.
   
   After investigating the source code, I suspect a `pipeline.multi();` call 
is missing after the exec() of the last batch.
   
   
https://github.com/apache/beam/blob/master/sdks/java/io/redis/src/main/java/org/apache/beam/sdk/io/redis/RedisIO.java#L555
   
   ## Solution 
   
   - re-initiate the MULTI after batch EXEC
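The actual fix lands in Beam's Java RedisIO (which uses a Jedis pipeline), but the invariant it restores is client-agnostic: every EXEC must be preceded by its own MULTI. A Python sketch with a hypothetical stand-in pipeline object, showing the batching loop with the re-issued multi():

```python
class FakePipeline:
    """Hypothetical stand-in for a Redis pipeline; tracks MULTI/EXEC pairing only."""
    def __init__(self):
        self.in_multi = False

    def multi(self):
        self.in_multi = True

    def set(self, key, value):
        if not self.in_multi:
            raise RuntimeError('command issued outside MULTI')

    def exec(self):
        if not self.in_multi:
            raise RuntimeError('ERR EXEC without MULTI')
        self.in_multi = False


def write_batched(pipeline, records, batch_size):
    """Write records in transactional batches of batch_size records each."""
    pipeline.multi()
    in_batch = 0
    for key, value in records:
        pipeline.set(key, value)
        in_batch += 1
        if in_batch == batch_size:
            pipeline.exec()
            pipeline.multi()  # the missing call: re-open the next transaction
            in_batch = 0
    pipeline.exec()  # flush the final (possibly empty) batch
```

Without the `pipeline.multi()` after a batch's `exec()`, the next batch's commands run outside a transaction and the following `exec()` fails with exactly the reported error.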
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [x] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue, if applicable. This will automatically link the pull request to the 
issue.
- [x] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   It will help us expedite review of your Pull Request if you tag someone 
(e.g. `@username`) to look at it.
   




Issue Time Tracking
---

Worklog Id: (was: 153374)
Time Spent: 10m
Remaining Es

[jira] [Created] (BEAM-5714) RedisIO emit error of EXEC without MULTI

2018-10-10 Thread K.K. POON (JIRA)
K.K. POON created BEAM-5714:
---

 Summary: RedisIO emit error of EXEC without MULTI
 Key: BEAM-5714
 URL: https://issues.apache.org/jira/browse/BEAM-5714
 Project: Beam
  Issue Type: Bug
  Components: io-java-redis
Affects Versions: 2.7.0
Reporter: K.K. POON
Assignee: Jean-Baptiste Onofré


RedisIO raises an EXEC without MULTI error after writing a batch of records.

After investigating the source code, I suspect a `pipeline.multi();` call is 
missing after the exec() of the last batch.

https://github.com/apache/beam/blob/master/sdks/java/io/redis/src/main/java/org/apache/beam/sdk/io/redis/RedisIO.java#L555





Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #322

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

--
[...truncated 51.13 MB...]
[ToKeyedWorkItem (10/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (10/16) 
(8c3dd22d9f469508ac976656daf41b5d) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 
75b45ed08de338e0341999007e951475.
[ToKeyedWorkItem (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (5/16) 
(6e70afb5eda5d053aa39a45c26679372).
[ToKeyedWorkItem (4/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (4/16) (34fba3dfebc073b3ca0128ff8ba7d0d7) switched from RUNNING 
to FINISHED.
[ToKeyedWorkItem (4/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (4/16) 
(34fba3dfebc073b3ca0128ff8ba7d0d7).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (3/16) 
(35a84920a090c971bfe50fcfa1117285) switched from RUNNING to FINISHED.
[ToKeyedWorkItem (5/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (5/16) 
(6e70afb5eda5d053aa39a45c26679372) [FINISHED]
[ToKeyedWorkItem (4/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (4/16) 
(34fba3dfebc073b3ca0128ff8ba7d0d7) [FINISHED]
[ToKeyedWorkItem (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (3/16) 
(5f85b7b64c994dd45782cab491ddeafe).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 
3db19811e98aadde41613551f1effe39.
[ToKeyedWorkItem (3/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (3/16) 
(5f85b7b64c994dd45782cab491ddeafe) [FINISHED]
[ToKeyedWorkItem (6/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (6/16) (e352d4b6da496f27ddd27ec4c270697e) switched from RUNNING 
to FINISHED.
[ToKeyedWorkItem (6/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (6/16) 
(e352d4b6da496f27ddd27ec4c270697e).
[ToKeyedWorkItem (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (16/16) (d52febe5727bfbac2db8cbe8d5ec6633) switched from 
RUNNING to FINISHED.
[ToKeyedWorkItem (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (16/16) 
(d52febe5727bfbac2db8cbe8d5ec6633).
[ToKeyedWorkItem (6/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (6/16) 
(e352d4b6da496f27ddd27ec4c270697e) [FINISHED]
[ToKeyedWorkItem (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (16/16) 
(d52febe5727bfbac2db8cbe8d5ec6633) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 
1d7d281b15dbe24a3f77d7bd76b0d913.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (13/16) 
(c129c0f7ee55c284b2f45a6649693b5f) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 
fb501216e8491ee6049a088f8bab17bf.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 
5caadd3133d4b693bf21c68bdc815c19.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (6/16) 
(75b45ed08de338e0341999007e951475) switched from RUNNING to

[jira] [Work logged] (BEAM-1251) Python 3 Support

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-1251?focusedWorklogId=153373&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153373
 ]

ASF GitHub Bot logged work on BEAM-1251:


Author: ASF GitHub Bot
Created on: 11/Oct/18 02:47
Start Date: 11/Oct/18 02:47
Worklog Time Spent: 10m 
  Work Description: tvalentyn opened a new pull request #6650: [BEAM-1251] 
Make it possible to unskip Py3 tests by setting an environment variable.
URL: https://github.com/apache/beam/pull/6650
 
 
   Make it possible to run tests that are skipped in Python 3 by setting the 
environment variable RUN_SKIPPED_PY3_TESTS=1. This will make it a little easier 
to unskip the tests for developer testing and to run the entire test suite to 
see how many tests are still failing in Python 3.
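
   The override described above can be sketched as a small decorator; the 
helper name and its placement are illustrative only, not Beam's actual test 
utilities:

```python
import os
import unittest


def skip_py3_unless_overridden(reason):
    """Skip a test unless RUN_SKIPPED_PY3_TESTS=1 is set in the environment.

    Hypothetical helper illustrating the env-var override described in the
    pull request; not Beam's actual implementation.
    """
    if os.environ.get('RUN_SKIPPED_PY3_TESTS') == '1':
        return lambda func: func  # override set: run the test as-is
    return unittest.skip(reason)


class ExampleTest(unittest.TestCase):
    @skip_py3_unless_overridden('not yet passing on Python 3')
    def test_addition(self):
        self.assertEqual(1 + 1, 2)
```

   Note that the environment variable is read at decoration time, so it must 
be set before the test module is imported.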
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue, if applicable. This will automatically link the pull request to the 
issue.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   It will help us expedite review of your Pull Request if you tag someone 
(e.g. `@username`) to look at it.
   
   Post-Commit Tests Status (on master branch)
   

   
   Lang | SDK | Apex | Dataflow | Flink | Gearpump | Samza | Spark
   --- | --- | --- | --- | --- | --- | --- | ---
   Go | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Go_GradleBuild/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_GradleBuild/lastCompletedBuild/)
 | --- | --- | --- | --- | --- | ---
   Java | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/lastCompletedBuild/)
 | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex_Gradle/lastCompletedBuild/)
 | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle/lastCompletedBuild/)
 | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink_Gradle/lastCompletedBuild/)
 | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle/lastCompletedBuild/)
 | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza_Gradle/lastCompletedBuild/)
 | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/lastCompletedBuild/)
   Python | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Python_Verify/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_Verify/lastCompletedBuild/)
 | --- | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/)
  [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/)
 | [![Build 
Status](https://builds.apache.org/job/beam_PostCommit_Python_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_VR_Flink/lastCompletedBuild/)
 | --- | --- | ---
   
   
   
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 153373)
Time Spent: 21.5h  (was: 21h 20m)

> Python 3 Support
> 
>
> Key: BEAM-1251
> URL:

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #1798

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[scott] Add additional code owners for runners-core

--
[...truncated 76.83 KB...]
  No history is available.
Found 1 files
Processed src/test/avro/org/apache/beam/sdk/io/user.avsc
:beam-sdks-java-core:generateTestAvroJava (Thread[Task worker for ':' Thread 
10,5,main]) completed. Took 0.491 secs.
:beam-sdks-java-core:compileTestJava (Thread[Task worker for ':' Thread 
10,5,main]) started.

> Task :beam-model-pipeline:shadowJar
***
GRADLE SHADOW STATS

Total Jars: 36 (includes project)
Total Time: 8.558s [8558ms]
Average Time/Jar: 0.23772s [237.72ms]
***
:beam-model-pipeline:shadowJar (Thread[Task worker for ':',5,main]) completed. 
Took 10.264 secs.
:beam-model-fn-execution:shadowJar (Thread[Task worker for ':',5,main]) started.
:beam-model-job-management:shadowJar (Thread[Task worker for ':' Thread 
2,5,main]) started.

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava
Build cache key for task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is 
c612f2c1d3202ab66249627a733370c8
Task ':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is not 
up-to-date because:
  No history is available.
Custom actions are attached to task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava'.
All input files are considered out-of-date for incremental task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
Compiling with error-prone compiler
:542:
 warning: [EqualsGetClass] Overriding Object#equals in a non-final class by 
using getClass rather than instanceof breaks substitutability of subclasses.
  public boolean equals(Object o) {
 ^
(see https://errorprone.info/bugpattern/EqualsGetClass)
  Did you mean 'if (!(o instanceof GcsPath)) {'?
error: warnings found and -Werror specified
Note: 

 uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
1 error
1 warning

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava FAILED
:beam-sdks-java-extensions-google-cloud-platform-core:compileJava (Thread[Task 
worker for ':' Thread 7,5,main]) completed. Took 6.822 secs.

> Task :beam-vendor-sdks-java-extensions-protobuf:compileJava
file or directory 
'
 not found
Build cache key for task 
':beam-vendor-sdks-java-extensions-protobuf:compileJava' is 
3f7b9dbdbb139ca8e9f20fc72303ae6c
Task ':beam-vendor-sdks-java-extensions-protobuf:compileJava' is not up-to-date 
because:
  No history is available.
Custom actions are attached to task 
':beam-vendor-sdks-java-extensions-protobuf:compileJava'.
All input files are considered out-of-date for incremental task 
':beam-vendor-sdks-java-extensions-protobuf:compileJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
file or directory 
'
 not found
Compiling with error-prone compiler
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :beam-sdks-java-core:compileTestJava FAILED
Build cache key for task ':beam-sdks-java-core:compileTestJava' is 
4695b5724c647efc2fc7bf0e53a69abb
Task ':beam-sdks-java-core:compileTestJava' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':beam-sdks-java-core:compileTestJava'.
All input files are considered out-of-date for incremental task 
':beam-sdks-java-core:compileTestJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
Compiling with error-prone compiler
:105:
 warning: [EqualsGetClass] Overriding Object#equals in a non-final class by 
using getClass rather than instanceof breaks

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #1797

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[scott] [BEAM-5669] Ensure website publish doesn't create empty commits

--
[...truncated 76.74 KB...]
> Task :beam-sdks-java-core:generateTestAvroJava
Build cache key for task ':beam-sdks-java-core:generateTestAvroJava' is 
b42653d5972e3f3a3de1d00bdbbe4151
Caching disabled for task ':beam-sdks-java-core:generateTestAvroJava': Caching 
has not been enabled for the task
Task ':beam-sdks-java-core:generateTestAvroJava' is not up-to-date because:
  No history is available.
Found 1 files
Processed src/test/avro/org/apache/beam/sdk/io/user.avsc
:beam-sdks-java-core:generateTestAvroJava (Thread[Task worker for ':' Thread 
5,5,main]) completed. Took 0.54 secs.
:beam-sdks-java-core:compileTestJava (Thread[Task worker for ':' Thread 
5,5,main]) started.

> Task :beam-model-pipeline:shadowJar
***
GRADLE SHADOW STATS

Total Jars: 36 (includes project)
Total Time: 8.28s [8280ms]
Average Time/Jar: 0.23s [230.0ms]
***
:beam-model-pipeline:shadowJar (Thread[Task worker for ':' Thread 6,5,main]) 
completed. Took 9.948 secs.
:beam-model-fn-execution:shadowJar (Thread[Task worker for ':' Thread 
6,5,main]) started.
:beam-model-job-management:shadowJar (Thread[Task worker for ':',5,main]) 
started.

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava
Build cache key for task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is 
bd9df9dc0e4104efd213806513bedb5c
Task ':beam-sdks-java-extensions-google-cloud-platform-core:compileJava' is not 
up-to-date because:
  No history is available.
Custom actions are attached to task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava'.
All input files are considered out-of-date for incremental task 
':beam-sdks-java-extensions-google-cloud-platform-core:compileJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
Compiling with error-prone compiler
:542:
 warning: [EqualsGetClass] Overriding Object#equals in a non-final class by 
using getClass rather than instanceof breaks substitutability of subclasses.
  public boolean equals(Object o) {
 ^
(see https://errorprone.info/bugpattern/EqualsGetClass)
  Did you mean 'if (!(o instanceof GcsPath)) {'?
error: warnings found and -Werror specified
Note: 

 uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
1 error
1 warning

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava FAILED
:beam-sdks-java-extensions-google-cloud-platform-core:compileJava (Thread[Task 
worker for ':' Thread 8,5,main]) completed. Took 6.514 secs.

> Task :beam-vendor-sdks-java-extensions-protobuf:compileJava
file or directory 
'
 not found
Build cache key for task 
':beam-vendor-sdks-java-extensions-protobuf:compileJava' is 
e21aec04fc24ed9506bf2c0493536c7b
Task ':beam-vendor-sdks-java-extensions-protobuf:compileJava' is not up-to-date 
because:
  No history is available.
Custom actions are attached to task 
':beam-vendor-sdks-java-extensions-protobuf:compileJava'.
All input files are considered out-of-date for incremental task 
':beam-vendor-sdks-java-extensions-protobuf:compileJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
file or directory 
'
 not found
Compiling with error-prone compiler
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Created classpath snapshot for incremental compilation in 2.75 secs. 39 
duplicate classes found in classpath (see all with --debug).
Packing task ':beam-vendor-sdks-java-extensions-protobuf:compileJava'

> Task :beam-sdks-java-core:compileTestJava FAILED
Build cache key for task ':beam-sdks-java-core:compileTestJava' is 
95ac557f7b485b12970700ac05b055d8
Task ':beam-sdks-java-core:compileTestJava' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':beam-sdks-java-core:compileTestJava'.
All input files are considered out-of

[beam] 01/01: Merge pull request #6606: Add additional code owners for runners-core

2018-10-10 Thread scott
This is an automated email from the ASF dual-hosted git repository.

scott pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit a8a495f0b0927de5b133e1f8eaa865b8f22a3d34
Merge: 2733aa9 bbc4dce
Author: Scott Wegner 
AuthorDate: Wed Oct 10 22:37:46 2018 -0400

Merge pull request #6606: Add additional code owners for runners-core

 runners/core-java/OWNERS | 2 ++
 1 file changed, 2 insertions(+)



[beam] branch master updated (2733aa9 -> a8a495f)

2018-10-10 Thread scott
This is an automated email from the ASF dual-hosted git repository.

scott pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 2733aa9  Merge pull request #6644: [BEAM-5669] Ensure website publish 
doesn't create empty commits
 add bbc4dce  Add additional code owners for runners-core
 new a8a495f  Merge pull request #6606: Add additional code owners for 
runners-core

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 runners/core-java/OWNERS | 2 ++
 1 file changed, 2 insertions(+)



[jira] [Work logged] (BEAM-5669) Empty commits pushed by PostCommit_Website_Publish jenkins job

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5669?focusedWorklogId=153368&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153368
 ]

ASF GitHub Bot logged work on BEAM-5669:


Author: ASF GitHub Bot
Created on: 11/Oct/18 02:37
Start Date: 11/Oct/18 02:37
Worklog Time Spent: 10m 
  Work Description: swegner closed pull request #6644: [BEAM-5669] Ensure 
website publish doesn't create empty commits
URL: https://github.com/apache/beam/pull/6644
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/website/build.gradle b/website/build.gradle
index cba2799ed9e..3c41c141ef1 100644
--- a/website/build.gradle
+++ b/website/build.gradle
@@ -209,7 +209,7 @@ task commitWebsite << {
 
   def currentDate = new Date().format('/MM/dd HH:mm:ss')
   String message = "Publishing website ${currentDate} at commit 
${latestCommit}"
-  if (git.status().isClean()) {
+  if (!git.status().staged.getAllChanges()) {
 println 'No changes to commit'
   } else {
 println 'Creating commit for changes'


 




Issue Time Tracking
---

Worklog Id: (was: 153368)
Time Spent: 1h  (was: 50m)

> Empty commits pushed by PostCommit_Website_Publish jenkins job
> --
>
> Key: BEAM-5669
> URL: https://issues.apache.org/jira/browse/BEAM-5669
> Project: Beam
>  Issue Type: Bug
>  Components: build-system
>Affects Versions: 2.8.0
>Reporter: Alan Myrvold
>Assignee: Scott Wegner
>Priority: Major
> Fix For: 2.8.0
>
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> The isClean() call in 
> https://github.com/apache/beam/blob/67562393de09e1e8e24c4a83ca5274f57c8379bb/website/build.gradle#L165
>  is returning false due to unstaged .gradle and build files.
> The asf-site branch needs a .gitignore, which can be the same as the master



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[beam] branch master updated (4e0c9a1 -> 2733aa9)

2018-10-10 Thread scott
This is an automated email from the ASF dual-hosted git repository.

scott pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 4e0c9a1  Merge pull request #6645 from tvalentyn/exclude_flaky_tests
 add edb3412  [BEAM-5669] Ensure website publish doesn't create empty 
commits
 add 2733aa9  Merge pull request #6644: [BEAM-5669] Ensure website publish 
doesn't create empty commits

No new revisions were added by this update.

Summary of changes:
 website/build.gradle | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle #1237

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[yifanzou] [BEAM-5700] remove the extra licenses from python bigquery IT

[github] [BEAM-5681] Fix website tasks when pull-request ID is specified

--
[...truncated 18.64 MB...]
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create123/Read(CreateSource) as step s10
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding OutputSideInputs as step s11
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/Window.Into()/Window.Assign as step 
s12
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous) as step 
s13
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map 
as step s14
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign as step 
s15
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey as step 
s16
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/Values/Values/Map as 
step s17
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/RewindowActuals/Window.Assign as step 
s18
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/KeyForDummy/AddKeys/Map as step s19
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/RemoveActualsTriggering/Flatten.PCollections as step 
s20
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/Create.Values/Read(CreateSource) as 
step s21
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/WindowIntoDummy/Window.Assign as step 
s22
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/RemoveDummyTriggering/Flatten.PCollections as step s23
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/FlattenDummyAndContents as step s24
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/NeverTrigger/Flatten.PCollections as 
step s25
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GroupDummyAndContents as step s26
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/Values/Values/Map as step s27
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/ParDo(Concat) as step s28
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GetPane/Map as step s29
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/RunChecks as step s30
Oct 11, 2018 2:29:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/VerifyAssertions/ParDo(DefaultConclude) as step s31
Oct 11, 2018 2:29:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-validates-runner-tests//viewtest0testsingletonsideinput-jenkins-1011022901-5602f35a/output/results/staging/
Oct 11, 2018 2:29:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <70577 bytes, hash 2xTMrYn-QRHxcVqHFaWRJg> to 
gs://temp-storage-for-validates-runner-tests//viewtest0testsingletonsideinput-jenkins-1011022901-

[jira] [Work logged] (BEAM-5626) Several IO tests fail in Python 3 with RuntimeError('dictionary changed size during iteration',)}

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5626?focusedWorklogId=153367&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153367
 ]

ASF GitHub Bot logged work on BEAM-5626:


Author: ASF GitHub Bot
Created on: 11/Oct/18 02:24
Start Date: 11/Oct/18 02:24
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on issue #6628: [BEAM-5626] Run 
more tests in Python 3.
URL: https://github.com/apache/beam/pull/6628#issuecomment-428796399
 
 
   It makes more sense to split commits into their own PRs.




Issue Time Tracking
---

Worklog Id: (was: 153367)
Time Spent: 4h 50m  (was: 4h 40m)

> Several IO tests fail in Python 3 with RuntimeError('dictionary changed size 
> during iteration',)}
> -
>
> Key: BEAM-5626
> URL: https://issues.apache.org/jira/browse/BEAM-5626
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Valentyn Tymofieiev
>Assignee: Ruoyun Huang
>Priority: Major
> Fix For: 2.8.0
>
>  Time Spent: 4h 50m
>  Remaining Estimate: 0h
>
>  ERROR: test_delete_dir 
> (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest)
> --
> Traceback (most recent call last):
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem_test.py",
>  line 506, in test_delete_dir
>  self.fs.delete([url_t1])
>File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem.py",
>  line 370, in delete
>  raise BeamIOError("Delete operation failed", exceptions)
>  apache_beam.io.filesystem.BeamIOError: Delete operation failed with 
> exceptions {'hdfs://test_dir/new_dir1': RuntimeError('dictionary changed size 
> during iteration',   )}
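
The RuntimeError quoted above is general Python 3 behavior: mutating a dict 
while iterating over its live view raises on the next iteration step. A 
minimal reproduction and the usual fix, iterating over a snapshot of the keys 
(function names are illustrative, not the hadoopfilesystem code):

```python
def delete_matching_unsafe(entries, prefix):
    # Mutating the dict while iterating its live view raises
    # "RuntimeError: dictionary changed size during iteration" on Python 3.
    for path in entries:
        if path.startswith(prefix):
            del entries[path]


def delete_matching_safe(entries, prefix):
    # Snapshot the keys first, then mutate freely.
    for path in list(entries):
        if path.startswith(prefix):
            del entries[path]


entries = {'hdfs://test_dir/new_dir1': 1, 'hdfs://other': 2}
delete_matching_safe(entries, 'hdfs://test_dir')
# entries now holds only 'hdfs://other'
```

In Python 2 the same pattern often went unnoticed because dict.keys() returned 
a list copy, which is why these failures surfaced only in the Python 3 runs.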



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-5626) Several IO tests fail in Python 3 with RuntimeError('dictionary changed size during iteration',)}

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5626?focusedWorklogId=153365&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153365
 ]

ASF GitHub Bot logged work on BEAM-5626:


Author: ASF GitHub Bot
Created on: 11/Oct/18 02:14
Start Date: 11/Oct/18 02:14
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on issue #6628: [BEAM-5626] Run 
more tests in Python 3.
URL: https://github.com/apache/beam/pull/6628#issuecomment-428794469
 
 
   BEAM-5626 is solved; that's exactly why I would like to add 
hadoopfilesystem_test to the test suite.
   This PR does not skip other tests. I'll update the description.




Issue Time Tracking
---

Worklog Id: (was: 153365)
Time Spent: 4h 40m  (was: 4.5h)



Jenkins build is back to normal : beam_PostCommit_Python_Verify #6237

2018-10-10 Thread Apache Jenkins Server
See 




[jira] [Work logged] (BEAM-5626) Several IO tests fail in Python 3 with RuntimeError('dictionary changed size during iteration',)}

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5626?focusedWorklogId=153364&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153364
 ]

ASF GitHub Bot logged work on BEAM-5626:


Author: ASF GitHub Bot
Created on: 11/Oct/18 02:09
Start Date: 11/Oct/18 02:09
Worklog Time Spent: 10m 
  Work Description: manuzhang commented on issue #6628: [BEAM-5626] Run 
more tests in Python 3.
URL: https://github.com/apache/beam/pull/6628#issuecomment-428793648
 
 
   Isn't [BEAM-5626](https://issues.apache.org/jira/browse/BEAM-5626) already 
solved? It seems this PR also skips some tests. Could you please give a more 
meaningful description?




Issue Time Tracking
---

Worklog Id: (was: 153364)
Time Spent: 4.5h  (was: 4h 20m)

> Several IO tests fail in Python 3 with RuntimeError('dictionary changed size 
> during iteration',)}
> -
>
> Key: BEAM-5626
> URL: https://issues.apache.org/jira/browse/BEAM-5626
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Valentyn Tymofieiev
>Assignee: Ruoyun Huang
>Priority: Major
> Fix For: 2.8.0
>
>  Time Spent: 4.5h
>  Remaining Estimate: 0h
>
>  ERROR: test_delete_dir 
> (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest)
> --
> Traceback (most recent call last):
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem_test.py",
>  line 506, in test_delete_dir
>  self.fs.delete([url_t1])
>File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/hadoopfilesystem.py",
>  line 370, in delete
>  raise BeamIOError("Delete operation failed", exceptions)
>  apache_beam.io.filesystem.BeamIOError: Delete operation failed with 
> exceptions {'hdfs://test_dir/new_dir1': RuntimeError('dictionary changed size 
> during iteration',   )}
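The `RuntimeError` above is the standard Python 3 failure mode for mutating a dict while iterating one of its live views. A minimal reproduction and the usual fix (a generic sketch, not Beam's actual patch; `delete_keys` is a hypothetical helper):

```python
def delete_keys(d, predicate):
    """Remove entries matching predicate without tripping Python 3's
    'dictionary changed size during iteration' RuntimeError."""
    # Iterating d.keys() directly while deleting raises RuntimeError on
    # Python 3, because keys() is a live view. Snapshot the keys first.
    for key in list(d):
        if predicate(key):
            del d[key]
    return d
```

Under Python 2, `d.keys()` returned a list copy, which is why such code only started failing once the tests ran on Python 3.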



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-5707) Add a portable Flink streaming synthetic source for testing

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5707?focusedWorklogId=153363&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153363
 ]

ASF GitHub Bot logged work on BEAM-5707:


Author: ASF GitHub Bot
Created on: 11/Oct/18 01:50
Start Date: 11/Oct/18 01:50
Worklog Time Spent: 10m 
  Work Description: tweise commented on a change in pull request #6637: 
[BEAM-5707] Add a periodic, streaming impulse source for Flink portable 
pipelines
URL: https://github.com/apache/beam/pull/6637#discussion_r224292672
 
 

 ##
 File path: 
runners/core-construction-java/src/main/java/org/apache/beam/runners/core/construction/PTransformTranslation.java
 ##
 @@ -63,6 +64,8 @@
   public static final String GROUP_BY_KEY_TRANSFORM_URN =
   getUrn(StandardPTransforms.Primitives.GROUP_BY_KEY);
   public static final String IMPULSE_TRANSFORM_URN = 
getUrn(StandardPTransforms.Primitives.IMPULSE);
+  public static final String STREAMING_IMPULSE_TRANSFORM_URL = 
"flink:transform:streaming_impulse:v1";
 
 Review comment:
   It isn't necessary to modify this class. The URL is Flink runner specific 
and should live with the Flink runner.




Issue Time Tracking
---

Worklog Id: (was: 153363)
Time Spent: 50m  (was: 40m)

> Add a portable Flink streaming synthetic source for testing
> ---
>
> Key: BEAM-5707
> URL: https://issues.apache.org/jira/browse/BEAM-5707
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-flink
>Reporter: Micah Wylde
>Assignee: Aljoscha Krettek
>Priority: Minor
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> Currently there are no built-in streaming sources for portable pipelines. 
> This makes it hard to test streaming functionality in the Python SDK.
> It would be very useful to add a periodic impulse source that (with some 
> configurable frequency) outputs an empty byte array, which can then be 
> transformed as desired inside the python pipeline. More context in this 
> [mailing list 
> discussion|https://lists.apache.org/thread.html/b44a648ab1d0cb200d8bfe4b280e9dad6368209c4725609cbfbbe410@%3Cdev.beam.apache.org%3E].
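The requested behaviour can be sketched in plain Python as a generator that ticks at a fixed frequency and emits empty byte arrays. This is only an illustration of the concept, not the Flink runner's implementation (the `count` bound is added so the otherwise-unbounded stream is testable):

```python
import itertools
import time

def periodic_impulse(interval_s, count=None):
    """Yield an empty byte string every ``interval_s`` seconds.

    ``count`` bounds the stream for testing; passing None models an
    unbounded streaming source.
    """
    ticks = range(count) if count is not None else itertools.count()
    for _ in ticks:
        yield b""            # the impulse payload is an empty byte array
        time.sleep(interval_s)
```

Downstream transforms in the pipeline would then map each empty element to whatever test data they need.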



--


[jira] [Work logged] (BEAM-5624) Avro IO does not work with avro-python3 package out-of-the-box on Python 3, several tests fail with AttributeError (module 'avro.schema' has no attribute 'parse')

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5624?focusedWorklogId=153362&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153362
 ]

ASF GitHub Bot logged work on BEAM-5624:


Author: ASF GitHub Bot
Created on: 11/Oct/18 01:50
Start Date: 11/Oct/18 01:50
Worklog Time Spent: 10m 
  Work Description: tvalentyn commented on a change in pull request #6616: 
[BEAM-5624] Fix avro.schema parser for py3
URL: https://github.com/apache/beam/pull/6616#discussion_r224292624
 
 

 ##
 File path: sdks/python/apache_beam/io/avroio_test.py
 ##
 @@ -25,10 +25,15 @@
 from builtins import range
 
 import avro.datafile
-import avro.schema
 from avro.datafile import DataFileWriter
 from avro.io import DatumWriter
 import hamcrest as hc
+# pylint: disable=wrong-import-order, wrong-import-position, ungrouped-imports
+try:
+  from avro.schema import Parse
 
 Review comment:
   also `apache_beam/examples/fastavro_it_test.py`.




Issue Time Tracking
---

Worklog Id: (was: 153362)
Time Spent: 40m  (was: 0.5h)

> Avro IO does not work with avro-python3 package out-of-the-box on Python 3, 
> several tests fail with AttributeError (module 'avro.schema' has no attribute 
> 'parse') 
> ---
>
> Key: BEAM-5624
> URL: https://issues.apache.org/jira/browse/BEAM-5624
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Valentyn Tymofieiev
>Assignee: Simon
>Priority: Major
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> ==
> ERROR: Failure: AttributeError (module 'avro.schema' has no attribute 'parse')
> --
> Traceback (most recent call last):
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/target/.tox/py3/lib/python3.5/site-packages/nose/failure.py",
>  line 39, in runTest
> raise self.exc_val.with_traceback(self.tb)
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/target/.tox/py3/lib/python3.5/site-packages/nose/loader.py",
>  line 418, in loadTestsFromName
> addr.filename, addr.module)
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/target/.tox/py3/lib/python3.5/site-packages/nose/importer.py",
>  line 47, in importFromPath
> return self.importFromDir(dir_path, fqname)
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/target/.tox/py3/lib/python3.5/site-packages/nose/importer.py",
>  line 94, in importFromDir
> mod = load_module(part_fqname, fh, filename, desc)
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/target/.tox/py3/lib/python3.5/imp.py",
>  line 234, in load_module
> return load_source(name, filename, file)
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/target/.tox/py3/lib/python3.5/imp.py",
>  line 172, in load_source
> module = _load(spec)
>   File "", line 693, in _load
>   File "", line 673, in _load_unlocked
>   File "", line 673, in exec_module
>   File "", line 222, in _call_with_frames_removed
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/avroio_test.py",
>  line 54, in 
> class TestAvro(unittest.TestCase):
>   File 
> "/usr/local/google/home/valentyn/projects/beam/clean_head/beam/sdks/python/apache_beam/io/avroio_test.py",
>  line 89, in TestAvro
> SCHEMA = avro.schema.parse('''
> AttributeError: module 'avro.schema' has no attribute 'parse'
> Note that we use a different implementation of avro/avro-python3 package 
> depending on Python version. We are also evaluating potential replacement of 
> avro with fastavro.
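The `AttributeError` arises because the `avro-python3` package renamed `avro.schema.parse` to `avro.schema.Parse`. A version-agnostic lookup can paper over the rename; `resolve_schema_parser` is a hypothetical helper, not the name used in the PR:

```python
import importlib

def resolve_schema_parser():
    """Return avro's schema-parse callable under either package name.

    The Python 2 ``avro`` package exposes ``avro.schema.parse`` while
    ``avro-python3`` renamed it to ``Parse``. Returns None when neither
    package is installed.
    """
    try:
        schema = importlib.import_module("avro.schema")
    except ImportError:
        return None
    return getattr(schema, "Parse", None) or getattr(schema, "parse", None)
```

The PR under review takes the equivalent try/except-import approach directly at module import time.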



--


[jira] [Work logged] (BEAM-5653) Dataflow FnApi Worker overrides some of Coders due to coder ID generation collision.

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5653?focusedWorklogId=153361&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153361
 ]

ASF GitHub Bot logged work on BEAM-5653:


Author: ASF GitHub Bot
Created on: 11/Oct/18 01:48
Start Date: 11/Oct/18 01:48
Worklog Time Spent: 10m 
  Work Description: Ardagan commented on issue #6649: [BEAM-5653] Fix 
overriding coders due to duplicate coderId generation
URL: https://github.com/apache/beam/pull/6649#issuecomment-428789975
 
 
   R: @ajamato, @kennknowles




Issue Time Tracking
---

Worklog Id: (was: 153361)
Time Spent: 1h 50m  (was: 1h 40m)
Remaining Estimate: 70h 10m  (was: 70h 20m)

> Dataflow FnApi Worker overrides some of Coders due to coder ID generation 
> collision.
> 
>
> Key: BEAM-5653
> URL: https://issues.apache.org/jira/browse/BEAM-5653
> Project: Beam
>  Issue Type: Test
>  Components: java-fn-execution
>Reporter: Mikhail Gryzykhin
>Assignee: Mikhail Gryzykhin
>Priority: Blocker
> Fix For: 2.8.0
>
>   Original Estimate: 72h
>  Time Spent: 1h 50m
>  Remaining Estimate: 70h 10m
>
> Due to one of latest refactorings, we got a bug in Java FnApi Worker that it 
> overrides Coders in ProcessBundleDescriptor sent to SDK Harness that causes 
> jobs to fail.
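The fix described in the PR ("Pre-fill SdkComponents with existing components in pipeline") amounts to seeding the ID generator with every id already in use, so a freshly generated coder id can never silently overwrite an existing one. A generic sketch of that idea (`UniqueIdGenerator` is an illustrative class, not Beam's):

```python
class UniqueIdGenerator:
    """Hand out component ids that never collide with pre-registered ones.

    Seeding the generator with every id already present in the pipeline
    prevents a new id from shadowing an existing component.
    """

    def __init__(self, existing_ids=()):
        self._used = set(existing_ids)

    def unique_ref(self, prefix):
        # Append an increasing suffix until the candidate is unused.
        candidate, n = prefix, 0
        while candidate in self._used:
            n += 1
            candidate = "%s_%d" % (prefix, n)
        self._used.add(candidate)
        return candidate
```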



--


[jira] [Work logged] (BEAM-5467) Python Flink ValidatesRunner job fixes

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5467?focusedWorklogId=153359&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153359
 ]

ASF GitHub Bot logged work on BEAM-5467:


Author: ASF GitHub Bot
Created on: 11/Oct/18 01:47
Start Date: 11/Oct/18 01:47
Worklog Time Spent: 10m 
  Work Description: tweise commented on a change in pull request #6532: 
[BEAM-5467] Use process SDKHarness to run flink PVR tests.
URL: https://github.com/apache/beam/pull/6532#discussion_r224292349
 
 

 ##
 File path: sdks/python/build.gradle
 ##
 @@ -379,3 +405,21 @@ task buildSnapshot() {
   dependsOn 'sdist'
   dependsOn 'depSnapshot'
 }
+
+project.task('createProcessWorker') {
+  dependsOn ':beam-sdks-python-container:build'
+  dependsOn 'setupVirtualenv'
+  def sdkWorkerFile = file("${project.buildDir}/sdk_worker.sh")
+  def workerScript = 
"${project(":beam-sdks-python-container:").buildDir.absolutePath}/target/launcher/linux_amd64/boot"
+  def sdkWorkerFileCode = "sh -c \". ${envdir}/bin/activate && ${workerScript} 
\$* \""
 
 Review comment:
   The command can be supplied to `environment_config` directly.
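The quoted Gradle snippet generates an `sdk_worker.sh` wrapper that activates the virtualenv before exec'ing the container boot binary. The same launch can be sketched in Python; `launch_process_worker` is a hypothetical helper and both paths are placeholders:

```python
import shlex
import subprocess

def launch_process_worker(activate_script, boot_binary, args=()):
    """Run an SDK-harness boot binary inside an activated virtualenv.

    Mirrors what the generated sdk_worker.sh wrapper does: source the
    venv's activate script, then run the worker with forwarded args.
    """
    cmd = ". {} && {} {}".format(
        shlex.quote(activate_script),
        shlex.quote(boot_binary),
        " ".join(shlex.quote(a) for a in args),
    )
    return subprocess.run(["sh", "-c", cmd]).returncode
```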




Issue Time Tracking
---

Worklog Id: (was: 153359)
Time Spent: 9h  (was: 8h 50m)

> Python Flink ValidatesRunner job fixes
> --
>
> Key: BEAM-5467
> URL: https://issues.apache.org/jira/browse/BEAM-5467
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-flink
>Reporter: Thomas Weise
>Assignee: Thomas Weise
>Priority: Minor
>  Labels: portability-flink
>  Time Spent: 9h
>  Remaining Estimate: 0h
>
> Add status to README
> Rename script and job for consistency
>  



--


[jira] [Work logged] (BEAM-5653) Dataflow FnApi Worker overrides some of Coders due to coder ID generation collision.

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5653?focusedWorklogId=153360&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153360
 ]

ASF GitHub Bot logged work on BEAM-5653:


Author: ASF GitHub Bot
Created on: 11/Oct/18 01:47
Start Date: 11/Oct/18 01:47
Worklog Time Spent: 10m 
  Work Description: Ardagan opened a new pull request #6649: [BEAM-5653] 
Fix overriding coders due to duplicate coderId generation
URL: https://github.com/apache/beam/pull/6649
 
 
   Pre-fill SdkComponents with existing components in pipeline.
   
   
   
   Follow this checklist to help us incorporate your contribution quickly and 
easily:
   
- [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in 
ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA 
issue, if applicable. This will automatically link the pull request to the 
issue.
- [ ] If this contribution is large, please file an Apache [Individual 
Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   It will help us expedite review of your Pull Request if you tag someone 
(e.g. `@username`) to look at it.
   
   Post-Commit Tests Status (on master branch)
   

   
   Lang | SDK | Apex | Dataflow | Flink | Gearpump | Samza | Spark
   --- | --- | --- | --- | --- | --- | --- | ---
   Go | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Go_GradleBuild/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Go_GradleBuild/lastCompletedBuild/) | --- | --- | --- | --- | --- | ---
   Java | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Apex_Gradle/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink_Gradle/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza_Gradle/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/lastCompletedBuild/)
   Python | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python_Verify/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_Verify/lastCompletedBuild/) | --- | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/) [![Build Status](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/) | [![Build Status](https://builds.apache.org/job/beam_PostCommit_Python_VR_Flink/lastCompletedBuild/badge/icon)](https://builds.apache.org/job/beam_PostCommit_Python_VR_Flink/lastCompletedBuild/) | --- | --- | ---
   
   
   
   
   




Issue Time Tracking
---

Worklog Id: (was: 153360)
Time Spent: 1h 40m  (was: 1.5h)
Remaining Estimate: 70h 20m  (was: 70.5h)

> Dataflow FnApi Worker overrides some of Coders due to coder ID generation 
> collision.
> 
>
> Key: BEAM-5653
> URL: https://issues.apache.org/jira/brows

[jira] [Work logged] (BEAM-5467) Python Flink ValidatesRunner job fixes

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5467?focusedWorklogId=153358&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153358
 ]

ASF GitHub Bot logged work on BEAM-5467:


Author: ASF GitHub Bot
Created on: 11/Oct/18 01:43
Start Date: 11/Oct/18 01:43
Worklog Time Spent: 10m 
  Work Description: tweise commented on a change in pull request #6532: 
[BEAM-5467] Use process SDKHarness to run flink PVR tests.
URL: https://github.com/apache/beam/pull/6532#discussion_r224292004
 
 

 ##
 File path: sdks/python/build.gradle
 ##
 @@ -379,3 +405,21 @@ task buildSnapshot() {
   dependsOn 'sdist'
   dependsOn 'depSnapshot'
 }
+
+project.task('createProcessWorker') {
+  dependsOn ':beam-sdks-python-container:build'
+  dependsOn 'setupVirtualenv'
+  def sdkWorkerFile = file("${project.buildDir}/sdk_worker.sh")
+  def workerScript = 
"${project(":beam-sdks-python-container:").buildDir.absolutePath}/target/launcher/linux_amd64/boot"
+  def sdkWorkerFileCode = "sh -c \". ${envdir}/bin/activate && ${workerScript} 
\$* \""
 
 Review comment:
   Why do we create this script here?




Issue Time Tracking
---

Worklog Id: (was: 153358)
Time Spent: 8h 50m  (was: 8h 40m)

> Python Flink ValidatesRunner job fixes
> --
>
> Key: BEAM-5467
> URL: https://issues.apache.org/jira/browse/BEAM-5467
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-flink
>Reporter: Thomas Weise
>Assignee: Thomas Weise
>Priority: Minor
>  Labels: portability-flink
>  Time Spent: 8h 50m
>  Remaining Estimate: 0h
>
> Add status to README
> Rename script and job for consistency
>  



--


[jira] [Work logged] (BEAM-5467) Python Flink ValidatesRunner job fixes

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5467?focusedWorklogId=153357&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153357
 ]

ASF GitHub Bot logged work on BEAM-5467:


Author: ASF GitHub Bot
Created on: 11/Oct/18 01:38
Start Date: 11/Oct/18 01:38
Worklog Time Spent: 10m 
  Work Description: tweise commented on a change in pull request #6532: 
[BEAM-5467] Use process SDKHarness to run flink PVR tests.
URL: https://github.com/apache/beam/pull/6532#discussion_r224291462
 
 

 ##
 File path: sdks/python/build.gradle
 ##
 @@ -340,24 +340,50 @@ task hdfsIntegrationTest(dependsOn: 'installGcpTest') {
   }
 }
 
+class CompatibilityMatrixConfig {
+  // Execute streaming pipelines.
+  boolean streaming = false
+  // Execute on Docker or Process based environment.
+  HARNESS_TYPE harnessType = HARNESS_TYPE.DOCKER
 
 Review comment:
   minor: HARNESS_TYPE => SDK_WORKER_TYPE




Issue Time Tracking
---

Worklog Id: (was: 153357)
Time Spent: 8h 40m  (was: 8.5h)

> Python Flink ValidatesRunner job fixes
> --
>
> Key: BEAM-5467
> URL: https://issues.apache.org/jira/browse/BEAM-5467
> Project: Beam
>  Issue Type: Improvement
>  Components: runner-flink
>Reporter: Thomas Weise
>Assignee: Thomas Weise
>Priority: Minor
>  Labels: portability-flink
>  Time Spent: 8h 40m
>  Remaining Estimate: 0h
>
> Add status to README
> Rename script and job for consistency
>  



--


[jira] [Work logged] (BEAM-5701) Port Python IT test to Github: datastore_write

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5701?focusedWorklogId=153356&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153356
 ]

ASF GitHub Bot logged work on BEAM-5701:


Author: ASF GitHub Bot
Created on: 11/Oct/18 01:37
Start Date: 11/Oct/18 01:37
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #6642: [BEAM-5701] port 
datastore_write Python integration tests to Beam
URL: https://github.com/apache/beam/pull/6642#issuecomment-428788093
 
 
   Run Python PostCommit




Issue Time Tracking
---

Worklog Id: (was: 153356)
Time Spent: 2h 10m  (was: 2h)

> Port Python IT test to Github: datastore_write
> --
>
> Key: BEAM-5701
> URL: https://issues.apache.org/jira/browse/BEAM-5701
> Project: Beam
>  Issue Type: Bug
>  Components: testing
>Reporter: yifan zou
>Assignee: yifan zou
>Priority: Major
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>




--


[jira] [Commented] (BEAM-5713) Flink portable runner schedules all tasks of streaming job on same task manager

2018-10-10 Thread Thomas Weise (JIRA)


[ 
https://issues.apache.org/jira/browse/BEAM-5713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16645821#comment-16645821
 ] 

Thomas Weise commented on BEAM-5713:


Can be reproduced with branch: 
https://github.com/lyft/beam/tree/micah_process_streaming_leak

Simplified single task pipeline: 
https://gist.github.com/tweise/09ec82446f74bb534d488209ad88e75f

> Flink portable runner schedules all tasks of streaming job on same task 
> manager
> ---
>
> Key: BEAM-5713
> URL: https://issues.apache.org/jira/browse/BEAM-5713
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.8.0
>Reporter: Thomas Weise
>Priority: Major
>  Labels: portability, portability-flink
>
> The cluster has 9 task managers and 144 task slots total. A simple streaming 
> pipeline with parallelism of 8 will get all tasks scheduled on the same task 
> manager, causing the host to be fully booked and the remaining cluster idle.



--


Jenkins build is back to normal : beam_PostCommit_Java_Nexmark_Dataflow #661

2018-10-10 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-5713) Flink portable runner schedules all tasks of streaming job on same task manager

2018-10-10 Thread Thomas Weise (JIRA)
Thomas Weise created BEAM-5713:
--

 Summary: Flink portable runner schedules all tasks of streaming 
job on same task manager
 Key: BEAM-5713
 URL: https://issues.apache.org/jira/browse/BEAM-5713
 Project: Beam
  Issue Type: Bug
  Components: runner-flink
Affects Versions: 2.8.0
Reporter: Thomas Weise
Assignee: Aljoscha Krettek


The cluster has 9 task managers and 144 task slots total. A simple streaming 
pipeline with parallelism of 8 will get all tasks scheduled on the same task 
manager, causing the host to be fully booked and the remaining cluster idle.



--


[jira] [Assigned] (BEAM-5713) Flink portable runner schedules all tasks of streaming job on same task manager

2018-10-10 Thread Thomas Weise (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5713?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Weise reassigned BEAM-5713:
--

Assignee: (was: Aljoscha Krettek)

> Flink portable runner schedules all tasks of streaming job on same task 
> manager
> ---
>
> Key: BEAM-5713
> URL: https://issues.apache.org/jira/browse/BEAM-5713
> Project: Beam
>  Issue Type: Bug
>  Components: runner-flink
>Affects Versions: 2.8.0
>Reporter: Thomas Weise
>Priority: Major
>  Labels: portability, portability-flink
>
> The cluster has 9 task managers and 144 task slots total. A simple streaming 
> pipeline with parallelism of 8 will get all tasks scheduled on the same task 
> manager, causing the host to be fully booked and the remaining cluster idle.



--


Jenkins build is back to normal : beam_PostCommit_Go_GradleBuild #1246

2018-10-10 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #321

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[valentyn] [BEAM-5692] Exclude flaky in Python 3 tests from the suite.

--
[...truncated 51.10 MB...]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem (2/16) 
(913a91d2fa8b22a72562151e2f2f2182) switched from RUNNING to FINISHED.
[ToKeyedWorkItem (15/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (15/16) 
(8fb38d40c9a3162886f0a7f39c92593e) [FINISHED]
[GroupByKey -> 24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (4/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (4/16) 
(005a16fc1c1f826e7b7185f267ec9259) switched from RUNNING to FINISHED.
[GroupByKey -> 24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (4/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for 
GroupByKey -> 24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (4/16) 
(005a16fc1c1f826e7b7185f267ec9259).
[GroupByKey -> 24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (4/16)] 
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem 
streams are closed for task GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (4/16) 
(005a16fc1c1f826e7b7185f267ec9259) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (6/16) 
(a218ed9e1d098b1bbae2f810c36fdc18) switched from RUNNING to FINISHED.
[ToKeyedWorkItem (10/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (10/16) (2eb07b78eead79dce7529f4ebb0bd150) switched from 
RUNNING to FINISHED.
[ToKeyedWorkItem (10/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (10/16) 
(2eb07b78eead79dce7529f4ebb0bd150).
[ToKeyedWorkItem (10/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (10/16) 
(2eb07b78eead79dce7529f4ebb0bd150) [FINISHED]
[ToKeyedWorkItem (8/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (8/16) (3c57309fc36197c7a3ea618194d14ce3) switched from RUNNING 
to FINISHED.
[ToKeyedWorkItem (8/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (8/16) 
(3c57309fc36197c7a3ea618194d14ce3).
[ToKeyedWorkItem (8/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (8/16) 
(3c57309fc36197c7a3ea618194d14ce3) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task ToKeyedWorkItem 
83d35d6cfc243cd696706365a70efe05.
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and 
sending final execution state FINISHED to JobManager for task GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 
7305753b3f352f17c5a526436fcaf041.
[ToKeyedWorkItem (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (16/16) (6ca5e959e861441ac518a241417b098a) switched from 
RUNNING to FINISHED.
[ToKeyedWorkItem (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (16/16) 
(6ca5e959e861441ac518a241417b098a).
[ToKeyedWorkItem (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (13/16) (976de8320387aee5f1de93c6a5eaadc0) switched from 
RUNNING to FINISHED.
[ToKeyedWorkItem (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Freeing task resources for ToKeyedWorkItem (13/16) 
(976de8320387aee5f1de93c6a5eaadc0).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey -> 
24GroupByKey/GroupByWindow.None/beam:env:docker:v1:0 (5/16) 
(a213b640cb305fd0abe3284266c582d8) switched from RUNNING to FINISHED.
[ToKeyedWorkItem (13/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (13/16) 
(976de8320387aee5f1de93c6a5eaadc0) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem 
(12/16) (83d35d6cfc243cd696706365a70efe05) switched from RUNNING to FINISHED.
[ToKeyedWorkItem (16/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (16/16) 
(6ca5e959e861441ac518a241417b098a) [FINISHED]
[ToKeyedWorkItem (4/16)] INFO org.apache.flink.runtime.taskmanager.Task - 
ToKeyedWorkItem (4/16) (8d13650a2855c926ad4e686e6f17d11d) switched from RUNNING 
t

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1661

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[kedin] Revert "Merge pull request #6582 [BEAM-5702] Special case zero and one

[github] minor doc change to reflect the current filename

--
[...truncated 50.53 MB...]
at 
io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at 
io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at 
io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at 
io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:403)
at 
io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:459)
at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:63)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:546)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$600(ClientCallImpl.java:467)
at 
io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:584)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at 
io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
... 3 more

Oct 11, 2018 1:11:02 AM org.apache.beam.sdk.io.gcp.spanner.SpannerIO$WriteToSpannerFn processElement
WARNING: Failed to submit the mutation group
com.google.cloud.spanner.SpannerException: FAILED_PRECONDITION: io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Value must not be NULL in table users.
    at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerExceptionPreformatted(SpannerExceptionFactory.java:119)
    at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:43)
    at com.google.cloud.spanner.SpannerExceptionFactory.newSpannerException(SpannerExceptionFactory.java:80)
    at com.google.cloud.spanner.spi.v1.GrpcSpannerRpc.get(GrpcSpannerRpc.java:456)
    at com.google.cloud.spanner.spi.v1.GrpcSpannerRpc.commit(GrpcSpannerRpc.java:404)
    at com.google.cloud.spanner.SpannerImpl$SessionImpl$2.call(SpannerImpl.java:797)
    at com.google.cloud.spanner.SpannerImpl$SessionImpl$2.call(SpannerImpl.java:794)
    at com.google.cloud.spanner.SpannerImpl.runWithRetries(SpannerImpl.java:227)
    at com.google.cloud.spanner.SpannerImpl$SessionImpl.writeAtLeastOnce(SpannerImpl.java:793)
    at com.google.cloud.spanner.SessionPool$PooledSession.writeAtLeastOnce(SessionPool.java:319)
    at com.google.cloud.spanner.DatabaseClientImpl.writeAtLeastOnce(DatabaseClientImpl.java:60)
    at org.apache.beam.sdk.io.gcp.spanner.SpannerIO$WriteToSpannerFn.processElement(SpannerIO.java:1108)
    at org.apache.beam.sdk.io.gcp.spanner.SpannerIO$WriteToSpannerFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
    at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
    at org.apache.beam.repackaged.beam_runners_direct_java.runners.core.SimplePushbackSideInputDoFnRunner.processElementInReadyWindows(SimplePushbackSideInputDoFnRunner.java:78)
    at org.apache.beam.runners.direct.ParDoEvaluator.processElement(ParDoEvaluator.java:207)
    at org.apache.beam.runners.direct.DoFnLifecycleManagerRemovingTransformEvaluator.processElement(DoFnLifecycleManagerRemovingTransformEvaluator.java:55)
    at org.apache.beam.runners.direct.DirectTransformExecutor.processElements(DirectTransformExecutor.java:160)
    at org.apache.beam.runners.direct.DirectTransformExecutor.run(DirectTransformExecutor.java:124)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Value must not be NULL in table users.
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:500)
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:479)
    at com.google.cloud.spanner.spi.v1.GrpcSpannerRpc.get(GrpcSpannerRpc.java:450)
    ... 21 more
Caused by: io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Value must not be NULL in table users.
    at io.grpc.Status
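The root cause above is a Spanner FAILED_PRECONDITION: the mutation group tried to write NULL into a NOT NULL column of the `users` table. A minimal sketch of a pre-submit check for this class of error (the schema map and helper name are hypothetical, not Beam or Spanner API):

```python
# Sketch of a pre-submit NULL check for the failure above. Cloud Spanner rejects
# a mutation that writes NULL into a NOT NULL column with FAILED_PRECONDITION.
# The schema map and helper below are illustrative, not Beam or Spanner API.

NOT_NULL_COLUMNS = {"users": {"id", "email"}}  # assumed schema for illustration

def check_mutation(table, columns, values):
    """Return the NOT NULL columns that this mutation would set to None."""
    required = NOT_NULL_COLUMNS.get(table, set())
    return [c for c, v in zip(columns, values) if c in required and v is None]

# A mutation like this one would be rejected by Spanner:
print(check_mutation("users", ["id", "email"], [1, None]))  # → ['email']
```

Running such a check before `writeAtLeastOnce` would surface the bad row locally instead of as a retried RPC failure in the worker log.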

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #1333

2018-10-10 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Python_Verify #6236

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[yifanzou] [BEAM-5700] remove the extra licenses from python bigquery IT

--
[...truncated 1.32 MB...]
test_emulated_iterable 
(apache_beam.runners.worker.sideinputs_test.EmulatedCollectionsTest) ... ok
test_large_iterable_values 
(apache_beam.runners.worker.sideinputs_test.EmulatedCollectionsTest) ... ok
test_bytes_read_are_reported 
(apache_beam.runners.worker.sideinputs_test.PrefetchingSourceIteratorTest) ... 
ok
test_multiple_sources_iterator_fn 
(apache_beam.runners.worker.sideinputs_test.PrefetchingSourceIteratorTest) ... 
ok
test_multiple_sources_single_reader_iterator_fn 
(apache_beam.runners.worker.sideinputs_test.PrefetchingSourceIteratorTest) ... 
ok
test_single_source_iterator_fn 
(apache_beam.runners.worker.sideinputs_test.PrefetchingSourceIteratorTest) ... 
ok
test_source_iterator_fn_exception 
(apache_beam.runners.worker.sideinputs_test.PrefetchingSourceIteratorTest) ... 
ok
test_source_iterator_single_source_exception 
(apache_beam.runners.worker.sideinputs_test.PrefetchingSourceIteratorTest) ... 
ok
test_basic_sampler 
(apache_beam.runners.worker.statesampler_test.StateSamplerTest) ... ok
test_sampler_transition_overhead 
(apache_beam.runners.worker.statesampler_test.StateSamplerTest) ... ok
test_failure_when_worker_id_exists 
(apache_beam.runners.worker.worker_id_interceptor_test.WorkerIdInterceptorTest) 
... ok
test_worker_id_insertion 
(apache_beam.runners.worker.worker_id_interceptor_test.WorkerIdInterceptorTest) 
... ok
test_dofn_validate_finish_bundle_error 
(apache_beam.runners.common_test.DoFnSignatureTest) ... ok
test_dofn_validate_process_error 
(apache_beam.runners.common_test.DoFnSignatureTest) ... ok
test_dofn_validate_start_bundle_error 
(apache_beam.runners.common_test.DoFnSignatureTest) ... ok
test_deduplication 
(apache_beam.runners.pipeline_context_test.PipelineContextTest) ... ok
test_serialization 
(apache_beam.runners.pipeline_context_test.PipelineContextTest) ... ok
test_create_runner (apache_beam.runners.runner_test.RunnerTest) ... ok
test_create_runner_shorthand (apache_beam.runners.runner_test.RunnerTest) ... ok
test_direct_runner_metrics (apache_beam.runners.runner_test.RunnerTest) ... ok
test_run_api (apache_beam.runners.runner_test.RunnerTest) ... ok
test_run_api_with_callable (apache_beam.runners.runner_test.RunnerTest) ... ok
test_delete_table_fails_dataset_not_exist 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_fails_service_error 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_fails_table_not_exist 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies 
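The SKIP lines above come from import guards: each suite probes for the optional GCP/BigQuery client dependencies at import time and skips its tests when the import fails. A minimal sketch of that pattern, using a deliberately nonexistent module name as a stand-in for the real dependencies:

```python
import unittest

# Hypothetical stand-in for the GCP client dependencies the real suites probe.
try:
    import nonexistent_gcp_stub  # noqa: F401 -- intentionally not installed
    HAS_GCP = True
except ImportError:
    HAS_GCP = False

@unittest.skipIf(not HAS_GCP, 'GCP dependencies are not installed')
class TestBigQueryReaderSketch(unittest.TestCase):
    def test_read_from_table(self):
        # Never executes when the dependency is missing; the runner records
        # the test as SKIP with the reason string shown in the log above.
        self.assertTrue(True)
```

This keeps the suite green on machines without the optional extras while still exercising the tests wherever the dependencies are present.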

[beam] branch asf-site updated: Publishing website 2018/10/11 01:03:11 at commit 4e0c9a1

2018-10-10 Thread git-site-role
This is an automated email from the ASF dual-hosted git repository.

git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 8feb4da  Publishing website 2018/10/11 01:03:11 at commit 4e0c9a1
8feb4da is described below

commit 8feb4da08e13eea040ffc5cd3721f5a78c1ccfb2
Author: jenkins 
AuthorDate: Thu Oct 11 01:03:12 2018 +

Publishing website 2018/10/11 01:03:11 at commit 4e0c9a1



[beam] branch master updated (223372c -> 4e0c9a1)

2018-10-10 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 223372c  Merge pull request #6627 from 
yifanzou/BEAM-5700/remove_extra_licenses
 add f5e7b12  [BEAM-5692] Exclude flaky in Python 3 tests from the suite.
 new 4e0c9a1  Merge pull request #6645 from tvalentyn/exclude_flaky_tests

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../runners/portability/fn_api_runner_test.py  | 24 ++
 .../runners/portability/portable_runner_test.py|  5 +
 2 files changed, 29 insertions(+)



[beam] 01/01: Merge pull request #6645 from tvalentyn/exclude_flaky_tests

2018-10-10 Thread altay
This is an automated email from the ASF dual-hosted git repository.

altay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 4e0c9a1050ad2dedee82726ee53bf2e422d60482
Merge: 223372c f5e7b12
Author: Ahmet Altay 
AuthorDate: Wed Oct 10 18:01:12 2018 -0700

Merge pull request #6645 from tvalentyn/exclude_flaky_tests

[BEAM-5692] Exclude flaky in Python 3 tests from the suite.

 .../runners/portability/fn_api_runner_test.py  | 24 ++
 .../runners/portability/portable_runner_test.py|  5 +
 2 files changed, 29 insertions(+)



[jira] [Work logged] (BEAM-5701) Port Python IT test to Github: datastore_write

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5701?focusedWorklogId=153345&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153345
 ]

ASF GitHub Bot logged work on BEAM-5701:


Author: ASF GitHub Bot
Created on: 11/Oct/18 00:42
Start Date: 11/Oct/18 00:42
Worklog Time Spent: 10m 
  Work Description: yifanzou removed a comment on issue #6642: [BEAM-5701] 
port datastore_write Python integration tests to Beam
URL: https://github.com/apache/beam/pull/6642#issuecomment-428776669
 
 
   Run Python PostCommit


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
---

Worklog Id: (was: 153345)
Time Spent: 2h  (was: 1h 50m)

> Port Python IT test to Github: datastore_write
> --
>
> Key: BEAM-5701
> URL: https://issues.apache.org/jira/browse/BEAM-5701
> Project: Beam
>  Issue Type: Bug
>  Components: testing
>Reporter: yifan zou
>Assignee: yifan zou
>Priority: Major
>  Time Spent: 2h
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work logged] (BEAM-5701) Port Python IT test to Github: datastore_write

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5701?focusedWorklogId=153344&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153344
 ]

ASF GitHub Bot logged work on BEAM-5701:


Author: ASF GitHub Bot
Created on: 11/Oct/18 00:42
Start Date: 11/Oct/18 00:42
Worklog Time Spent: 10m 
  Work Description: yifanzou commented on issue #6642: [BEAM-5701] port 
datastore_write Python integration tests to Beam
URL: https://github.com/apache/beam/pull/6642#issuecomment-428779061
 
 
   Run Python PostCommit




Issue Time Tracking
---

Worklog Id: (was: 153344)
Time Spent: 1h 50m  (was: 1h 40m)

> Port Python IT test to Github: datastore_write
> --
>
> Key: BEAM-5701
> URL: https://issues.apache.org/jira/browse/BEAM-5701
> Project: Beam
>  Issue Type: Bug
>  Components: testing
>Reporter: yifan zou
>Assignee: yifan zou
>Priority: Major
>  Time Spent: 1h 50m
>  Remaining Estimate: 0h
>






Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1332

2018-10-10 Thread Apache Jenkins Server
See 


Changes:

[yifanzou] [BEAM-5700] remove the extra licenses from python bigquery IT

--
[...truncated 84.55 KB...]
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded 
/tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-4.3.0.tar.gz
:54:
 DeprecationWarning: options is deprecated since First stable release. 
References to .options will not be supported
  print('Found: %s.' % self.build_console_url(pipeline.options))
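The DeprecationWarning above comes from a deprecated-property shim: reading `pipeline.options` still works but emits a warning on every access. A rough sketch of that pattern in plain Python (the class and message are illustrative, not Beam's actual implementation):

```python
import warnings

class Pipeline:
    """Illustrative only: a property that warns on access, as the log's
    'options is deprecated' messages suggest Beam does for .options."""

    def __init__(self, options):
        self._options = options

    @property
    def options(self):
        # Old callers keep working, but each read emits a DeprecationWarning
        # attributed to the caller's line via stacklevel=2.
        warnings.warn(
            'options is deprecated since First stable release. '
            'References to <pipeline>.options will not be supported',
            DeprecationWarning, stacklevel=2)
        return self._options
```

The shim keeps existing code running during a deprecation window; callers that migrate to the supported accessor stop triggering the warning.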
WARNING:root:Waiting indefinitely for streaming job.
test_as_list_twice (apache_

[jira] [Work logged] (BEAM-5623) Several IO tests hang indefinitely during execution on Python 3.

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5623?focusedWorklogId=153343&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153343
 ]

ASF GitHub Bot logged work on BEAM-5623:


Author: ASF GitHub Bot
Created on: 11/Oct/18 00:29
Start Date: 11/Oct/18 00:29
Worklog Time Spent: 10m 
  Work Description: tvalentyn edited a comment on issue #6648: [BEAM-5623] 
Skip tests that halt test suite execution on Python 3
URL: https://github.com/apache/beam/pull/6648#issuecomment-428773705
 
 
   R: @aaltay 
   cc: @Fematich @Juta @manuzhang @splovyt




Issue Time Tracking
---

Worklog Id: (was: 153343)
Time Spent: 0.5h  (was: 20m)

> Several IO tests hang indefinitely during execution on Python 3.
> 
>
> Key: BEAM-5623
> URL: https://issues.apache.org/jira/browse/BEAM-5623
> Project: Beam
>  Issue Type: Sub-task
>  Components: sdk-py-core
>Reporter: Valentyn Tymofieiev
>Priority: Major
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> test_read_empty_single_file_no_eol_gzip 
> (apache_beam.io.textio_test.TextSourceTest) 
> Also several tests cases in tfrecordio_test, for example:
> test_process_auto (apache_beam.io.tfrecordio_test.TestReadAllFromTFRecord)





[jira] [Work logged] (BEAM-5701) Port Python IT test to Github: datastore_write

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5701?focusedWorklogId=153341&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153341
 ]

ASF GitHub Bot logged work on BEAM-5701:


Author: ASF GitHub Bot
Created on: 11/Oct/18 00:27
Start Date: 11/Oct/18 00:27
Worklog Time Spent: 10m 
  Work Description: yifanzou removed a comment on issue #6642: [BEAM-5701] 
port datastore_write Python integration tests to Beam
URL: https://github.com/apache/beam/pull/6642#issuecomment-428761072
 
 
   Run Python PostCommit




Issue Time Tracking
---

Worklog Id: (was: 153341)
Time Spent: 1.5h  (was: 1h 20m)

> Port Python IT test to Github: datastore_write
> --
>
> Key: BEAM-5701
> URL: https://issues.apache.org/jira/browse/BEAM-5701
> Project: Beam
>  Issue Type: Bug
>  Components: testing
>Reporter: yifan zou
>Assignee: yifan zou
>Priority: Major
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>






[jira] [Work logged] (BEAM-5701) Port Python IT test to Github: datastore_write

2018-10-10 Thread ASF GitHub Bot (JIRA)


 [ 
https://issues.apache.org/jira/browse/BEAM-5701?focusedWorklogId=153342&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-153342
 ]

ASF GitHub Bot logged work on BEAM-5701:


Author: ASF GitHub Bot
Created on: 11/Oct/18 00:27
Start Date: 11/Oct/18 00:27
Worklog Time Spent: 10m 
  Work Description: yifanzou removed a comment on issue #6642: [BEAM-5701] 
port datastore_write Python integration tests to Beam
URL: https://github.com/apache/beam/pull/6642#issuecomment-428768274
 
 
   Run Python PostCommit




Issue Time Tracking
---

Worklog Id: (was: 153342)
Time Spent: 1h 40m  (was: 1.5h)

> Port Python IT test to Github: datastore_write
> --
>
> Key: BEAM-5701
> URL: https://issues.apache.org/jira/browse/BEAM-5701
> Project: Beam
>  Issue Type: Bug
>  Components: testing
>Reporter: yifan zou
>Assignee: yifan zou
>Priority: Major
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>





