[beam] branch master updated (79496b2 -> 84d57ca)

2021-11-08 Thread aromanenko
This is an automated email from the ASF dual-hosted git repository.

aromanenko pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 79496b2  Merge pull request #15913: [BEAM-2791] remove spammy log 
statement
 add 2a94534  [BEAM-13157] add regression test for hadoop configuration on 
ParquetIO.Parse
 new 84d57ca  Merge pull request #15914: [BEAM-13157] add regression test 
for hadoop configuration on ParquetIO.Parse

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../apache/beam/sdk/io/parquet/ParquetIOTest.java  | 28 ++
 1 file changed, 28 insertions(+)


[beam] 01/01: Merge pull request #15914: [BEAM-13157] add regression test for hadoop configuration on ParquetIO.Parse

2021-11-08 Thread aromanenko
This is an automated email from the ASF dual-hosted git repository.

aromanenko pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 84d57ca36294f1bc29d22a4ac4282a45846c
Merge: 79496b2 2a94534
Author: Alexey Romanenko <33895511+aromanenko-...@users.noreply.github.com>
AuthorDate: Mon Nov 8 15:44:49 2021 +0100

Merge pull request #15914: [BEAM-13157] add regression test for hadoop 
configuration on ParquetIO.Parse

 .../apache/beam/sdk/io/parquet/ParquetIOTest.java  | 28 ++
 1 file changed, 28 insertions(+)
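
For context, a minimal sketch of the kind of regression test this pull request adds: build a ParquetIO.Parse transform that carries a custom Hadoop Configuration so the test can verify the configuration is honored. The withConfiguration setter, the configuration key, and the input path below are illustrative assumptions, not taken from the commit.

import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.io.parquet.ParquetIO;
import org.apache.hadoop.conf.Configuration;

public class ParquetParseConfigurationSketch {

  // Builds a Parse transform that carries a custom Hadoop Configuration.
  static ParquetIO.Parse<String> buildParse() {
    Configuration conf = new Configuration(false);
    // Hypothetical option that must survive into the Parquet read path.
    conf.set("parquet.example.custom.option", "enabled");

    return ParquetIO.parseGenericRecords((GenericRecord record) -> record.toString())
        // Assumed setter; BEAM-13157 is about this configuration actually being used.
        .withConfiguration(conf)
        // Hypothetical input glob.
        .from("gs://my-bucket/records/*.parquet");
  }
}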


[beam] branch master updated: Merge pull request #15873 from [BEAM-13181] Remove Sharding from FhirIO.Import

2021-11-08 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new 5e44a13  Merge pull request #15873 from [BEAM-13181] Remove Sharding 
from FhirIO.Import
5e44a13 is described below

commit 5e44a133ce69c95aa434536b73ae6f757c243cce
Author: Milena Bukal 
AuthorDate: Mon Nov 8 15:19:09 2021 -0500

Merge pull request #15873 from [BEAM-13181] Remove Sharding from 
FhirIO.Import

* Test FhirIO improvements

* Remove batching from FhirIO.Import

* Minor cleanup

* Fix tmpGcsPath input
---
 .../apache/beam/sdk/io/gcp/healthcare/FhirIO.java  | 236 ++---
 1 file changed, 108 insertions(+), 128 deletions(-)

diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/healthcare/FhirIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/healthcare/FhirIO.java
index a55cfb4..609bb61 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/healthcare/FhirIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/healthcare/FhirIO.java
@@ -40,12 +40,9 @@ import java.util.Map;
 import java.util.NoSuchElementException;
 import java.util.Optional;
 import java.util.UUID;
-import java.util.concurrent.ThreadLocalRandom;
 import org.apache.beam.sdk.Pipeline;
-import org.apache.beam.sdk.coders.IterableCoder;
 import org.apache.beam.sdk.coders.KvCoder;
 import org.apache.beam.sdk.coders.StringUtf8Coder;
-import org.apache.beam.sdk.coders.TextualIntegerCoder;
 import org.apache.beam.sdk.coders.VoidCoder;
 import org.apache.beam.sdk.io.FileIO;
 import org.apache.beam.sdk.io.FileSystems;
@@ -57,7 +54,6 @@ import org.apache.beam.sdk.io.fs.MatchResult.Status;
 import org.apache.beam.sdk.io.fs.MoveOptions.StandardMoveOptions;
 import org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions;
 import org.apache.beam.sdk.io.fs.ResourceId;
-import org.apache.beam.sdk.io.fs.ResourceIdCoder;
 import 
org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HealthcareHttpException;
 import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
 import org.apache.beam.sdk.metrics.Counter;
@@ -67,12 +63,10 @@ import org.apache.beam.sdk.options.ValueProvider;
 import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;
 import org.apache.beam.sdk.transforms.Create;
 import org.apache.beam.sdk.transforms.DoFn;
-import org.apache.beam.sdk.transforms.GroupIntoBatches;
 import org.apache.beam.sdk.transforms.MapElements;
 import org.apache.beam.sdk.transforms.PTransform;
 import org.apache.beam.sdk.transforms.ParDo;
 import org.apache.beam.sdk.transforms.Wait;
-import org.apache.beam.sdk.transforms.WithKeys;
 import org.apache.beam.sdk.transforms.windowing.BoundedWindow;
 import org.apache.beam.sdk.values.KV;
 import org.apache.beam.sdk.values.PBegin;
@@ -100,15 +94,15 @@ import org.slf4j.LoggerFactory;
  *
  * Reading
  *
- * FHIR resources can be read with {@link FhirIO.Read}, which supports use 
cases where you have a
- * ${@link PCollection} of message IDs. This is appropriate for reading the 
Fhir notifications from
- * a Pub/Sub subscription with {@link PubsubIO#readStrings()} or in cases 
where you have a manually
- * prepared list of messages that you need to process (e.g. in a text file 
read with {@link
+ * FHIR resources can be read with {@link FhirIO.Read}, which supports use 
cases where you have
+ * a ${@link PCollection} of message IDs. This is appropriate for reading the 
Fhir notifications
+ * from a Pub/Sub subscription with {@link PubsubIO#readStrings()} or in cases 
where you have a
+ * manually prepared list of messages that you need to process (e.g. in a text 
file read with {@link
  * org.apache.beam.sdk.io.TextIO}*) .
  *
- * Fetch Resource contents from Fhir Store based on the {@link PCollection} 
of message ID strings
- * {@link FhirIO.Read.Result} where one can call {@link 
Read.Result#getResources()} to retrieve a
- * {@link PCollection} containing the successfully fetched {@link String}s 
and/or {@link
+ * Fetch Resource contents from Fhir Store based on the {@link PCollection} 
of message ID
+ * strings {@link FhirIO.Read.Result} where one can call {@link 
Read.Result#getResources()} to
+ * retrieve a {@link PCollection} containing the successfully fetched {@link 
String}s and/or {@link
  * FhirIO.Read.Result#getFailedReads()}* to retrieve a {@link PCollection} of 
{@link
  * HealthcareIOError}* containing the resource ID that could not be fetched 
and the exception as a
  * {@link HealthcareIOError}, this can be used to write to the dead letter 
storage system of your
@@ -119,8 +113,8 @@ import org.slf4j.LoggerFactory;
  *
  * Write Resources can be written to FHIR with two different methods: 
Import or Execute Bundle.
  *
- * Execute Bundle This is b
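
To make the FhirIO.Read Javadoc above concrete, here is a minimal usage sketch of the pattern it describes: resource IDs arrive from a Pub/Sub subscription via PubsubIO.readStrings(), FhirIO fetches the resource contents, and the result is split into fetched resources and failed reads. The readResources() factory name and the subscription are assumptions for illustration.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.healthcare.FhirIO;
import org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class FhirReadSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // FHIR notifications (resource IDs) read from a hypothetical subscription.
    PCollection<String> resourceIds =
        pipeline.apply(
            PubsubIO.readStrings()
                .fromSubscription("projects/my-project/subscriptions/fhir-notifications"));

    // Fetch resource contents; readResources() is the assumed Read factory.
    FhirIO.Read.Result result = resourceIds.apply(FhirIO.readResources());

    // Successfully fetched resources as JSON strings.
    PCollection<String> resources = result.getResources();
    // Failed reads keep the resource ID and the exception, e.g. for a dead-letter sink.
    PCollection<HealthcareIOError<String>> failedReads = result.getFailedReads();

    pipeline.run();
  }
}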

[beam] branch master updated: [BEAM-13155][Playground] add a new status: STATUS_RUN_ERROR; add a new API method GetRunError; add a new subKey RunError; update of validation processing;

2021-11-08 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new 2420585  [BEAM-13155][Playground] add a new status: STATUS_RUN_ERROR; 
add a new API method GetRunError; add a new subKey RunError; update of 
validation processing;
 new f068a62  Merge pull request #15879 from [BEAM-13155][Playground] 
Update the processing of error during run code
2420585 is described below

commit 24205850efc35caff49f8adbbf04f79e625b575c
Author: AydarZaynutdinov 
AuthorDate: Wed Nov 3 18:56:24 2021 +0300

[BEAM-13155][Playground]
add a new status: STATUS_RUN_ERROR;
add a new API method GetRunError;
add a new subKey RunError;
update of validation processing;
---
 playground/api/v1/api.proto|  19 +-
 playground/backend/cmd/server/controller.go|  41 +-
 playground/backend/cmd/server/controller_test.go   | 173 ++--
 playground/backend/internal/api/v1/api.pb.go   | 479 +
 playground/backend/internal/api/v1/api_grpc.pb.go  |  38 ++
 playground/backend/internal/cache/cache.go |   3 +
 .../backend/internal/cache/redis/redis_cache.go|   2 +-
 playground/frontend/lib/api/v1/api.pb.dart | 106 -
 playground/frontend/lib/api/v1/api.pbenum.dart |   6 +-
 playground/frontend/lib/api/v1/api.pbgrpc.dart |  28 ++
 playground/frontend/lib/api/v1/api.pbjson.dart |  30 +-
 11 files changed, 666 insertions(+), 259 deletions(-)

diff --git a/playground/api/v1/api.proto b/playground/api/v1/api.proto
index e0e164f..d716ca3 100644
--- a/playground/api/v1/api.proto
+++ b/playground/api/v1/api.proto
@@ -37,8 +37,9 @@ enum Status {
   STATUS_COMPILE_ERROR = 4;
   STATUS_EXECUTING = 5;
   STATUS_FINISHED = 6;
-  STATUS_ERROR = 7;
-  STATUS_RUN_TIMEOUT = 8;
+  STATUS_RUN_ERROR = 7;
+  STATUS_ERROR = 8;
+  STATUS_RUN_TIMEOUT = 9;
 }
 
 enum ExampleType {
@@ -87,7 +88,16 @@ message GetRunOutputRequest {
 // RunOutputResponse represents the result of the executed code.
 message GetRunOutputResponse {
   string output = 1;
-  Status compilation_status = 2;
+}
+
+// GetRunErrorRequest contains information of the pipeline uuid.
+message GetRunErrorRequest {
+  string pipeline_uuid = 1;
+}
+
+// GetRunErrorResponse represents the error of the executed code.
+message GetRunErrorResponse {
+  string output = 1;
 }
 
 // ListOfExamplesRequest contains information of the needed examples sdk and 
categories.
@@ -140,6 +150,9 @@ service PlaygroundService {
   // Get the result of pipeline execution.
   rpc GetRunOutput(GetRunOutputRequest) returns (GetRunOutputResponse);
 
+  // Get the error of pipeline execution.
+  rpc GetRunError(GetRunErrorRequest) returns (GetRunErrorResponse);
+
   // Get the result of pipeline compilation.
   rpc GetCompileOutput(GetCompileOutputRequest) returns 
(GetCompileOutputResponse);
 
diff --git a/playground/backend/cmd/server/controller.go 
b/playground/backend/cmd/server/controller.go
index ad964b2..5b92d2b 100644
--- a/playground/backend/cmd/server/controller.go
+++ b/playground/backend/cmd/server/controller.go
@@ -104,6 +104,23 @@ func (controller *playgroundController) GetRunOutput(ctx 
context.Context, info *
return &pipelineResult, nil
 }
 
+//GetRunError is returning error output of execution for specific pipeline by 
PipelineUuid
+func (controller *playgroundController) GetRunError(ctx context.Context, info 
*pb.GetRunErrorRequest) (*pb.GetRunErrorResponse, error) {
+   pipelineId := info.PipelineUuid
+   runErrorInterface, err := controller.cacheService.GetValue(ctx, 
uuid.MustParse(pipelineId), cache.RunError)
+   if err != nil {
+   logger.Errorf("%s: GetRunError(): cache.GetValue: error: %s", 
pipelineId, err.Error())
+   return nil, errors.NotFoundError("GetRunError", "there is no 
run error output for pipelineId: "+pipelineId+", subKey: cache.RunError")
+   }
+   runError, converted := runErrorInterface.(string)
+   if !converted {
+   return nil, errors.InternalError("GetRunError", "run output 
can't be converted to string")
+   }
+   pipelineResult := pb.GetRunErrorResponse{Output: runError}
+
+   return &pipelineResult, nil
+}
+
 //GetCompileOutput is returning output of compilation for specific pipeline by 
PipelineUuid
 func (controller *playgroundController) GetCompileOutput(ctx context.Context, 
info *pb.GetCompileOutputRequest) (*pb.GetCompileOutputResponse, error) {
pipelineId := info.PipelineUuid
@@ -237,10 +254,10 @@ func setupValidators(sdk pb.Sdk, filepath string) 
*[]validators.Validator {
 
 // processCode validates, compiles and runs code by pipelineId.
 // During each operation updates status of execution and saves it into cache:
-// - In case of validation step is failed saves playground.Status_STATUS_ERROR 
as cach
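
For completeness, a hypothetical client-side call to the new GetRunError RPC, written against protoc-generated Java stubs. The Playground itself ships Go and Dart stubs, so the Java classes, channel address, and pipeline UUID below are assumptions that follow standard gRPC-Java codegen conventions rather than anything in this commit.

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

public class GetRunErrorClientSketch {
  public static void main(String[] args) {
    // Hypothetical Playground backend address.
    ManagedChannel channel =
        ManagedChannelBuilder.forAddress("localhost", 8080).usePlaintext().build();

    // PlaygroundServiceGrpc, GetRunErrorRequest and GetRunErrorResponse are assumed
    // to be generated from playground/api/v1/api.proto.
    PlaygroundServiceGrpc.PlaygroundServiceBlockingStub stub =
        PlaygroundServiceGrpc.newBlockingStub(channel);

    GetRunErrorResponse response =
        stub.getRunError(
            GetRunErrorRequest.newBuilder()
                .setPipelineUuid("11111111-2222-3333-4444-555555555555") // UUID returned by RunCode
                .build());

    System.out.println(response.getOutput());
    channel.shutdown();
  }
}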

[beam] branch master updated (f068a62 -> cb1d5de)

2021-11-08 Thread pabloem
This is an automated email from the ASF dual-hosted git repository.

pabloem pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from f068a62  Merge pull request #15879 from [BEAM-13155][Playground] 
Update the processing of error during run code
 add cb1d5de  Merge pull request #15813 from [BEAM-13071] [Playground] 
Update run code statuses and add notification system

No new revisions were added by this update.

Summary of changes:
 playground/frontend/assets/error_notification.svg  |  27 +
 playground/frontend/assets/info_notification.svg   |  27 +
 .../assets/{reset.svg => success_notification.svg} |   9 +-
 .../frontend/assets/warning_notification.svg   |  27 +
 playground/frontend/lib/api/v1/api.pbgrpc.dart |  10 +-
 playground/frontend/lib/constants/assets.dart  |   6 ++
 playground/frontend/lib/constants/colors.dart  |   8 ++
 .../code_client/grpc_code_client.dart  |  16 ++-
 .../code_repository/code_repository.dart   |  34 +++---
 .../code_repository/run_code_result.dart   |   9 +-
 .../components/base_notification.dart  | 102 ++
 .../notifications/components/notification.dart | 117 +
 .../components/editor_textarea_wrapper.dart|   9 +-
 .../code_repository/code_repository_test.dart  |  16 ++-
 14 files changed, 377 insertions(+), 40 deletions(-)
 create mode 100644 playground/frontend/assets/error_notification.svg
 create mode 100644 playground/frontend/assets/info_notification.svg
 copy playground/frontend/assets/{reset.svg => success_notification.svg} (54%)
 create mode 100644 playground/frontend/assets/warning_notification.svg
 create mode 100644 
playground/frontend/lib/modules/notifications/components/base_notification.dart
 create mode 100644 
playground/frontend/lib/modules/notifications/components/notification.dart


[beam] branch master updated: [BEAM-5172] Temporarily ignore testSplit and testSizes tests waiting for a fix because they are flaky.

2021-11-08 Thread ibzib
This is an automated email from the ASF dual-hosted git repository.

ibzib pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/master by this push:
 new ea1bf40  [BEAM-5172] Temporarily ignore testSplit and testSizes tests 
waiting for a fix because they are flaky.
 new 2f2e4fa  Merge pull request #15922 from ibzib/es-flake
ea1bf40 is described below

commit ea1bf40ef61b77bfd6a4d0e73fab9ae147a70f69
Author: Etienne Chauchot 
AuthorDate: Fri Sep 3 09:49:04 2021 +0200

[BEAM-5172] Temporarily ignore testSplit and testSizes tests waiting for a 
fix because they are flaky.
---
 .../java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java | 3 +++
 .../java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java | 3 +++
 .../java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java | 3 +++
 3 files changed, 9 insertions(+)

diff --git 
a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
 
b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
index 4023b24..da593bf 100644
--- 
a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
+++ 
b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
@@ -33,6 +33,7 @@ import org.elasticsearch.client.RestClient;
 import org.junit.AfterClass;
 import org.junit.Before;
 import org.junit.BeforeClass;
+import org.junit.Ignore;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.rules.ExpectedException;
@@ -83,6 +84,7 @@ public class ElasticsearchIOTest implements Serializable {
 
   @Rule public TestPipeline pipeline = TestPipeline.create();
 
+  @Ignore("https://issues.apache.org/jira/browse/BEAM-5172")
   @Test
   public void testSizes() throws Exception {
 // need to create the index using the helper method (not create it at 
first insertion)
@@ -154,6 +156,7 @@ public class ElasticsearchIOTest implements Serializable {
 elasticsearchIOTestCommon.testWriteWithMaxBatchSizeBytes();
   }
 
+  @Ignore("https://issues.apache.org/jira/browse/BEAM-5172")
   @Test
   public void testSplit() throws Exception {
 // need to create the index using the helper method (not create it at 
first insertion)
diff --git 
a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-6/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
 
b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-6/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
index 3e77be8..d9124cd 100644
--- 
a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-6/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
+++ 
b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-6/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
@@ -33,6 +33,7 @@ import org.elasticsearch.client.RestClient;
 import org.junit.AfterClass;
 import org.junit.Before;
 import org.junit.BeforeClass;
+import org.junit.Ignore;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.rules.ExpectedException;
@@ -82,6 +83,7 @@ public class ElasticsearchIOTest implements Serializable {
 
   @Rule public TestPipeline pipeline = TestPipeline.create();
 
+  @Ignore("https://issues.apache.org/jira/browse/BEAM-5172")
   @Test
   public void testSizes() throws Exception {
 // need to create the index using the helper method (not create it at 
first insertion)
@@ -153,6 +155,7 @@ public class ElasticsearchIOTest implements Serializable {
 elasticsearchIOTestCommon.testWriteWithMaxBatchSizeBytes();
   }
 
+  @Ignore("https://issues.apache.org/jira/browse/BEAM-5172")
   @Test
   public void testSplit() throws Exception {
 // need to create the index using the helper method (not create it at 
first insertion)
diff --git 
a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-7/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
 
b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-7/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
index 4d78666..761e60a 100644
--- 
a/sdks/java/io/elasticsearch-tests/elasticsearch-tests-7/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
+++ 
b/sdks/java/io/elasticsearch-tests/elasticsearch-tests-7/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTest.java
@@ -33,6 +33,7 @@ import org.elasticsearch.client.RestClient;
 import org.junit.AfterClass;
 import org.junit.Before;
 import org.junit.BeforeClass;
+import org.junit.Ignore;
 import org.junit.Rule;
 import org.junit.Test;
 import org.junit.rules.ExpectedException;
@@ -83,6 +84,7 @@ public class ElasticsearchIOTest imple

[beam] branch fad created (now 72ee5d3)

2021-11-08 Thread apilloud
This is an automated email from the ASF dual-hosted git repository.

apilloud pushed a change to branch fad
in repository https://gitbox.apache.org/repos/asf/beam.git.


  at 72ee5d3  [BEAM-13056] Expose FieldAccess in DoFnSchemaInformation

This branch includes the following new commits:

 new 72ee5d3  [BEAM-13056] Expose FieldAccess in DoFnSchemaInformation

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[beam] 01/01: [BEAM-13056] Expose FieldAccess in DoFnSchemaInformation

2021-11-08 Thread apilloud
This is an automated email from the ASF dual-hosted git repository.

apilloud pushed a commit to branch fad
in repository https://gitbox.apache.org/repos/asf/beam.git

commit 72ee5d3547d3990fbf61e9ed61aec5958effcb03
Author: Andrew Pilloud 
AuthorDate: Wed Nov 3 14:33:18 2021 -0700

[BEAM-13056] Expose FieldAccess in DoFnSchemaInformation
---
 .../beam/sdk/transforms/DoFnSchemaInformation.java |  33 +-
 .../java/org/apache/beam/sdk/transforms/ParDo.java |  11 ++
 .../org/apache/beam/sdk/transforms/ParDoTest.java  | 126 +
 .../extensions/sql/impl/rel/BeamCalcRelTest.java   |  29 +++--
 .../sql/zetasql/BeamZetaSqlCalcRelTest.java|  29 +++--
 5 files changed, 194 insertions(+), 34 deletions(-)
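
For readers new to the feature being exposed here, a minimal sketch of how a DoFn declares field access today (the class, field, and schema names are illustrative); the diff below then makes the resulting FieldAccessDescriptor visible through DoFnSchemaInformation.

import org.apache.beam.sdk.schemas.FieldAccessDescriptor;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.Row;

// Illustrative DoFn that only reads two fields of its input Row.
class UserCountryFn extends DoFn<Row, String> {

  // Declares which input fields this DoFn touches, so runners can prune the rest.
  @FieldAccess("userFields")
  final FieldAccessDescriptor userFields =
      FieldAccessDescriptor.withFieldNames("userId", "country");

  @ProcessElement
  public void processElement(@FieldAccess("userFields") Row row, OutputReceiver<String> out) {
    out.output(row.getString("userId") + ":" + row.getString("country"));
  }
}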

diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnSchemaInformation.java
 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnSchemaInformation.java
index 54f6d19..9a8ccac 100644
--- 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnSchemaInformation.java
+++ 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/DoFnSchemaInformation.java
@@ -51,10 +51,14 @@ public abstract class DoFnSchemaInformation implements 
Serializable {
*/
   public abstract List> getElementConverters();
 
+  /** Effective FieldAccessDescriptor applied by DoFn. */
+  public abstract FieldAccessDescriptor getFieldAccessDescriptor();
+
   /** Create an instance. */
   public static DoFnSchemaInformation create() {
 return new AutoValue_DoFnSchemaInformation.Builder()
 .setElementConverters(Collections.emptyList())
+.setFieldAccessDescriptor(FieldAccessDescriptor.create())
 .build();
   }
 
@@ -63,6 +67,8 @@ public abstract class DoFnSchemaInformation implements 
Serializable {
   public abstract static class Builder {
 abstract Builder setElementConverters(List> 
converters);
 
+abstract Builder setFieldAccessDescriptor(FieldAccessDescriptor 
descriptor);
+
 abstract DoFnSchemaInformation build();
   }
 
@@ -101,7 +107,10 @@ public abstract class DoFnSchemaInformation implements 
Serializable {
 unbox))
 .build();
 
-return toBuilder().setElementConverters(converters).build();
+return toBuilder()
+.setElementConverters(converters)
+.setFieldAccessDescriptor(getFieldAccessDescriptor())
+.build();
   }
 
   /**
@@ -141,7 +150,27 @@ public abstract class DoFnSchemaInformation implements 
Serializable {
 elementT))
 .build();
 
-return toBuilder().setElementConverters(converters).build();
+return toBuilder()
+.setElementConverters(converters)
+.setFieldAccessDescriptor(getFieldAccessDescriptor())
+.build();
+  }
+
+  /**
+   * Specified a descriptor of fields accessed from an input schema.
+   *
+   * @param selectDescriptor The descriptor describing which field to select.
+   * @return
+   */
+  DoFnSchemaInformation withFieldAccessDescriptor(FieldAccessDescriptor 
selectDescriptor) {
+
+FieldAccessDescriptor descriptor =
+
FieldAccessDescriptor.union(ImmutableList.of(getFieldAccessDescriptor(), 
selectDescriptor));
+
+return toBuilder()
+.setElementConverters(getElementConverters())
+.setFieldAccessDescriptor(descriptor)
+.build();
   }
 
   private static class ConversionFunction
diff --git 
a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/ParDo.java 
b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/ParDo.java
index 9da0998..97cd6cd 100644
--- a/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/ParDo.java
+++ b/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/ParDo.java
@@ -49,6 +49,8 @@ import org.apache.beam.sdk.transforms.reflect.DoFnSignature;
 import 
org.apache.beam.sdk.transforms.reflect.DoFnSignature.FieldAccessDeclaration;
 import 
org.apache.beam.sdk.transforms.reflect.DoFnSignature.MethodWithExtraParameters;
 import org.apache.beam.sdk.transforms.reflect.DoFnSignature.OnTimerMethod;
+import 
org.apache.beam.sdk.transforms.reflect.DoFnSignature.Parameter.ElementParameter;
+import 
org.apache.beam.sdk.transforms.reflect.DoFnSignature.Parameter.ProcessContextParameter;
 import 
org.apache.beam.sdk.transforms.reflect.DoFnSignature.Parameter.SchemaElementParameter;
 import 
org.apache.beam.sdk.transforms.reflect.DoFnSignature.Parameter.SideInputParameter;
 import org.apache.beam.sdk.transforms.reflect.DoFnSignatures;
@@ -637,6 +639,7 @@ public class ParDo {
   fn.getClass().getName()));
 }
   }
+
   /**
* Extract information on how the DoFn uses schemas. In particular, if the 
schema of an element
* parameter does not match the input PCollection's schema, convert.
@@ -662,6 +665,7 @@ public class ParDo {
   input.getSchema(),
   signature.fieldAccessDeclarations(),
   fn);
+  doFnSchemaInformation = 
doFnSchemaInformation.

[beam] branch nightly-refs/heads/master updated (79496b2 -> 2f2e4fa)

2021-11-08 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch nightly-refs/heads/master
in repository https://gitbox.apache.org/repos/asf/beam.git.


from 79496b2  Merge pull request #15913: [BEAM-2791] remove spammy log 
statement
 add 2a94534  [BEAM-13157] add regression test for hadoop configuration on 
ParquetIO.Parse
 add 84d57ca  Merge pull request #15914: [BEAM-13157] add regression test 
for hadoop configuration on ParquetIO.Parse
 add 5e44a13  Merge pull request #15873 from [BEAM-13181] Remove Sharding 
from FhirIO.Import
 add 2420585  [BEAM-13155][Playground] add a new status: STATUS_RUN_ERROR; 
add a new API method GetRunError; add a new subKey RunError; update of 
validation processing;
 add f068a62  Merge pull request #15879 from [BEAM-13155][Playground] 
Update the processing of error during run code
 add cb1d5de  Merge pull request #15813 from [BEAM-13071] [Playground] 
Update run code statuses and add notification system
 add ea1bf40  [BEAM-5172] Temporarily ignore testSplit and testSizes tests 
waiting for a fix because they are flaky.
 add 2f2e4fa  Merge pull request #15922 from ibzib/es-flake

No new revisions were added by this update.

Summary of changes:
 playground/api/v1/api.proto|  19 +-
 playground/backend/cmd/server/controller.go|  41 +-
 playground/backend/cmd/server/controller_test.go   | 173 ++--
 playground/backend/internal/api/v1/api.pb.go   | 479 +
 playground/backend/internal/api/v1/api_grpc.pb.go  |  38 ++
 playground/backend/internal/cache/cache.go |   3 +
 .../backend/internal/cache/redis/redis_cache.go|   2 +-
 playground/frontend/assets/error_notification.svg  |  27 ++
 playground/frontend/assets/info_notification.svg   |  27 ++
 .../assets/{reset.svg => success_notification.svg} |   9 +-
 .../frontend/assets/warning_notification.svg   |  27 ++
 playground/frontend/lib/api/v1/api.pb.dart | 106 -
 playground/frontend/lib/api/v1/api.pbenum.dart |   6 +-
 playground/frontend/lib/api/v1/api.pbgrpc.dart |  38 +-
 playground/frontend/lib/api/v1/api.pbjson.dart |  30 +-
 playground/frontend/lib/constants/assets.dart  |   6 +
 playground/frontend/lib/constants/colors.dart  |   8 +
 .../code_client/grpc_code_client.dart  |  16 +-
 .../code_repository/code_repository.dart   |  34 +-
 .../code_repository/run_code_result.dart   |   9 +-
 .../components/base_notification.dart  | 102 +
 .../notifications/components/notification.dart | 117 +
 .../components/editor_textarea_wrapper.dart|   9 +-
 .../code_repository/code_repository_test.dart  |  16 +-
 .../sdk/io/elasticsearch/ElasticsearchIOTest.java  |   3 +
 .../sdk/io/elasticsearch/ElasticsearchIOTest.java  |   3 +
 .../sdk/io/elasticsearch/ElasticsearchIOTest.java  |   3 +
 .../apache/beam/sdk/io/gcp/healthcare/FhirIO.java  | 236 +-
 .../apache/beam/sdk/io/parquet/ParquetIOTest.java  |  28 ++
 29 files changed, 1188 insertions(+), 427 deletions(-)
 create mode 100644 playground/frontend/assets/error_notification.svg
 create mode 100644 playground/frontend/assets/info_notification.svg
 copy playground/frontend/assets/{reset.svg => success_notification.svg} (54%)
 create mode 100644 playground/frontend/assets/warning_notification.svg
 create mode 100644 
playground/frontend/lib/modules/notifications/components/base_notification.dart
 create mode 100644 
playground/frontend/lib/modules/notifications/components/notification.dart