Re: Random outputs for ARRAY_CONCAT_AGG fn zetasql

2021-03-03 Thread Sonam Ramchand
Yeah, that sounds right. But the only thing that confuses me is:

PAssert.that(stream).satisfies(row ->
    assertThat("array_agg_concat_field", actual,
        containsInAnyOrder(Arrays.asList(1L, 2L, 3L, 4L, 5L, 6L))));

How come I can access *actual* here when the output is not materialized?
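For intuition only (this is not Beam's actual internals): `satisfies` registers a callback, and the runner invokes that callback with the materialized elements when the pipeline runs, so *actual* is simply whatever the lambda receives as its parameter. A minimal plain-Java sketch of that deferred-assertion shape, with all names hypothetical:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.function.Consumer;

public class Main {
    // Hypothetical stand-in for PAssert.that(...).satisfies(fn): the check is
    // registered up front but only runs once the data has been materialized.
    static void runPipeline(List<Long> materialized, Consumer<List<Long>> check) {
        check.accept(materialized); // the "runner" hands real data to the assertion
    }

    public static void main(String[] args) {
        runPipeline(Arrays.asList(3L, 1L, 2L), actual -> {
            // Inside the callback, 'actual' IS the materialized output.
            List<Long> sorted = new ArrayList<>(actual);
            Collections.sort(sorted);
            if (!sorted.equals(Arrays.asList(1L, 2L, 3L))) {
                throw new AssertionError("unexpected contents: " + actual);
            }
            System.out.println("check passed");
        });
    }
}
```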

On Tue, Mar 2, 2021 at 10:49 PM Kyle Weaver  wrote:

> As you can see from existing tests, Beam doesn't materialize the output
> array directly. Instead you must use the PAssert API. I agree with Tyson's
> suggestion to use `satisfies`, which lets you do arbitrary assertions on
> the output data.
>
> On Tue, Mar 2, 2021 at 3:57 AM Sonam Ramchand <
> sonam.ramch...@venturedive.com> wrote:
>
>> Is there any way I can access the output array resulting from the sql
>> query? Then maybe I can sort and compare both *output array* and *expected
>> output array *for the test to pass.
>>
>> On Tue, Mar 2, 2021 at 12:24 AM Kenneth Knowles  wrote:
>>
>>> Yea, the reason is that SQL relations are not ordered. So any ordering
>>> of [1, 2, 3, 4] and [5, 6] and [7, 8, 9] is possible and correct.
>>>
>>> Kenn
>>>
>>> On Mon, Mar 1, 2021 at 11:01 AM Tyson Hamilton 
>>> wrote:
>>>
>>>> I didn't find anything like that after a brief look. What you could do
>>>> instead is something like:
>>>>
>>>> PAssert.thatSingleton(stream).satisfies(row ->
>>>> assertThat("array_field containsInAnyOrder", row.getArray("array_field"),
>>>> containsInAnyOrder(Arrays.asList(...))));
>>>>
>>>> using junit/hamcrest matchers. I didn't verify this works myself but it
>>>> should give you an idea for some next steps.
>>>>
>>>>
>>>> On Mon, Mar 1, 2021 at 12:37 AM Sonam Ramchand <
>>>> sonam.ramch...@venturedive.com> wrote:
>>>>
>>>>> Hi Devs,
>>>>> I have implemented the ARRAY_CONCAT_AGG function for zetasql dialect.
>>>>> I am trying to validate the test as:
>>>>>
>>>>> @Test
>>>>> public void testArrayConcatAggZetasql() {
>>>>>   String sql =
>>>>>   "SELECT ARRAY_CONCAT_AGG(x) AS array_concat_agg FROM (SELECT [1, 2, 
>>>>> 3, 4] AS x UNION ALL SELECT [5, 6] UNION ALL SELECT [7, 8, 9])";
>>>>>
>>>>>   ZetaSQLQueryPlanner zetaSQLQueryPlanner = new 
>>>>> ZetaSQLQueryPlanner(config);
>>>>>   BeamRelNode beamRelNode = zetaSQLQueryPlanner.convertToBeamRel(sql);
>>>>>   PCollection stream = BeamSqlRelUtils.toPCollection(pipeline, 
>>>>> beamRelNode);
>>>>>
>>>>>   Schema schema = Schema.builder().addArrayField("array_field", 
>>>>> FieldType.INT64).build();
>>>>>   PAssert.that(stream)
>>>>>   .containsInAnyOrder(
>>>>>   Row.withSchema(schema).addArray(1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 
>>>>> 9L).build());
>>>>>   
>>>>> pipeline.run().waitUntilFinish(Duration.standardMinutes(PIPELINE_EXECUTION_WAITTIME_MINUTES));
>>>>> }
>>>>>
>>>>> Expected Output is: 1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L.
>>>>> But I am getting randomly different outputs:
>>>>> 1. 1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L
>>>>> 2. 5L, 6L, 7L, 8L, 9L, 1L, 2L, 3L, 4L
>>>>> 3. 7L, 8L, 9L, 5L, 6L, 1L, 2L, 3L, 4L
>>>>>
>>>>> As per my understanding, it is because of containsInAnyOrder function.
>>>>> Is there anything Like:
>>>>>
>>>>>PAssert.that(stream)
>>>>> .containsAnyOfThem(
>>>>> Row.withSchema(schema).addArray(1L, 2L, 3L, 4L, 5L, 6L, 7L, 
>>>>> 8L, 9L).build(),
>>>>> Row.withSchema(schema).addArray(5L, 6L, 7L, 8L, 9L, 1L, 
>>>>> 2L, 3L, 4L).build(),
>>>>> Row.withSchema(schema).addArray(7L, 8L, 9L, 5L, 6L, 1L, 
>>>>> 2L, 3L, 4L).build());
>>>>>
>>>>> I would really appreciate if anyone can help me in knowing how to
>>>>> handle such scenario in Beam.
>>>>>
>>>>> Thanks!
>>>>> --
>>>>> Regards,
>>>>> *Sonam*
>>>>> Software Engineer
Mobile: +92 3088337296
>>>>>
>>>>> <http://venturedive.com/>
>>>>>
>>>>
>>
>> --
>>
>> Regards,
>> *Sonam*
>> Software Engineer
Mobile: +92 3088337296
>>
>> <http://venturedive.com/>
>>
>

-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296

<http://venturedive.com/>


Re: Random outputs for ARRAY_CONCAT_AGG fn zetasql

2021-03-02 Thread Sonam Ramchand
Is there any way I can access the output array resulting from the sql
query? Then maybe I can sort and compare both *output array* and *expected
output array *for the test to pass.

On Tue, Mar 2, 2021 at 12:24 AM Kenneth Knowles  wrote:

> Yea, the reason is that SQL relations are not ordered. So any ordering of
> [1, 2, 3, 4] and [5, 6] and [7, 8, 9] is possible and correct.
>
> Kenn
>
> On Mon, Mar 1, 2021 at 11:01 AM Tyson Hamilton  wrote:
>
>> I didn't find anything like that after a brief look. What you could do
>> instead is something like:
>>
>> PAssert.thatSingleton(stream).satisfies(row ->
>> assertThat("array_field containsInAnyOrder", row.getArray("array_field"),
>> containsInAnyOrder(Arrays.asList(...))));
>>
>> using junit/hamcrest matchers. I didn't verify this works myself but it
>> should give you an idea for some next steps.
>>
>>
>> On Mon, Mar 1, 2021 at 12:37 AM Sonam Ramchand <
>> sonam.ramch...@venturedive.com> wrote:
>>
>>> Hi Devs,
>>> I have implemented the ARRAY_CONCAT_AGG function for zetasql dialect. I
>>> am trying to validate the test as:
>>>
>>> @Test
>>> public void testArrayConcatAggZetasql() {
>>>   String sql =
>>>   "SELECT ARRAY_CONCAT_AGG(x) AS array_concat_agg FROM (SELECT [1, 2, 
>>> 3, 4] AS x UNION ALL SELECT [5, 6] UNION ALL SELECT [7, 8, 9])";
>>>
>>>   ZetaSQLQueryPlanner zetaSQLQueryPlanner = new ZetaSQLQueryPlanner(config);
>>>   BeamRelNode beamRelNode = zetaSQLQueryPlanner.convertToBeamRel(sql);
>>>   PCollection stream = BeamSqlRelUtils.toPCollection(pipeline, 
>>> beamRelNode);
>>>
>>>   Schema schema = Schema.builder().addArrayField("array_field", 
>>> FieldType.INT64).build();
>>>   PAssert.that(stream)
>>>   .containsInAnyOrder(
>>>   Row.withSchema(schema).addArray(1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 
>>> 9L).build());
>>>   
>>> pipeline.run().waitUntilFinish(Duration.standardMinutes(PIPELINE_EXECUTION_WAITTIME_MINUTES));
>>> }
>>>
>>> Expected Output is: 1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L.
>>> But I am getting randomly different outputs:
>>> 1. 1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L
>>> 2. 5L, 6L, 7L, 8L, 9L, 1L, 2L, 3L, 4L
>>> 3. 7L, 8L, 9L, 5L, 6L, 1L, 2L, 3L, 4L
>>>
>>> As per my understanding, it is because of containsInAnyOrder function.
>>> Is there anything Like:
>>>
>>>PAssert.that(stream)
>>> .containsAnyOfThem(
>>> Row.withSchema(schema).addArray(1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 
>>> 9L).build(),
>>> Row.withSchema(schema).addArray(5L, 6L, 7L, 8L, 9L, 1L, 2L, 
>>> 3L, 4L).build(),
>>> Row.withSchema(schema).addArray(7L, 8L, 9L, 5L, 6L, 1L, 2L, 
>>> 3L, 4L).build());
>>>
>>> I would really appreciate if anyone can help me in knowing how to handle
>>> such scenario in Beam.
>>>
>>> Thanks!
>>> --
>>> Regards,
>>> *Sonam*
>>> Software Engineer
>>> Mobile: +92 3088337296
>>>
>>> <http://venturedive.com/>
>>>
>>

-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296

<http://venturedive.com/>


Random outputs for ARRAY_CONCAT_AGG fn zetasql

2021-03-01 Thread Sonam Ramchand
Hi Devs,
I have implemented the ARRAY_CONCAT_AGG function for zetasql dialect. I am
trying to validate the test as:

@Test
public void testArrayConcatAggZetasql() {
  String sql =
      "SELECT ARRAY_CONCAT_AGG(x) AS array_concat_agg FROM (SELECT [1, 2, 3, 4] AS x UNION ALL SELECT [5, 6] UNION ALL SELECT [7, 8, 9])";

  ZetaSQLQueryPlanner zetaSQLQueryPlanner = new ZetaSQLQueryPlanner(config);
  BeamRelNode beamRelNode = zetaSQLQueryPlanner.convertToBeamRel(sql);
  PCollection<Row> stream = BeamSqlRelUtils.toPCollection(pipeline, beamRelNode);

  Schema schema = Schema.builder().addArrayField("array_field", FieldType.INT64).build();
  PAssert.that(stream)
      .containsInAnyOrder(
          Row.withSchema(schema).addArray(1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L).build());

  pipeline.run().waitUntilFinish(Duration.standardMinutes(PIPELINE_EXECUTION_WAITTIME_MINUTES));
}

Expected Output is: 1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L.
But I am getting randomly different outputs:
1. 1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L
2. 5L, 6L, 7L, 8L, 9L, 1L, 2L, 3L, 4L
3. 7L, 8L, 9L, 5L, 6L, 1L, 2L, 3L, 4L

As per my understanding, it is because of the containsInAnyOrder function. Is
there anything like:

   PAssert.that(stream)
       .containsAnyOfThem(
           Row.withSchema(schema).addArray(1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L).build(),
           Row.withSchema(schema).addArray(5L, 6L, 7L, 8L, 9L, 1L, 2L, 3L, 4L).build(),
           Row.withSchema(schema).addArray(7L, 8L, 9L, 5L, 6L, 1L, 2L, 3L, 4L).build());

I would really appreciate it if anyone could help me understand how to handle
such a scenario in Beam.
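Since the relation ordering is nondeterministic, one order-insensitive option (a sketch of the idea, not a Beam API) is to sort both arrays before comparing, e.g. inside a `satisfies` callback. The core comparison in plain Java, independent of Beam:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class Main {
    // Returns true when both lists hold the same elements, ignoring order.
    static boolean sameElements(List<Long> actual, List<Long> expected) {
        List<Long> a = new ArrayList<>(actual);
        List<Long> e = new ArrayList<>(expected);
        Collections.sort(a);
        Collections.sort(e);
        return a.equals(e);
    }

    public static void main(String[] args) {
        List<Long> expected = Arrays.asList(1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L);
        // Any of the orderings observed in the failing test should pass:
        System.out.println(sameElements(Arrays.asList(5L, 6L, 7L, 8L, 9L, 1L, 2L, 3L, 4L), expected));
        System.out.println(sameElements(Arrays.asList(7L, 8L, 9L, 5L, 6L, 1L, 2L, 3L, 4L), expected));
    }
}
```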

Thanks!
-- 
Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296




COVAR_POP aggregate function test for the ZetaSql dialect

2021-02-09 Thread Sonam Ramchand
Hi Devs,
I am trying to test the COVAR_POP aggregate function for the ZetaSql
dialect. I see
https://github.com/apache/beam/blob/b74fcf7b30d956fb42830d652a57b265a1546973/sdks/[…]he/beam/sdk/extensions/sql/impl/transform/agg/CovarianceFn.java

is implemented as a CombineFn, and it works correctly there:
https://github.com/apache/beam/blob/befcc3d780d561e81f23512742862a65c0ae3b69/sdks/[…]eam/sdk/extensions/sql/BeamSqlDslAggregationCovarianceTest.java

However, for the ZetaSql dialect, it throws:

java.lang.IllegalArgumentException: covar_pop has more than one argument.

Unit test:

@Test
public void testZetaSqlCovarPop() {
  String sql = "SELECT COVAR_POP(row_id, int64_col) FROM table_all_types GROUP BY bool_col";

  ZetaSQLQueryPlanner zetaSQLQueryPlanner = new ZetaSQLQueryPlanner(config);
  BeamRelNode beamRelNode = zetaSQLQueryPlanner.convertToBeamRel(sql);
  PCollection<Row> stream = BeamSqlRelUtils.toPCollection(pipeline, beamRelNode);

  final Schema schema = Schema.builder().addDoubleField("field1").build();
  PAssert.that(stream)
      .containsInAnyOrder(
          Row.withSchema(schema).addValue(-1.0).build(),
          Row.withSchema(schema).addValue(-1.6).build());

  pipeline.run().waitUntilFinish(Duration.standardMinutes(PIPELINE_EXECUTION_WAITTIME_MINUTES));
}

Can anybody help me understand the cause of this problem? I do not
understand how it works correctly elsewhere but not in the ZetaSql dialect.
For reference: https://github.com/apache/beam/pull/13915

I would really appreciate any sort of input on this.
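For reference, the quantity the aggregate computes (population covariance) is simple to state, independent of the planner issue above. A plain-Java sketch of the definition (not Beam's CovarianceFn implementation):

```java
public class Main {
    // covar_pop(x, y) = mean(x*y) - mean(x) * mean(y)
    static double covarPop(double[] x, double[] y) {
        double sx = 0, sy = 0, sxy = 0;
        int n = x.length;
        for (int i = 0; i < n; i++) {
            sx += x[i];
            sy += y[i];
            sxy += x[i] * y[i];
        }
        return sxy / n - (sx / n) * (sy / n);
    }

    public static void main(String[] args) {
        // covar_pop of a variable with itself is its population variance.
        double v = covarPop(new double[] {1, 2, 3}, new double[] {1, 2, 3});
        System.out.println(v);
    }
}
```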
-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296




Need help with the Go Sdk

2020-12-28 Thread Sonam Ramchand
Hi Devs,
For ':sdks:go:resolveBuildDependencies' task, I have been getting:

Exception in resolution, message is:
  Cannot resolve dependency:github.com/ajstarks/deck:
commit='LATEST_COMMIT', urls=[https://github.com/ajstarks/deck.git,
g...@github.com:ajstarks/deck.git]
  Resolution stack is:
  +- github.com/apache/beam/sdks/go
   +- golang.org/x/net#6772e930b67bb09bf22262c7378e7d2f67cf59d1
+- golang.org/x/build#0a4bf693f6139da99647cdcccd3fd0b8a6fbfd70
 +- golang.org/x/perf#bdcc6220ee906f6e3759ef5784cb0bf9e60aec1e
  +- github.com/aclements/go-gg#abd1f791f5ee99465ee7cffe771436379d6cee5a
   +- github.com/ajstarks/svgo#7a3c8b57fecb7bee36eee443630bb30bffbb37fc

From my understanding, I need to add or remove these dependencies somewhere,
but being unfamiliar with the Go language, I do not know exactly where they
should be added.

Can you please help with that? If my understanding is not right, please
correct me.
Thanks in advance.
-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296




Problem with :sdks:java:container:pullLicenses

2020-12-16 Thread Sonam Ramchand
Hi All!!
For ./gradlew :sdks:java:container:pullLicenses, I have been getting:

Process 'command './license_scripts/license_script.sh'' finished with
non-zero exit value 1

The issue is closely related to
https://issues.apache.org/jira/browse/BEAM-9913.
Does anyone have an idea about this issue? Please help :)
-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296




Re: "org.apache.kafka:kafka-clients:5.3.2-ccs" dependency issue.

2020-12-14 Thread Sonam Ramchand
Thanks for your quick response. Unfortunately, updating kafka-clients:1.0.0
to 2.4.1 on the PR does not resolve the issue. It seems there is some other
problem.

On Tue, Dec 15, 2020 at 2:03 AM Kyle Weaver  wrote:

> We recently upgraded kafka and kafka-clients to version 2.4.1 [1]. It
> looks like there are a couple places in your PR that use the old version
> kafka-clients:1.0.0 [2]. You will need to update your PR to use version
> 2.4.1 instead.
>
> [1]
> https://github.com/apache/beam/commit/8e6dae8105c7d8abaabf71f6529c604884c879d3
> [2] https://github.com/apache/beam/pull/12938
>
> On Mon, Dec 14, 2020 at 12:24 PM Sonam Ramchand <
> sonam.ramch...@venturedive.com> wrote:
>
>> Please refer to the link to understand the problem better
>> https://gradle.com/s/zaqcnvh2uiwga.
>>
>> Thanks!
>>
>> On Tue, Dec 15, 2020 at 1:03 AM Sonam Ramchand <
>> sonam.ramch...@venturedive.com> wrote:
>>
>>> Hi Devs,
>>> I have been getting:
>>>
>>> Could not resolve all dependencies for configuration
>>> ':sdks:java:container:dockerDependency'.
>>> > Could not find org.apache.kafka:kafka-clients:5.3.2-ccs.
>>> Searched in the following locations:
>>> -
>>> file:/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_PVR_Flink_Phrase/src/sdks/java/container/offline-repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://repo.maven.apache.org/maven2/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> file:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://jcenter.bintray.com/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://oss.sonatype.org/content/repositories/staging/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://repository.apache.org/snapshots/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> Required by:
>>> project :sdks:java:container
>>> > Could not find org.apache.kafka:kafka-clients:5.3.2-ccs.
>>> Searched in the following locations:
>>> -
>>> file:/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_PVR_Flink_Phrase/src/sdks/java/container/offline-repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://repo.maven.apache.org/maven2/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> file:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://jcenter.bintray.com/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://oss.sonatype.org/content/repositories/staging/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://repository.apache.org/snapshots/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> Required by:
>>> project :sdks:java:container > project :sdks:java:io:kafka
>>> > Could not find org.apache.kafka:kafka-clients:5.3.2-ccs.
>>> Searched in the following locations:
>>> -
>>> file:/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_PVR_Flink_Phrase/src/sdks/java/container/offline-repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://repo.maven.apache.org/maven2/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> file:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://jcenter.bintray.com/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://oss.sonatype.org/content/repositories/staging/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://repository.apache.org/snapshots/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> -
>>> https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
>>> Required by:
>>> project :sdks:java:container > project :sdks:java:io:kafka
>>

Re: "org.apache.kafka:kafka-clients:5.3.2-ccs" dependency issue.

2020-12-14 Thread Sonam Ramchand
Please refer to the link to understand the problem better
https://gradle.com/s/zaqcnvh2uiwga.

Thanks!

On Tue, Dec 15, 2020 at 1:03 AM Sonam Ramchand <
sonam.ramch...@venturedive.com> wrote:

> Hi Devs,
> I have been getting:
>
> Could not resolve all dependencies for configuration
> ':sdks:java:container:dockerDependency'.
> > Could not find org.apache.kafka:kafka-clients:5.3.2-ccs.
> Searched in the following locations:
> -
> file:/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_PVR_Flink_Phrase/src/sdks/java/container/offline-repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://repo.maven.apache.org/maven2/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> file:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://jcenter.bintray.com/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://oss.sonatype.org/content/repositories/staging/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://repository.apache.org/snapshots/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> Required by:
> project :sdks:java:container
> > Could not find org.apache.kafka:kafka-clients:5.3.2-ccs.
> Searched in the following locations:
> -
> file:/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_PVR_Flink_Phrase/src/sdks/java/container/offline-repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://repo.maven.apache.org/maven2/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> file:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://jcenter.bintray.com/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://oss.sonatype.org/content/repositories/staging/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://repository.apache.org/snapshots/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> Required by:
> project :sdks:java:container > project :sdks:java:io:kafka
> > Could not find org.apache.kafka:kafka-clients:5.3.2-ccs.
> Searched in the following locations:
> -
> file:/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_PVR_Flink_Phrase/src/sdks/java/container/offline-repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://repo.maven.apache.org/maven2/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> file:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://jcenter.bintray.com/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://oss.sonatype.org/content/repositories/staging/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://repository.apache.org/snapshots/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> -
> https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
> Required by:
> project :sdks:java:container > project :sdks:java:io:kafka
> project :sdks:java:container > project :sdks:java:io:kafka >
> io.confluent:kafka-schema-registry-client:5.3.2
>
> Even after I add compile "org.apache.kafka:kafka-clients:5.3.2-ccs" to the
> sdks:java:io:kafka build.gradle file, I get no luck. Any sort of quick help
> will be really appreciated.
>
> --
>
> Regards,
> *Sonam*
> Software Engineer
> Mobile: +92 3088337296
>
> <http://venturedive.com/>
>


-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296

<http://venturedive.com/>


"org.apache.kafka:kafka-clients:5.3.2-ccs" dependency issue.

2020-12-14 Thread Sonam Ramchand
Hi Devs,
I have been getting:

Could not resolve all dependencies for configuration
':sdks:java:container:dockerDependency'.
> Could not find org.apache.kafka:kafka-clients:5.3.2-ccs.
Searched in the following locations:
-
file:/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_PVR_Flink_Phrase/src/sdks/java/container/offline-repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://repo.maven.apache.org/maven2/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
file:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://jcenter.bintray.com/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://oss.sonatype.org/content/repositories/staging/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://repository.apache.org/snapshots/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
Required by:
project :sdks:java:container
> Could not find org.apache.kafka:kafka-clients:5.3.2-ccs.
Searched in the following locations:
-
file:/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_PVR_Flink_Phrase/src/sdks/java/container/offline-repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://repo.maven.apache.org/maven2/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
file:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://jcenter.bintray.com/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://oss.sonatype.org/content/repositories/staging/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://repository.apache.org/snapshots/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
Required by:
project :sdks:java:container > project :sdks:java:io:kafka
> Could not find org.apache.kafka:kafka-clients:5.3.2-ccs.
Searched in the following locations:
-
file:/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Python_PVR_Flink_Phrase/src/sdks/java/container/offline-repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://repo.maven.apache.org/maven2/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
file:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://jcenter.bintray.com/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://oss.sonatype.org/content/repositories/staging/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://repository.apache.org/snapshots/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
-
https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka-clients/5.3.2-ccs/kafka-clients-5.3.2-ccs.pom
Required by:
project :sdks:java:container > project :sdks:java:io:kafka
project :sdks:java:container > project :sdks:java:io:kafka >
io.confluent:kafka-schema-registry-client:5.3.2

Even after I add compile "org.apache.kafka:kafka-clients:5.3.2-ccs" to the
sdks:java:io:kafka build.gradle file, I get no luck. Any sort of quick help
will be really appreciated.

-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296




Implementing ARR_AGG

2020-12-07 Thread Sonam Ramchand
Hi Devs,
I have tried to implement the ARR_AGG function for Zetasql dialect by
following the STRING_AGG implementation (
https://github.com/apache/beam/pull/11895).
Draft PR for ARR_AGG is (https://github.com/apache/beam/pull/13483). When i
try to run the test,

@Test
public void testArrayAggregation() {
  String sql =
      "SELECT ARRAY_AGG(x) AS array_agg\n" +
      "FROM UNNEST([2, 1, -2, 3, -2, 1, 2]) AS x";

  ZetaSQLQueryPlanner zetaSQLQueryPlanner = new ZetaSQLQueryPlanner(config);
  BeamRelNode beamRelNode = zetaSQLQueryPlanner.convertToBeamRel(sql);
  PCollection<Row> stream = BeamSqlRelUtils.toPCollection(pipeline, beamRelNode);

  Schema schema =
      Schema.builder()
          .addArrayField("array_field", FieldType.of(Schema.TypeName.ARRAY))
          .build();
  PAssert.that(stream)
      .containsInAnyOrder(Row.withSchema(schema).addArray(2, 1, -2, 3, -2, 1, 2).build());

  pipeline.run().waitUntilFinish(Duration.standardMinutes(PIPELINE_EXECUTION_WAITTIME_MINUTES));
}

I am getting an error:

java.lang.AssertionError: type mismatch:
aggCall type:
BIGINT NOT NULL ARRAY NOT NULL
inferred type:
ARRAY NOT NULL
at org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.util.Litmus$1.fail(Litmus.java:31)
at org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.plan.RelOptUtil.eq(RelOptUtil.java:1958)
at org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Aggregate.typeMatchesInferred(Aggregate.java:434)
at org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.core.Aggregate.<init>(Aggregate.java:159)
at org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.rel.logical.LogicalAggregate.<init>(LogicalAggregate.java:65)
at
org.apache.beam.sdk.extensions.sql.zetasql.translation.AggregateScanConverter.convert(AggregateScanConverter.java:113)
at
org.apache.beam.sdk.extensions.sql.zetasql.translation.AggregateScanConverter.convert(AggregateScanConverter.java:50)
at
org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter.convertNode(QueryStatementConverter.java:102)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.Collections$2.tryAdvance(Collections.java:4719)
at java.util.Collections$2.forEachRemaining(Collections.java:4727)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at
java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
at
org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter.convertNode(QueryStatementConverter.java:101)
at
org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter.convert(QueryStatementConverter.java:89)
at
org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter.convertRootQuery(QueryStatementConverter.java:55)
at
org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLPlannerImpl.rel(ZetaSQLPlannerImpl.java:141)
at
org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner.convertToBeamRelInternal(ZetaSQLQueryPlanner.java:180)
at
org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner.convertToBeamRel(ZetaSQLQueryPlanner.java:168)
at
org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner.convertToBeamRel(ZetaSQLQueryPlanner.java:152)
at
org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlDialectSpecTest.testArrayAggregation(ZetaSqlDialectSpecTest.java:4071)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
at
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
at
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at
org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:322)
at
org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:266)
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:305)
at
org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:365)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
at org.junit.runners.ParentRunner$4.run(ParentRunner.java:3
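Reading the mismatch: Calcite reports that the aggregate call produces `BIGINT NOT NULL ARRAY` (an array of INT64), while the converter inferred a bare `ARRAY`; notably, the test schema above also declares the field as an array of `ARRAY` rather than an array of `INT64`. As a type-level illustration only (plain Java, hypothetical with respect to the actual fix), ARRAY_AGG over an INT64 column yields one flat list of longs, not a nested list:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class Main {
    public static void main(String[] args) {
        // ARRAY_AGG over an INT64 column: element type Long, result ARRAY<INT64>.
        List<Long> arrayAgg =
                Stream.of(2L, 1L, -2L, 3L, -2L, 1L, 2L).collect(Collectors.toList());
        System.out.println(arrayAgg);        // one flat list of longs
        System.out.println(arrayAgg.get(0)); // each element is an INT64, not an array
    }
}
```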

Query regarding Array_Agg impl

2020-11-25 Thread Sonam Ramchand
Hi Devs,
I am trying to implement Array_Agg(
https://cloud.google.com/bigquery/docs/reference/standard-sql/aggregate_functions#array_agg
) for Beam SQL ZetaSQL dialect, as CombineFn.

Rough pseudocode:

public static class ArrayAgg<T> extends CombineFn<T, List<T>, List<T>> { /* todo */ }

But then I came to know we cannot use generics for UDAFs as per
https://issues.apache.org/jira/browse/BEAM-11059, is there any alternative?
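One workaround, if generics are off the table, is to specialize the aggregation to a concrete element type. Below is a plain-Java sketch of the CombineFn-style contract (createAccumulator / addInput / mergeAccumulators / extractOutput) specialized to Long; the method names mirror Beam's CombineFn, but the snippet deliberately has no Beam dependency and is illustrative only:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Main {
    // Non-generic array_agg, specialized to Long instead of a type parameter T.
    static List<Long> createAccumulator() { return new ArrayList<>(); }
    static List<Long> addInput(List<Long> acc, Long in) { acc.add(in); return acc; }
    static List<Long> mergeAccumulators(List<Long> a, List<Long> b) { a.addAll(b); return a; }
    static List<Long> extractOutput(List<Long> acc) { return acc; }

    public static void main(String[] args) {
        List<Long> acc = createAccumulator();
        for (long x : new long[] {2, 1, -2}) acc = addInput(acc, x);
        // Simulate merging a second (partial) accumulator, as a runner would.
        acc = mergeAccumulators(acc, addInput(createAccumulator(), 3L));
        System.out.println(extractOutput(acc));
    }
}
```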

Thanks!
-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296




Re: Gradle Build issue

2020-11-25 Thread Sonam Ramchand
Thanks Michal!
It helped.

On Wed, Nov 25, 2020 at 1:23 PM Michał Walenia 
wrote:

> Hi,
> are you using your local installation of Gradle or the wrapper supplied in
> the repo? If you're running the task with `gradle` command, try using
> `./gradlew` instead, this will use the wrapper.
>
> Have a good day,
> Michal
>
>
> On Wed, Nov 25, 2020 at 8:44 AM Sonam Ramchand <
> sonam.ramch...@venturedive.com> wrote:
>
>> Hi Devs,
>> When I run the "Gradle build" command after I take the latest pull, I
>> have been getting the following error.
>>
>> FAILURE: Build failed with an exception.
>>
>> * Where:
>> Build file '/home/vend/ApacheBeam/beam/buildSrc/build.gradle' line: 23
>>
>> * What went wrong:
>> An exception occurred applying plugin request [id: 'com.diffplug.spotless', version: '5.6.1']
>> > Failed to apply plugin [id 'com.diffplug.spotless']
>>    > Spotless requires Gradle 5.4 or newer, this was 4.4.1
>>
>> Any idea about this?
>>
>> Thank you!
>> --
>>
>> Regards,
>> *Sonam*
>> Software Engineer
>> Mobile: +92 3088337296
>>
>> <http://venturedive.com/>
>>
>
>
> --
>
> Michał Walenia
> Polidea <https://www.polidea.com/> | Software Engineer
>
> M: +48 791 432 002
> E: michal.wale...@polidea.com
>
> Unique Tech
> Check out our projects! <https://www.polidea.com/our-work>
>


-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296

<http://venturedive.com/>


Gradle Build issue

2020-11-24 Thread Sonam Ramchand
Hi Devs,
When I run the "Gradle build" command after I take the latest pull, I have
been getting the following error.

FAILURE: Build failed with an exception.

* Where:
Build file '/home/vend/ApacheBeam/beam/buildSrc/build.gradle' line: 23

* What went wrong:
An exception occurred applying plugin request [id: 'com.diffplug.spotless', version: '5.6.1']
> Failed to apply plugin [id 'com.diffplug.spotless']
   > Spotless requires Gradle 5.4 or newer, this was 4.4.1

Any idea about this?

Thank you!
--

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296




Re: Question about LOGICAL_AND

2020-11-17 Thread Sonam Ramchand
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
at java.lang.Thread.run(Thread.java:748)

org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlDialectSpecTest >
testLogicalAndZetaSQL FAILED
java.lang.ClassCastException at ZetaSqlDialectSpecTest.java:4334
1 test completed, 1 failed
> Task :sdks:java:extensions:sql:zetasql:test FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:zetasql:test'.
> There were failing tests. See the report at:
file:///home/vend/ApacheBeam/beam/sdks/java/extensions/sql/zetasql/build/reports/tests/test/index.html

Do you have any idea why this is happening?


On Fri, Nov 13, 2020 at 11:42 PM Robin Qiu  wrote:

> Oh I see. Thanks for the clarification, Kenn! Yeah, the CombineFn for
> LOGICAL_AND is still to be implemented.
>
> On Fri, Nov 13, 2020 at 10:00 AM Kenneth Knowles  wrote:
>
>> Some clarification: LOGICAL_AND is a ZetaSQL/BigQuery aggregate function:
>> https://cloud.google.com/bigquery/docs/reference/standard-sql/functions-and-operators#logical_and
>>
>> So it needs to be implemented as a CombineFn. Here are some example PRs
>> that do similar things: https://github.com/apache/beam/pulls?q=BIT_OR
>>
>> Kenn
>>
>> On Thu, Nov 12, 2020 at 12:48 PM Rui Wang  wrote:
>>
>>> Or the question is, which Beam SQL dialect are you using?
>>>
>>>
>>> -Rui
>>>
>>> On Thu, Nov 12, 2020 at 12:41 PM Robin Qiu  wrote:
>>>
>>>> Hi Sonam, AND operator is already defined by Calcite and the mapping is
>>>> here:
>>>> https://github.com/apache/beam/blob/816017e44e3209d334f4f3b2bc3fa829663c530e/sdks/java/extensions/sql/zetasql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SqlOperatorMappingTable.java#L39.
>>>> Is this what you are looking for?
>>>>
>>>> On Thu, Nov 12, 2020 at 12:00 PM Kyle Weaver 
>>>> wrote:
>>>>
>>>>> If you're defining a new built-in function in ZetaSQL, you can define
>>>>> an operator for it here:
>>>>> https://github.com/apache/beam/blob/master/sdks/java/extensions/sql/zetasql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SqlOperators.java
>>>>>
>>>>> Then add the operator to the table here:
>>>>> https://github.com/apache/beam/blob/master/sdks/java/extensions/sql/zetasql/src/main/java/org/apache/beam/sdk/extensions/sql/zetasql/translation/SqlOperatorMappingTable.java
>>>>>
>>>>> On Thu, Nov 12, 2020 at 11:14 AM Sonam Ramchand <
>>>>> sonam.ramch...@venturedive.com> wrote:
>>>>>
>>>>>> There is no LOGICAL_AND operator in SqlStdOperatorTable, is there any
>>>>>> other way to implement LOGICAL_AND?
>>>>>>
>>>>>> --
>>>>>>
>>>>>> Regards,
>>>>>> *Sonam*
>>>>>> Software Engineer
>>>>>> Mobile: +92 3088337296 <+92%20308%208337296>
>>>>>>
>>>>>> <http://venturedive.com/>
>>>>>>
>>>>>

-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296

<http://venturedive.com/>
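Kenn's suggestion in the thread above (implement LOGICAL_AND as a CombineFn, mirroring the BIT_OR PRs) boils down to four operations. Here is a dependency-free sketch of that accumulator logic; it uses plain Java rather than extending Beam's `org.apache.beam.sdk.transforms.Combine.CombineFn`, and the class and method names only mirror that API for illustration (note also that ZetaSQL's NULL handling for empty groups is not modeled here):

```java
// Plain-Java sketch of the accumulator logic a LOGICAL_AND CombineFn needs.
// A real implementation would extend org.apache.beam.sdk.transforms.Combine.CombineFn
// and be registered with an operator; names here are illustrative only.
public class LogicalAndFn {
    // Start optimistic: AND over no inputs seen so far is TRUE.
    public static boolean createAccumulator() {
        return true;
    }

    // Fold one input row into the accumulator.
    public static boolean addInput(boolean accum, boolean input) {
        return accum && input;
    }

    // Combine partial results produced by parallel bundles.
    public static boolean mergeAccumulators(Iterable<Boolean> accums) {
        boolean result = true;
        for (boolean a : accums) {
            result = result && a;
        }
        return result;
    }

    public static boolean extractOutput(boolean accum) {
        return accum;
    }

    public static void main(String[] args) {
        boolean acc = createAccumulator();
        for (boolean b : new boolean[] {true, true, false}) {
            acc = addInput(acc, b);
        }
        // Any false input makes the whole aggregate false.
        System.out.println(extractOutput(acc)); // prints: false
    }
}
```

Because AND is commutative and associative, `mergeAccumulators` can combine partial results in any order, which is what makes the function safe to run as a distributed CombineFn.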




Re: Getting ClassCastException

2020-11-16 Thread Sonam Ramchand
java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
at java.lang.Thread.run(Thread.java:748)

org.apache.beam.sdk.extensions.sql.zetasql.ZetaSqlDialectSpecTest >
testLogicalAndZetaSQL FAILED
java.lang.ClassCastException at ZetaSqlDialectSpecTest.java:4334
1 test completed, 1 failed
> Task :sdks:java:extensions:sql:zetasql:test FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:zetasql:test'.
> There were failing tests. See the report at:
file:///home/vend/ApacheBeam/beam/sdks/java/extensions/sql/zetasql/build/reports/tests/test/index.html

On Fri, Nov 13, 2020 at 8:47 PM Tomo Suzuki  wrote:

> Hi Sonam,
>
> Do you want to share the stack trace of the error? It would help show which
> line hit the error and the surrounding function calls.
>
> On Fri, Nov 13, 2020 at 6:40 AM Sonam Ramchand <
> sonam.ramch...@venturedive.com> wrote:
>
>> Hi Devs,
>>
>> I am trying to implement the LOGICAL_AND function for the Beam SQL ZetaSQL
>> dialect, as a CombineFn.
>>
>> https://github.com/sonam-vend/beam/commit/9ad8ee1d8fa617aca7fcafc8e7efe8bf388b3afb
>>
>> When I try to execute testLogicalAndZetaSQL, I am getting
>> org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators$1
>> cannot be cast to
>> org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlAggFunction
>> java.lang.ClassCastException:
>> org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators$1
>> cannot be cast to
>> org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlAggFunction
>>
>> I cannot understand the reason behind the occurrence of this exception.
>>
>> Any sort of help would be appreciated.
>>
>> Regards,
>> *Sonam*
>> Software Engineer
>> Mobile: +92 3088337296 <+92%20308%208337296>
>>
>> <http://venturedive.com/>
>>
>
>
> --
> Regards,
> Tomo
>


-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296

<http://venturedive.com/>


Getting ClassCastException

2020-11-13 Thread Sonam Ramchand
Hi Devs,

I am trying to implement the LOGICAL_AND function for the Beam SQL ZetaSQL
dialect, as a CombineFn.
https://github.com/sonam-vend/beam/commit/9ad8ee1d8fa617aca7fcafc8e7efe8bf388b3afb

When I try to execute testLogicalAndZetaSQL, I am getting
org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators$1
cannot be cast to
org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlAggFunction
java.lang.ClassCastException:
org.apache.beam.sdk.extensions.sql.zetasql.translation.SqlOperators$1
cannot be cast to
org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.SqlAggFunction

I cannot understand the reason behind the occurrence of this exception.

Any sort of help would be appreciated.

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296



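For reference, the `SqlOperators$1` in the message above is an anonymous inner class: the operator is being created as an anonymous subclass of the base operator type, while the aggregate code path casts it to `SqlAggFunction`, a more specific subtype it does not extend. A minimal stand-in reproduces the mechanism (the `Operator` and `AggFunction` classes here are hypothetical placeholders for the vendored Calcite `SqlOperator` / `SqlAggFunction`):

```java
// Minimal reproduction of the failure mode: an anonymous subclass of a base
// type cannot be cast to a more specific subtype it does not extend.
// "Operator" and "AggFunction" are placeholders, not the real Calcite types.
public class CastDemo {
    static class Operator {}
    static class AggFunction extends Operator {}

    public static void main(String[] args) {
        Operator op = new Operator() {}; // anonymous subclass, i.e. CastDemo$1
        try {
            AggFunction agg = (AggFunction) op; // CastDemo$1 is not an AggFunction
            System.out.println("cast succeeded");
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as in the failing Beam test");
        }
    }
}
```

The direction this suggests for the fix: the registered aggregate operator has to actually be (or wrap) a `SqlAggFunction` instance, rather than a generic anonymous operator.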

Question about LOGICAL_AND

2020-11-12 Thread Sonam Ramchand
There is no LOGICAL_AND operator in SqlStdOperatorTable, is there any other
way to implement LOGICAL_AND?

-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296




Problem being encountered while running the Query with COUNTIF function

2020-11-12 Thread Sonam Ramchand
Hi  Devs,

I am trying to implement the COUNTIF function for the Beam SQL ZetaSQL
dialect, as a CombineFn.

When I try to run the test query (SELECT COUNTIF(f_long > 0) AS countif_no
FROM PCOLLECTION GROUP BY f_int2), I am getting the exception:
Caused by:
org.apache.beam.vendor.calcite.v1_20_0.org.apache.calcite.sql.validate.SqlValidatorException:
No match found for function signature COUNTIF().

Can anybody help me in knowing the reason behind this?

Thank you

-- 

Regards,
*Sonam*
Software Engineer
Mobile: +92 3088337296
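The validator error above ("No match found for function signature COUNTIF()") means the planner cannot resolve the function name at all, which points at the operator-registration step described earlier in this archive (adding the operator to `SqlOperatorMappingTable`) rather than at the aggregation logic. That logic itself is small; here is a dependency-free sketch in plain Java, mirroring the shape of a Beam `Combine.CombineFn` (names are illustrative, not the real Beam classes):

```java
import java.util.Arrays;

// Plain-Java sketch of COUNTIF's accumulator logic: count rows whose
// condition evaluates to TRUE. A real implementation would extend
// org.apache.beam.sdk.transforms.Combine.CombineFn; names are illustrative.
public class CountIfFn {
    public static long createAccumulator() {
        return 0L;
    }

    // NULL conditions (modeled as a nullable Boolean) are not counted.
    public static long addInput(long accum, Boolean condition) {
        return Boolean.TRUE.equals(condition) ? accum + 1 : accum;
    }

    // Partial counts from parallel bundles simply sum.
    public static long mergeAccumulators(Iterable<Long> accums) {
        long total = 0L;
        for (long a : accums) {
            total += a;
        }
        return total;
    }

    public static long extractOutput(long accum) {
        return accum;
    }

    public static void main(String[] args) {
        long acc = createAccumulator();
        for (Boolean c : Arrays.asList(true, false, true, null, true)) {
            acc = addInput(acc, c);
        }
        System.out.println(extractOutput(acc)); // prints: 3
    }
}
```

Once the name resolves to a registered operator, this counting logic is the only aggregation behavior the CombineFn needs.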