Re: Query about JdbcIO.readRows()

2019-08-02 Thread Vishwas Bm
Hi Kishor, + dev (d...@beam.apache.org) This looks like a bug. The attribute statementPreparator is nullable; it should have been handled in the same way as in the expand method of the Read class. *Thanks & Regards,* *Vishwas * On Fri, Aug 2, 2019 at 2:48 PM Kishor Joshi wrote: > Hi, > > I am
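The fix described above — handling the nullable statementPreparator the same way Read.expand() does — amounts to a null-guard before the callback is invoked. A minimal sketch of that pattern (the names below are hypothetical stand-ins, not Beam's actual JdbcIO types):

```java
import java.util.function.Consumer;

// Hypothetical stand-in for JdbcIO's StatementPreparator: the statement
// callback should only be invoked when the user actually supplied one.
public class NullableGuard {
    static String prepare(String baseQuery, Consumer<StringBuilder> preparator) {
        StringBuilder stmt = new StringBuilder(baseQuery);
        if (preparator != null) { // the guard the thread says is missing in readRows()
            preparator.accept(stmt);
        }
        return stmt.toString();
    }

    public static void main(String[] args) {
        // With no preparator the query passes through unchanged instead of failing.
        System.out.println(prepare("SELECT * FROM t", null));
        System.out.println(prepare("SELECT * FROM t", s -> s.append(" WHERE id = ?")));
    }
}
```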

Re:

2019-06-24 Thread Vishwas Bm
Hi Rui, I was trying out a use case where we have a map with key as string and value as Row. When we try to access a primitive field in the Row, we get the below exception. Caused by: java.lang.NoSuchFieldException: color at java.lang.Class.getDeclaredField(Class.java:2070) at
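The stack trace points at Class.getDeclaredField, which throws this way whenever reflective field lookup is done against a class that does not declare the field. A minimal stdlib reproduction of the failure mode (the class below is hypothetical, unrelated to Beam's schema code):

```java
public class FieldLookupDemo {
    static class Pixel { int x; int y; } // deliberately has no "color" field

    public static void main(String[] args) {
        try {
            // Same reflective call as in the stack trace above.
            Pixel.class.getDeclaredField("color");
        } catch (NoSuchFieldException e) {
            System.out.println("NoSuchFieldException: " + e.getMessage());
        }
    }
}
```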

Re: How to configure TLS connections for Dataflow job that use Kafka and Schema Registry

2019-05-09 Thread Vishwas Bm
> Does Apache Beam use Confluent Schema Registry client internally ? > > Yohei Onishi > > > On Thu, May 9, 2019 at 1:12 PM Vishwas Bm wrote: > >> Hi Yohei, >> >> I had tried some time back with direct-runner and faced the same issue as >> in https://git

Re: How to configure TLS connections for Dataflow job that use Kafka and Schema Registry

2019-05-08 Thread Vishwas Bm
Hi Yohei, I had tried some time back with direct-runner and faced the same issue as in https://github.com/confluentinc/schema-registry/issues/943 when interacting with TLS enabled Kafka and SchemaRegistry. So I had set the environment variable JAVA_TOOL_OPTIONS with the required properties and
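JAVA_TOOL_OPTIONS is picked up by any JVM launched with that environment variable set, which is how `-D` TLS properties reach the Kafka and Schema Registry clients inside the workers. The same properties can be set in-process; a sketch with placeholder paths and passwords (not real values):

```java
public class TlsProps {
    public static void main(String[] args) {
        // In-process equivalent of:
        // export JAVA_TOOL_OPTIONS="-Djavax.net.ssl.trustStore=... -Djavax.net.ssl.trustStorePassword=..."
        System.setProperty("javax.net.ssl.trustStore", "/etc/ssl/client.truststore.jks"); // placeholder path
        System.setProperty("javax.net.ssl.trustStorePassword", "changeit");               // placeholder password
        System.setProperty("javax.net.ssl.keyStore", "/etc/ssl/client.keystore.jks");     // placeholder path
        System.out.println(System.getProperty("javax.net.ssl.trustStore"));
    }
}
```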

Re: AvroUtils converting generic record to Beam Row causes class cast exception

2019-04-22 Thread Vishwas Bm
> small function to do schema conversion for your need. > > [1] > https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/utils/AvroUtils.java#L672 > > > Rui > > > On Mon, Apr 15, 2019 at 7:11 PM Vishwas Bm wrote: > >&

Re: AvroUtils converting generic record to Beam Row causes class cast exception

2019-04-15 Thread Vishwas Bm
e to millis before calling > "AvroUtils.toBeamRowStrict(genericRecord, this.beamSchema)", your exception > will gone. > > -Rui > > > On Mon, Apr 15, 2019 at 10:28 AM Lukasz Cwik wrote: > >> +dev >> >> On Sun, Apr 14, 2019 at 10:29 PM

AvroUtils converting generic record to Beam Row causes class cast exception

2019-04-14 Thread Vishwas Bm
Hi, Below is my pipeline: KafkaSource (KafkaIO.read) --> ParDo ---> BeamSql ---> KafkaSink (KafkaIO.write) The Avro schema of the topic has a field of logical type timestamp-millis. The KafkaIO.read transform is creating a KafkaRecord, where this field is being converted
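The follow-up in this thread suggests converting the timestamp value to millis before calling AvroUtils.toBeamRowStrict on the GenericRecord. Assuming the value arrives as epoch microseconds (an assumption — the thread is truncated), the conversion is a one-liner; a stdlib sketch:

```java
public class TimestampFix {
    // Avro's timestamp-micros logical type is a long count of microseconds since
    // the epoch; Beam's DATETIME field expects milliseconds, so divide by 1000
    // (floorDiv keeps pre-epoch values correct for negative longs).
    static long microsToMillis(long micros) {
        return Math.floorDiv(micros, 1000L);
    }

    public static void main(String[] args) {
        long micros = 1_555_300_000_000_000L; // e.g. a value read from the GenericRecord
        System.out.println(microsToMillis(micros)); // 1555300000000
    }
}
```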

Re: Moving to spark 2.4

2018-12-07 Thread Vishwas Bm
ël > > On Thu, Dec 6, 2018 at 12:14 PM Jean-Baptiste Onofré > wrote: > > > > Hi Vishwas > > > > Yes, I already started the update. > > > > Regards > > JB > > > > On 06/12/2018 07:39, Vishwas Bm wrote: > > > Hi, >

Moving to spark 2.4

2018-12-05 Thread Vishwas Bm
Hi, Currently I see that the Spark version dependency used in Beam is "2.3.2". As Spark 2.4 is released now, is there a plan to upgrade Beam's Spark dependency? *Thanks & Regards,* *Vishwas * *Mob : 9164886653*

Re: Issue with GroupByKey in BeamSql using SparkRunner

2018-10-21 Thread Vishwas Bm
ort back if this fixes your issue. > > > On Tue, Oct 9, 2018 at 6:45 PM Vishwas Bm wrote: > > > > Hi Kenn, > > > > We are using Beam 2.6 and using Spark_submit to submit jobs to Spark 2.2 > cluster on Kubernetes. > > > > > > On Tue, Oct 9, 2018,

Re: Issue with GroupByKey in BeamSql using SparkRunner

2018-10-09 Thread Vishwas Bm
what version of Beam you are using? > > Kenn > > On Tue, Oct 9, 2018 at 3:18 AM Vishwas Bm wrote: > >> We are trying to setup a pipeline with using BeamSql and the trigger used >> is default (AfterWatermark crosses the window). >> Below is the pipeline: >> >

Issue with GroupByKey in BeamSql using SparkRunner

2018-10-09 Thread Vishwas Bm
We are trying to set up a pipeline using BeamSql, with the default trigger (AfterWatermark crosses the window). Below is the pipeline: KafkaSource (KafkaIO) ---> Windowing (FixedWindow 1min) ---> BeamSql ---> KafkaSink (KafkaIO) We are using the Spark runner for this. The BeamSql query
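For context on the windowing step above: FixedWindows assigns each element to the window containing its timestamp, and GroupByKey (which BeamSql's GROUP BY lowers to) only fires once the watermark passes that window's end. A simplified stdlib sketch of the window assignment (not Beam's actual IntervalWindow code):

```java
public class FixedWindowDemo {
    // Start of the fixed window of the given size that contains the timestamp.
    static long windowStart(long timestampMillis, long sizeMillis) {
        return timestampMillis - Math.floorMod(timestampMillis, sizeMillis);
    }

    public static void main(String[] args) {
        long oneMinute = 60_000L;
        // Two events 10s apart fall into the same 1-minute window, so they are
        // grouped together and emitted only when that window closes.
        System.out.println(windowStart(120_500L, oneMinute)); // 120000
        System.out.println(windowStart(130_500L, oneMinute)); // 120000
    }
}
```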

Re: Kafka Avro Schema Registry Support

2018-09-28 Thread Vishwas Bm
be Object, instead of String. > > Btw, it might be better to use GenericAvroDeserializer or > SpecificAvroDeserializer from the same package. > > > On Thu, Sep 27, 2018 at 10:31 AM Vishwas Bm wrote: > >> >> Hi Raghu, >> >> The deserializer is provided

Re: [Discuss] Upgrade story for Beam's execution engines

2018-09-17 Thread Vishwas Bm
Hi, As part of our POC, we are testing the Spark runner. We tried to submit a job to a Spark (2.2) cluster running on a K8s cluster. As per our findings during this POC, the Beam capability matrix https://beam.apache.org/documentation/runners/capability-matrix/ is not updated. Below features

Re: Lateness for Spark

2018-09-09 Thread Vishwas Bm
Hi, Thanks for the reply. As per the Beam capability matrix, only processing-time triggers are supported by the Spark runner. As this page is not updated, what other triggers are supported in the Beam Spark runner? *Thanks & Regards,* *Vishwas *

Re: Lateness for Spark

2018-09-07 Thread Vishwas Bm
Hi, In our use case the watermark is the processing time. As per the Beam capability matrix ( https://beam.apache.org/documentation/runners/capability-matrix/ ), lateness is not supported by the Spark runner. But as per the output in our use case, we are able to see late data getting emitted. So we
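In the Beam model, allowed lateness controls when a late element is dropped rather than emitted into a late pane; whether a given runner honors this is exactly what the thread above is questioning. A simplified, runner-agnostic sketch of the rule (hypothetical helper, not Beam's implementation):

```java
public class LatenessDemo {
    // An element is droppably late when its window's end is more than
    // allowedLateness behind the watermark (simplified model of Beam's rule).
    static boolean isDroppablyLate(long windowEndMillis, long watermarkMillis,
                                   long allowedLatenessMillis) {
        return windowEndMillis + allowedLatenessMillis < watermarkMillis;
    }

    public static void main(String[] args) {
        // Watermark 1s past window end, 5s lateness allowed: element still emitted.
        System.out.println(isDroppablyLate(60_000L, 61_000L, 5_000L)); // false
        // Watermark 10s past window end: element is dropped.
        System.out.println(isDroppablyLate(60_000L, 70_000L, 5_000L)); // true
    }
}
```

Note that with a watermark derived from processing time, elements effectively never arrive behind the watermark, which may explain why late data appears to be emitted regardless.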