Hi Kishor,
+ dev (d...@beam.apache.org)
This looks like a bug: the attribute statementPreparator is nullable, so it
should be handled in the same way as in the expand method of the Read class.
*Thanks & Regards,*
*Vishwas *
On Fri, Aug 2, 2019 at 2:48 PM Kishor Joshi wrote:
> Hi,
>
> I am
Hi Rui,
I was trying out a use case where we have a map with key as string and
value as Row. When we try to access a primitive field in the Row, we get
the exception below.
Caused by: java.lang.NoSuchFieldException: color
at java.lang.Class.getDeclaredField(Class.java:2070)
at
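For context on the stack trace above: `Class.getDeclaredField` only resolves fields declared directly on the class it is called on, so looking up a Row field name (here `color`) against a runtime class that does not declare it throws `NoSuchFieldException`. A minimal self-contained sketch of that failure mode (plain Java, no Beam types; the `Pojo` class and its field names are illustrative, not from the original code):

```java
// Demonstrates why Class.getDeclaredField("color") fails: the lookup
// is against a class that must declare the field directly.
// Pojo and its fields are illustrative names only.
public class FieldLookupDemo {
    static class Pojo {
        String shape; // note: no field named "color"
    }

    public static void main(String[] args) {
        try {
            Pojo.class.getDeclaredField("color");
            System.out.println("found");
        } catch (NoSuchFieldException e) {
            // Same exception type as in the stack trace above;
            // the message is the missing field's name.
            System.out.println("NoSuchFieldException: " + e.getMessage());
        }
    }
}
```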
> Does Apache Beam use Confluent Schema Registry client internally ?
>
> Yohei Onishi
>
>
> On Thu, May 9, 2019 at 1:12 PM Vishwas Bm wrote:
>
>> Hi Yohei,
>>
>> I had tried some time back with direct-runner and faced the same issue as
>> in https://git
Hi Yohei,
I had tried some time back with direct-runner and faced the same issue as
in https://github.com/confluentinc/schema-registry/issues/943 when
interacting with TLS enabled Kafka and SchemaRegistry.
So I had set the environment variable JAVA_TOOL_OPTIONS with the required
properties and
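The workaround above passes TLS settings through the JAVA_TOOL_OPTIONS environment variable (e.g. `-Djavax.net.ssl.trustStore=...`), which the JVM picks up before any application code runs. As an in-process alternative, the same standard JSSE system properties can be set programmatically before the Kafka / Schema Registry clients are created. The paths and password below are placeholders, not values from the original thread:

```java
// Sets the standard JSSE trust-store properties that JAVA_TOOL_OPTIONS
// would otherwise inject, e.g.:
//   JAVA_TOOL_OPTIONS="-Djavax.net.ssl.trustStore=/etc/ssl/kafka.truststore.jks
//                      -Djavax.net.ssl.trustStorePassword=changeit"
// Path and password here are illustrative placeholders.
public class TlsPropsDemo {
    public static void configureTrustStore(String path, String password) {
        System.setProperty("javax.net.ssl.trustStore", path);
        System.setProperty("javax.net.ssl.trustStorePassword", password);
    }

    public static void main(String[] args) {
        configureTrustStore("/etc/ssl/kafka.truststore.jks", "changeit");
        System.out.println(System.getProperty("javax.net.ssl.trustStore"));
    }
}
```

This must run before the first TLS handshake; properties set afterwards are not picked up by an already-initialized SSL context.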
> small function to do schema conversion for your needs.
>
> [1]
> https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/utils/AvroUtils.java#L672
>
>
> Rui
>
>
> On Mon, Apr 15, 2019 at 7:11 PM Vishwas Bm wrote:
>
>
e to millis before calling
> "AvroUtils.toBeamRowStrict(genericRecord, this.beamSchema)", your exception
> will be gone.
>
> -Rui
>
>
> On Mon, Apr 15, 2019 at 10:28 AM Lukasz Cwik wrote:
>
>> +dev
>>
>> On Sun, Apr 14, 2019 at 10:29 PM
Hi,
Below is my pipeline:
KafkaSource (KafkaIO.read) ---> ParDo ---> BeamSql
---> KafkaSink (KafkaIO.write)
The Avro schema of the topic has a field of logical type timestamp-millis.
The KafkaIO.read transform creates a KafkaRecord,
where this field is being converted
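Rui's suggestion earlier in this thread — convert the logical-type value to epoch millis before calling "AvroUtils.toBeamRowStrict(genericRecord, this.beamSchema)" — can be sketched without the Beam or Avro classes. Avro's timestamp-micros values and java.time instants both reduce to a long epoch-millis value; the helper below is a hypothetical illustration using only java.time, not the thread's actual code:

```java
import java.time.Instant;

// Converts common logical-timestamp representations to epoch millis,
// the form to produce before building the Beam Row.
// Class and method names are illustrative.
public class TimestampConvert {
    // Avro timestamp-micros (long) -> epoch millis
    public static long microsToMillis(long micros) {
        return Math.floorDiv(micros, 1000L);
    }

    // java.time.Instant -> epoch millis
    public static long instantToMillis(Instant ts) {
        return ts.toEpochMilli();
    }
}
```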
ël
>
> On Thu, Dec 6, 2018 at 12:14 PM Jean-Baptiste Onofré
> wrote:
> >
> > Hi Vishwas
> >
> > Yes, I already started the update.
> >
> > Regards
> > JB
> >
> > On 06/12/2018 07:39, Vishwas Bm wrote:
> > > Hi,
>
Hi,
Currently I see that the Spark version dependency used in Beam is "2.3.2".
As Spark 2.4 is released now, is there a plan to upgrade the Beam Spark
dependency?
*Thanks & Regards,*
*Vishwas *
*Mob : 9164886653*
ort back if this fixes your issue.
>
>
> On Tue, Oct 9, 2018 at 6:45 PM Vishwas Bm wrote:
> >
> > Hi Kenn,
> >
> > We are using Beam 2.6 and using spark-submit to submit jobs to a Spark 2.2
> cluster on Kubernetes.
> >
> >
> > On Tue, Oct 9, 2018,
What version of Beam are you using?
>
> Kenn
>
> On Tue, Oct 9, 2018 at 3:18 AM Vishwas Bm wrote:
>
>> We are trying to set up a pipeline using BeamSql, and the trigger used
>> is the default (AfterWatermark crosses the window).
>> Below is the pipeline:
>>
>
We are trying to set up a pipeline using BeamSql, and the trigger used
is the default (AfterWatermark crosses the window).
Below is the pipeline:
KafkaSource (KafkaIO) ---> Windowing (FixedWindow 1min) ---> BeamSql
---> KafkaSink (KafkaIO)
We are using Spark Runner for this.
The BeamSql query
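For reference on the windowing step in the pipeline above: FixedWindows assigns each element to the window whose start is the element's timestamp truncated down to a window boundary (here 1 minute). A self-contained sketch of that assignment arithmetic, not the Beam API itself (class and method names are illustrative):

```java
// Computes the [start, end) fixed window containing a timestamp,
// mirroring FixedWindows-style assignment for a 1-minute window
// aligned to the epoch (no offset).
public class FixedWindowDemo {
    static final long WINDOW_MILLIS = 60_000L; // 1 minute

    // Window start: timestamp truncated down to a window boundary.
    public static long windowStart(long tsMillis) {
        return Math.floorDiv(tsMillis, WINDOW_MILLIS) * WINDOW_MILLIS;
    }

    // Window end is exclusive: start + window size.
    public static long windowEnd(long tsMillis) {
        return windowStart(tsMillis) + WINDOW_MILLIS;
    }
}
```

With the default trigger, each such window emits when the watermark passes its end.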
be Object, instead of String.
>
> Btw, it might be better to use GenericAvroDeserializer or
> SpecificAvroDeserializer from the same package.
>
>
> On Thu, Sep 27, 2018 at 10:31 AM Vishwas Bm wrote:
>
>>
>> Hi Raghu,
>>
>> The deserializer is provided
Hi,
As part of our POC, we are testing the Spark runner. We tried to submit a
job to a Spark (2.2) cluster running on a K8s cluster.
As per our findings during this POC, the Beam capability matrix
https://beam.apache.org/documentation/runners/capability-matrix/ is not
up to date.
Below features
Hi,
Thanks for the reply. As per the Beam capability matrix, only
processing-time triggers are supported by the Spark runner.
As this page is not updated, what other triggers are supported in the Beam
Spark runner?
*Thanks & Regards,*
*Vishwas *
Hi,
In our use case the watermark is the processing time.
As per beam capability matrix (
https://beam.apache.org/documentation/runners/capability-matrix/) lateness
is not supported by the Spark runner. But as per the output in our use case,
we are able to see late data getting emitted.
So we