Re: Error while deploying from snapshot after adding new column in existing table

2022-01-01 Thread shamit jain
Thanks Martijn! I will check the DataStream APIs to see if they fit our use case. Regards, Shamit Jain On Thu, Dec 30, 2021 at 3:44 AM Martijn Visser wrote: > Hi Shamit, > > Yes, there are more possibilities when using the DataStream API like with > the link you've included. You c

Re: Error while deploying from snapshot after adding new column in existing table

2021-12-29 Thread shamit jain
cs-release-1.13/docs/dev/datastream/fault-tolerance/schema_evolution/#evolving-state-schema Regards, Shamit Jain On Wed, Dec 29, 2021 at 5:09 AM Martijn Visser wrote: > Hi Shamit, > > Adding columns means that you're trying to perform schema evolution, which > isn'
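
A minimal sketch of the state-schema-evolution route the linked docs describe (class, field, and function names are illustrative, not taken from the thread): keyed state declared as a Flink POJO in the DataStream API can have a field added and still restore from an older savepoint, which is the flexibility the Table API savepoint in this thread lacks.

```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

public class PojoStateSketch {

    // Flink POJO: public class, public no-arg constructor, public fields.
    // Per the state schema evolution rules linked above, adding a new public
    // field later and restoring from an old savepoint is supported (the new
    // field comes back as null/default).
    public static class Order {
        public String orderId;
        public long amount;
        // public String newColumn;  // could be added later without breaking restore

        public Order() {}
    }

    public static class Dedup extends KeyedProcessFunction<String, Order, Order> {

        private transient ValueState<Order> lastSeen;

        @Override
        public void open(Configuration parameters) {
            lastSeen = getRuntimeContext().getState(
                    new ValueStateDescriptor<>("last-order", Order.class));
        }

        @Override
        public void processElement(Order value, Context ctx, Collector<Order> out)
                throws Exception {
            if (lastSeen.value() == null) {
                out.collect(value); // emit only the first order seen per key
            }
            lastSeen.update(value);
        }
    }
}
```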

Error while deploying from snapshot after adding new column in existing table

2021-12-28 Thread shamit jain
ousTypes, newRowSerializer.types)) { return TypeSerializerSchemaCompatibility.incompatible(); } Can you please help me to understand if we can add a new column in an existing table and deploy from the snapshot? Regards, Shamit Jain
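
A simplified paraphrase of the check the quoted fragment comes from (not the exact Flink source): the row serializer restored from the snapshot compares the field types it captured against those of the new table schema, and any difference, such as an added column, is reported as incompatible.

```java
import java.util.Arrays;

import org.apache.flink.api.common.typeutils.TypeSerializer;
import org.apache.flink.api.common.typeutils.TypeSerializerSchemaCompatibility;
import org.apache.flink.types.Row;

public class RowCompatibilitySketch {

    // Conceptual version of the compatibility resolution quoted above: the
    // field serializers stored in the savepoint must match the new ones
    // exactly, otherwise the restore is rejected as incompatible.
    static TypeSerializerSchemaCompatibility<Row> resolve(
            TypeSerializer<?>[] previousTypes, TypeSerializer<?>[] newTypes) {
        if (!Arrays.equals(previousTypes, newTypes)) {
            return TypeSerializerSchemaCompatibility.incompatible();
        }
        return TypeSerializerSchemaCompatibility.compatibleAsIs();
    }
}
```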

Flink streaming file sink to s3 cannot recover from failure

2021-10-08 Thread shamit jain
a:127) at org.apache.flink.streaming.api.functions.sink.filesystem.Bucket.restore(Bucket.java:466) at org.apache.flink.streaming.api.functions.sink.filesystem.Def regards, Shamit Jain
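
A sketch of the sink setup involved (bucket path, source, and intervals are placeholders): the streaming file sink only finalizes part files on checkpoints, and recovering in-progress part files on S3 relies on the Hadoop-based filesystem (flink-s3-fs-hadoop); the Presto-based S3 filesystem does not support the recoverable writer the sink needs.

```java
import org.apache.flink.api.common.serialization.SimpleStringEncoder;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;
import org.apache.flink.streaming.api.functions.sink.filesystem.rollingpolicies.OnCheckpointRollingPolicy;

public class S3SinkSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.enableCheckpointing(60_000); // files are committed only on checkpoints

        DataStream<String> events = env.fromElements("a", "b", "c"); // stand-in source

        // s3a:// scheme goes through flink-s3-fs-hadoop, which can resume the
        // multipart uploads that back in-progress part files after a restart.
        StreamingFileSink<String> sink = StreamingFileSink
                .forRowFormat(new Path("s3a://my-bucket/output"),
                        new SimpleStringEncoder<String>("UTF-8"))
                .withRollingPolicy(OnCheckpointRollingPolicy.build())
                .build();

        events.addSink(sink);
        env.execute("s3-file-sink-sketch");
    }
}
```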

Re: Flink UDF Scalar Function called only once for all rows of select SQL in case of no parameter passed

2021-07-14 Thread shamit jain
Thanks!! On 2021/07/14 02:26:47, JING ZHANG wrote: > Hi, Shamit Jain, > In fact, it is an optimization to simplify expressions. > If a UDF has no parameters, the optimizer would look at it as an expression > which always generates constant results. > So it would be calculated once
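
A sketch of the workaround this explanation points at (function and registration names are mine): declaring the zero-argument UDF non-deterministic keeps the planner from folding the call into a single pre-computed constant, so it is evaluated for every row.

```java
import org.apache.flink.table.functions.ScalarFunction;

// Illustrative zero-argument UDF. A deterministic call with no arguments is
// constant-folded by the planner and computed once; returning false from
// isDeterministic() disables that reduction for this function.
public class CurrentMillis extends ScalarFunction {

    public long eval() {
        return System.currentTimeMillis();
    }

    @Override
    public boolean isDeterministic() {
        return false;
    }
}
```

Registered with, for example, tEnv.createTemporarySystemFunction("CURRENT_MILLIS", CurrentMillis.class), a SELECT that calls CURRENT_MILLIS() then produces a fresh value per row, at the cost of the skipped constant-folding optimization.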

Flink UDF Scalar Function called only once for all rows of select SQL in case of no parameter passed

2021-07-13 Thread shamit jain
f employee. Request you to please let me know if I am doing something wrong. regards, Shamit Jain

Re: "upsert-kafka" connector not working with Avro confluent schema registry

2021-02-11 Thread Shamit
er-mailing-list-archive.2336050.n4.nabble.com/file/t2972/Screen_Shot_2021-02-11_at_4.png> Regards, Shamit Jain -- Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/

Re: "upsert-kafka" connector not working with Avro confluent schema registry

2021-02-11 Thread Shamit
er-mailing-list-archive.2336050.n4.nabble.com/file/t2972/Screen_Shot_2021-02-11_at_4.png> Regards, Shamit Jain -- Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/

Re: "upsert-kafka" connector not working with Avro confluent schema registry

2021-02-09 Thread Shamit
Hello Flink Users, Request you to please help. I am facing an issue with "KafkaAvroDeserializer" when using the "upsert-kafka" connector. Regards, Shamit Jain -- Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
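
A sketch of a table definition for this setup (topic, columns, and addresses are placeholders; the option names follow the Flink 1.12/1.13 documentation and differ in later releases): with upsert-kafka, the 'avro-confluent' key and value formats talk to the schema registry and understand the Confluent wire format, instead of wiring in KafkaAvroDeserializer by hand.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class UpsertKafkaAvroSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // upsert-kafka requires a (NOT ENFORCED) primary key plus explicit
        // key and value formats; avro-confluent resolves schemas against the
        // registry URL given below.
        tEnv.executeSql(
                "CREATE TABLE users_upsert (" +
                "  user_id STRING," +
                "  user_name STRING," +
                "  PRIMARY KEY (user_id) NOT ENFORCED" +
                ") WITH (" +
                "  'connector' = 'upsert-kafka'," +
                "  'topic' = 'users'," +
                "  'properties.bootstrap.servers' = 'localhost:9092'," +
                "  'key.format' = 'avro-confluent'," +
                "  'key.avro-confluent.schema-registry.url' = 'http://localhost:8081'," +
                "  'value.format' = 'avro-confluent'," +
                "  'value.avro-confluent.schema-registry.url' = 'http://localhost:8081'" +
                ")");
    }
}
```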

Join two streams from Kafka

2021-02-09 Thread Shamit
based on some key. Please let me know how I can do this efficiently; stream2 might have lots of records (in millions). Please help. Regards, Shamit Jain -- Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
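
A sketch of one way to approach this on keyed streams (types, key fields, and the join output are hypothetical): connect the two streams, key both by the join key, and keep only what is needed from the large stream in keyed state so its millions of records are never buffered wholesale; the RocksDB state backend and state TTL are the usual knobs when the key space itself is large.

```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.co.KeyedCoProcessFunction;
import org.apache.flink.util.Collector;

public class StreamJoinSketch {

    // Both streams are modeled as Tuple2<key, payload>; only the latest value
    // per key from the large stream is retained in state.
    public static DataStream<String> join(DataStream<Tuple2<String, String>> small,
                                          DataStream<Tuple2<String, String>> large) {
        return small
                .connect(large)
                .keyBy(t -> t.f0, t -> t.f0)
                .process(new LatestValueJoin());
    }

    static class LatestValueJoin extends
            KeyedCoProcessFunction<String, Tuple2<String, String>, Tuple2<String, String>, String> {

        private transient ValueState<String> latestFromLarge;

        @Override
        public void open(Configuration parameters) {
            latestFromLarge = getRuntimeContext().getState(
                    new ValueStateDescriptor<>("latest-large", String.class));
        }

        @Override
        public void processElement1(Tuple2<String, String> left, Context ctx,
                                    Collector<String> out) throws Exception {
            String right = latestFromLarge.value();
            if (right != null) {
                out.collect(left.f1 + "|" + right); // join output for this key
            }
        }

        @Override
        public void processElement2(Tuple2<String, String> right, Context ctx,
                                    Collector<String> out) throws Exception {
            latestFromLarge.update(right.f1); // remember only the latest record per key
        }
    }
}
```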

"upsert-kafka" connector not working with Avro confluent schema registry

2021-02-07 Thread Shamit
ache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:247) at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179) at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160) at org.apache.avro.generic.GenericDatu