Re: [External] Re: [External] Re:Re: Backpressure issue with Flink Sql Job

2024-07-02 Thread Ashish Khatkar via user
I mean, if it helps, you can check out https://www.ververica.com/blog/how-to-write-fast-flink-sql . Regards. On Tue, Jun 25, 2024 at 4:30 PM Ashish Khatkar via user <user@flink.apache.org> wrote: Hi Xuyang, The i

Re: [External] Re:Re: Backpressure issue with Flink Sql Job

2024-06-25 Thread Ashish Khatkar via user
- state.backend.rocksdb.writebuffer.size=x - 3. If possible, try left window join for your streams - Please, share what sink you are using. Also, the per-operator, source and sink throughput, if possible? On Mon, Jun 24, 2024 at 3
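As a side note, here is a minimal sketch of how the RocksDB write-buffer keys suggested above could be set from a PyFlink Table API job; the backend key and the 128mb / 4 values are illustrative assumptions, not values taken from the thread.

    from pyflink.table import EnvironmentSettings, TableEnvironment

    # Build a streaming TableEnvironment for the SQL job.
    table_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

    # Apply the RocksDB tuning keys mentioned above; the concrete values are
    # placeholders and should be tuned against the real workload.
    config = table_env.get_config()
    config.set("state.backend", "rocksdb")
    config.set("state.backend.rocksdb.writebuffer.size", "128mb")
    config.set("state.backend.rocksdb.writebuffer.count", "4")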

Backpressure issue with Flink Sql Job

2024-06-24 Thread Ashish Khatkar via user
Hi all, We are facing backpressure in the Flink SQL job from the sink, and the backpressure comes from only a single task. This causes checkpoints to fail despite enabling unaligned checkpoints and using buffer debloating. We enabled flame graphs and the task spends most of its time doing
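For context, a hedged sketch of how unaligned checkpoints and buffer debloating are typically switched on from the same TableEnvironment configuration; the checkpoint interval is an illustrative placeholder, and the exact option keys should be checked against the Flink version in use.

    from pyflink.table import EnvironmentSettings, TableEnvironment

    table_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
    config = table_env.get_config()

    # Periodic checkpoints plus the two mitigations mentioned in the thread.
    # The 60-second interval is a placeholder, not a recommendation.
    config.set("execution.checkpointing.interval", "60s")
    config.set("execution.checkpointing.unaligned", "true")
    config.set("taskmanager.network.memory.buffer-debloat.enabled", "true")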

Re: Needs help debugging an issue

2023-10-23 Thread Ashish Khatkar via user
The additional exceptions have the same error but on different files. PyFlink lib error: java.lang.RuntimeException: An error occurred while copying the file. at org.apache.flink.api.common.cache.DistributedCache.getFile(DistributedCache.java:158) at

Needs help debugging an issue

2023-10-23 Thread Ashish Khatkar via user
Hi, We are using the flink-1.17.0 Table API and RocksDB as the backend to provide a service for our users to run SQL queries. The tables are created using the Avro schema, and we also allow users to attach a Python UDF as a plugin. This plugin is downloaded at the time of building the table and we update
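To make the setup concrete, here is a hedged sketch of a table plus a Python UDF plugin of the kind described; the connector, topic, column names and the normalize function are hypothetical stand-ins, not the actual service definitions.

    from pyflink.table import DataTypes, EnvironmentSettings, TableEnvironment
    from pyflink.table.udf import udf

    table_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

    # Hypothetical user plugin: a scalar Python UDF attached when the table is built.
    @udf(result_type=DataTypes.STRING())
    def normalize(value):
        return value.strip().lower() if value is not None else None

    table_env.create_temporary_function("normalize", normalize)

    # Hypothetical table declared against an Avro schema; requires the Kafka
    # connector and Avro format jars on the classpath.
    table_env.execute_sql("""
        CREATE TABLE events (
            user_id STRING,
            payload STRING
        ) WITH (
            'connector' = 'kafka',
            'topic' = 'events',
            'properties.bootstrap.servers' = 'localhost:9092',
            'properties.group.id' = 'events-reader',
            'scan.startup.mode' = 'earliest-offset',
            'format' = 'avro'
        )
    """)

    # The user's SQL query can then call the plugged-in UDF directly.
    result = table_env.sql_query("SELECT user_id, normalize(payload) AS payload FROM events")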

Re: [External] Re: Way to add columns with defaults to the existing table and recover from the savepoint

2023-03-21 Thread Ashish Khatkar via user
nk/flink-docs-master/docs/libs/state_processor_api/ Best, Shammon FY. On Fri, Mar 17, 2023 at 8:48 PM Ashish Khatkar via user <user@flink.apache.org> wrote: Hi all, I need help in understanding if we can add columns with defaults,

Way to add columns with defaults to the existing table and recover from the savepoint

2023-03-17 Thread Ashish Khatkar via user
Hi all, I need help in understanding whether we can add columns with defaults, let's say NULL, to an existing table and recover the job from the savepoint. We are using the flink-1.16.0 Table API and RocksDB as the backend to provide a service for our users to run SQL queries. The tables are created using the
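A hedged sketch, using a hypothetical orders table, of what "adding a column with a NULL default" looks like at the DDL level; it only illustrates the schema change, not whether the savepoint restore succeeds, which is the actual question in this thread.

    from pyflink.table import EnvironmentSettings, TableEnvironment

    table_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

    # Recreate the table with the extra column left nullable, so missing values
    # default to NULL. Whether a job restored from the old savepoint accepts
    # the widened row type is the open question.
    table_env.execute_sql("""
        CREATE TABLE orders (
            order_id STRING,
            amount   DOUBLE,
            discount DOUBLE   -- newly added, nullable, defaults to NULL
        ) WITH (
            'connector' = 'datagen'
        )
    """)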