Hello,

I need one counter metric for the number of corrupt records encountered while 
decoding parquet records at the data source level. I know where the corrupt-record 
handling needs to happen, but because no “SourceContext” or “RuntimeContext” is 
available there, I am unable to register or update any metric.

What I need is similar to the way the “SourceReaderBase” class maintains a counter 
for the number of records emitted.
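
For reference, below is a minimal sketch of the workaround I am currently 
considering: since the StreamFormat reader has no metric handle, the corrupt-record 
count is maintained in a downstream rich function where a RuntimeContext (and hence 
a metric group) is available. The “ParquetRecord” type, its isCorrupt() flag and 
the metric name are only placeholders of mine, not existing Flink classes.

import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.metrics.Counter;
import org.apache.flink.util.Collector;

// Placeholder for whatever type the parquet StreamFormat produces; the reader
// would have to tag (rather than throw on) records it failed to decode.
class ParquetRecord {
    boolean corrupt;
    boolean isCorrupt() { return corrupt; }
}

public class CorruptRecordCounter extends RichFlatMapFunction<ParquetRecord, ParquetRecord> {

    private transient Counter corruptRecords;

    @Override
    public void open(Configuration parameters) {
        // Unlike inside the StreamFormat reader, a RuntimeContext and its
        // metric group are available here.
        corruptRecords = getRuntimeContext().getMetricGroup().counter("numCorruptParquetRecords");
    }

    @Override
    public void flatMap(ParquetRecord value, Collector<ParquetRecord> out) {
        if (value.isCorrupt()) {
            corruptRecords.inc();   // count and drop corrupt records
        } else {
            out.collect(value);     // forward good records unchanged
        }
    }
}

This is not what I actually want, since the counting happens one operator 
downstream of the source, which is why I am asking whether the reader itself can 
get a metric handle.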

Rgds,
Kamal

From: Shammon FY <zjur...@gmail.com>
Sent: 14 June 2023 05:33 PM
To: Kamal Mittal <kamal.mit...@ericsson.com>
Cc: user@flink.apache.org
Subject: Re: Flink bulk and record file source format metrices

Hi Kamal,

Can you give more information about the metrics you want? In Flink, each source 
task has one source reader which already exposes some metrics; you can refer to 
the metrics doc[1] for more detailed information.

[1] https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/

Best,
Shammon FY

On Tue, Jun 13, 2023 at 11:13 AM Kamal Mittal via user 
<user@flink.apache.org> wrote:
Hello,

I am using the Flink record stream format file source API as below to read 
parquet records.

FileSource.FileSourceBuilder<?> source =
        FileSource.forRecordStreamFormat(streamformat, path);
// Keep watching the path and pick up new files every 10 seconds.
source.monitorContinuously(Duration.ofMillis(10000));
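
For completeness, this is roughly how the source is wired into the job; 
“MyParquetRecord” stands in for my actual record type, and “streamformat”/“path” 
are the variables shown above:

import java.time.Duration;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.file.src.FileSource;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// Same builder calls as above, just with the record type spelled out and built.
FileSource<MyParquetRecord> fileSource =
        FileSource.forRecordStreamFormat(streamformat, path)
                .monitorContinuously(Duration.ofMillis(10000))
                .build();

// The source reader created by fromSource() exposes the standard source
// metrics (e.g. numRecordsIn), but nothing for decode failures inside the
// StreamFormat reader itself.
DataStream<MyParquetRecord> records =
        env.fromSource(fileSource, WatermarkStrategy.noWatermarks(), "parquet-file-source");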

I want to generate/log metrics for corrupt records, and for that I need to register 
Flink metrics at the source level inside the parquet reader class. Is there any way 
to do that, given that no handle to a SourceContext is available there?

Rgds,
Kamal
