Hi Team,
I've written a Flink job and enabled *slf4j* logging for it.
*Flow of the Flink job:* Kafka source => process datastream elements (transformations) => Kafka sink.
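For reference, here is a stripped-down skeleton of the job. The topic names, Kafka properties, and the DTO (de)serialization schema classes are simplified placeholders, not the real code:

import java.util.Properties;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

public class MyFlinkJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "my-consumer-group");

        // Kafka source (DTODeserializationSchema is a placeholder for my schema class)
        DataStream<DTO> dataStream = env.addSource(
                new FlinkKafkaConsumer<>("input-topic", new DTODeserializationSchema(), props));

        // Transformations on the datastream elements -- this is the stage where logging stops for me
        DataStream<DTO> transformed = dataStream.map(new MapFunction<DTO, DTO>() {
            @Override
            public DTO map(DTO obj) {
                return obj; // actual transformation logic elided
            }
        });

        // Kafka sink
        transformed.addSink(new FlinkKafkaProducer<>("output-topic", new DTOSerializationSchema(), props));

        env.execute("my-flink-job");
    }
}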
The job stops logging once it starts processing the datastream. I want to log all the data consumed from Kafka, either to a log file or to the stdout console. I've tried a few approaches, but none of them logs the data.
*Approaches tried:*
1. dataStream.print() or dataStream.print().toString()
2. Iterator<DTO> myOutput = DataStreamUtils.collect(dataStream);
   while (myOutput.hasNext()) {
       DTO element = myOutput.next(); // next() called once per iteration; calling it twice consumed two records per loop
       log.info("myOutput " + element);
       LOG.debug("LOG " + element);
   }
3. IterativeStream<DTO> iteration = dataStream.iterate();
   DataStream<DTO> ge = iteration.map(new MapFunction<DTO, DTO>() {
       @Override
       public DTO map(DTO obj) throws Exception {
           log.info("id " + obj.getID());
           return obj;
       }
   });
4. Store the data in a static map and log it just before the Kafka sink (sketch below).
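For approach 4, this is roughly what I mean. CaptureMap is a placeholder name of mine, and I realize the static map is only shared within a single TaskManager JVM:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.apache.flink.api.common.functions.MapFunction;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class CaptureMap implements MapFunction<DTO, DTO> {
    private static final Logger LOG = LoggerFactory.getLogger(CaptureMap.class);

    // static, so it exists once per TaskManager JVM -- not shared across TaskManagers
    private static final Map<Object, DTO> CAPTURED = new ConcurrentHashMap<>();

    @Override
    public DTO map(DTO obj) {
        CAPTURED.put(obj.getID(), obj);
        LOG.info("captured id {}", obj.getID());
        return obj;
    }
}

I wire it in just before the sink: dataStream.map(new CaptureMap()).addSink(kafkaSink);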
Thanks,
Nida