Hi, Nida.



I'd like to confirm: is there any log output when the job is executed 
directly in the IDE?




If there are logs in the IDE but not when the job is submitted to a cluster, 
you could check, via the TaskManager logs, whether the log configuration 
files are being picked up correctly.
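For reference, Flink reads its logging setup from conf/log4j.properties 
(log4j2 in the properties format on recent versions). A minimal sketch of 
what a working configuration might look like, assuming the default appender 
name; adjust to your setup:

    # Route everything at INFO and above to the TaskManager log file
    # that Flink passes in via the log.file system property.
    rootLogger.level = INFO
    rootLogger.appenderRef.main.ref = MainAppender

    appender.main.name = MainAppender
    appender.main.type = File
    appender.main.fileName = ${sys:log.file}
    appender.main.layout.type = PatternLayout
    appender.main.layout.pattern = %d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n

With a setup like this, anything user code logs via SLF4J ends up in the 
same TaskManager log file.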

If there are no logs in the IDE either: I've encountered this before, and it 
was caused by conflicting jars (e.g. multiple SLF4J bindings on the classpath).
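One quick way to check for that is to print which logger implementation 
SLF4J actually bound to at runtime, e.g. somewhere in the job's main method 
(just a diagnostic sketch):

    // Prints the concrete ILoggerFactory implementation in use. If this is
    // org.slf4j.helpers.NOPLoggerFactory, or SLF4J warned about multiple
    // bindings at startup, conflicting jars are the likely cause.
    System.out.println(org.slf4j.LoggerFactory.getILoggerFactory().getClass().getName());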




--

    Best!
    Xuyang




At 2024-05-15 15:20:45, "Fidea Lidea" <lideafidea...@gmail.com> wrote:

Hi Team,

I've written a Flink job and enabled the SLF4J logging mechanism for it.
Flow of the Flink job: Kafka source => process datastream 
elements (transformations) => Kafka sink.

It stops logging while processing the datastream. I want to log all data 
captured from Kafka, either in a log file or on the stdout console.
I've tried a few approaches but am unable to log the data.

Approaches tried:

1. dataStream.print() or dataStream.print().toString()

2. Iterator<DTO> myOutput = DataStreamUtils.collect(dataStream);
   while (myOutput.hasNext()) {
       DTO element = myOutput.next(); // call next() once per iteration, otherwise elements are skipped
       log.info("myOutput {}", element);
       LOG.debug("LOG {}", element);
   }
3. IterativeStream<DTO> iteration = dataStream.iterate();

   DataStream<DTO> ge = iteration.map(new MapFunction<DTO, DTO>() {
       @Override
       public DTO map(DTO obj) throws Exception {
           log.info("id {}", obj.getID());
           return obj;
       }
   });
   // note: an IterativeStream also needs iteration.closeWith(...) before execution
   (a standalone sketch of this logging-in-map pattern follows the list)
4. Store data in a static map and log it before the Kafka sink, etc.
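Approach 3 in a standalone form, i.e. logging from inside a map function 
with a static SLF4J logger so it is created on each TaskManager instead of 
being serialized with the job (class name LoggingMap is hypothetical; DTO 
is my own type):

    import org.apache.flink.api.common.functions.MapFunction;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class LoggingMap implements MapFunction<DTO, DTO> {
        // static + final: initialized on the TaskManager, never serialized
        private static final Logger LOG = LoggerFactory.getLogger(LoggingMap.class);

        @Override
        public DTO map(DTO obj) {
            LOG.info("id {}", obj.getID()); // should appear in the TaskManager log
            return obj;
        }
    }

    // usage: dataStream.map(new LoggingMap()) ... then the Kafka sink as before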

Thanks
Nida


