Hi Till,

Many thanks for your quick response.

I have modified the WordCountExample to reproduce my problem in a simple
example.

I run the code below with the following command:
./bin/flink run -m yarn-cluster -yn 1 -ys 4 -yjm 1024 -ytm 1024 -c mypackage.WordCountExample ../flinklink.jar

When I check the log file, I see all the logger messages except the one in the
flatMap function of the inner LineSplitter class, which is actually the one I
am most interested in.

Is that expected behaviour?

Thanks,
Ana


import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

public class WordCountExample {
    static Logger logger = LoggerFactory.getLogger(WordCountExample.class);

    public static void main(String[] args) throws Exception {
        final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        logger.info("Entering application.");

        DataSet<String> text = env.fromElements(
                "Who's there?",
                "I think I hear them. Stand, ho! Who's there?");

        List<Integer> elements = new ArrayList<Integer>();
        elements.add(0);


        DataSet<TestClass> set = env.fromElements(new TestClass(elements));

        DataSet<Tuple2<String, Integer>> wordCounts = text
                .flatMap(new LineSplitter())
                .withBroadcastSet(set, "set")
                .groupBy(0)
                .sum(1);

        wordCounts.print();


    }

    public static class LineSplitter implements FlatMapFunction<String, Tuple2<String, Integer>> {

        static Logger loggerLineSplitter = LoggerFactory.getLogger(LineSplitter.class);

        @Override
        public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
            loggerLineSplitter.info("Logger in LineSplitter.flatMap");
            for (String word : line.split(" ")) {
                out.collect(new Tuple2<String, Integer>(word, 1));
            }
        }
    }

    public static class TestClass implements Serializable {
        private static final long serialVersionUID = -2932037991574118651L;

        static Logger loggerTestClass = LoggerFactory.getLogger("WordCountExample.TestClass");

        List<Integer> integerList;
        public TestClass(List<Integer> integerList){
            this.integerList=integerList;
            loggerTestClass.info("Logger in TestClass");
        }


    }
}



On 17 Dec 2015, at 16:08, Till Rohrmann <trohrm...@apache.org> wrote:

Hi Ana,

you can simply modify the `log4j.properties` file in the `conf` directory. It 
should be automatically included in the Yarn application.

Concerning your logging problem, it might be that you have set the logging 
level too high. Could you share the code with us?
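
Something along these lines should be enough (just a sketch: keep the appender
definitions that are already in your conf/log4j.properties and only check the
levels; the ${log.file} placeholder and the appender name are assumptions based
on the file that ships with Flink):

# Root logger level and appender
log4j.rootLogger=INFO, file

# File appender writing to the per-container log file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.file=${log.file}
log4j.appender.file.append=false
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n

# Make sure your own package is not filtered out by a higher level
log4j.logger.mypackage=INFO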

Cheers,
Till

On Thu, Dec 17, 2015 at 1:56 PM, Ana M. Martinez <a...@cs.aau.dk> wrote:
Hi flink community,

I am trying to show log messages using log4j.
It works fine overall, except for the messages I want to show in an inner class
that implements org.apache.flink.api.common.aggregators.ConvergenceCriterion.
I am very new to this, but it seems I am having trouble showing the messages
logged in the isConverged function, perhaps because it runs on the task managers?
For example, the log messages in the outer class (before the map-reduce
operations) are shown properly.
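
The inner class sits inside my job class and is roughly of this shape (a
simplified sketch with placeholder names, not my actual code):

import org.apache.flink.api.common.aggregators.ConvergenceCriterion;
import org.apache.flink.types.LongValue;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public static class MyCriterion implements ConvergenceCriterion<LongValue> {

    static Logger loggerCriterion = LoggerFactory.getLogger(MyCriterion.class);

    @Override
    public boolean isConverged(int iteration, LongValue value) {
        // This is the message that never shows up in the log file I am checking.
        loggerCriterion.info("isConverged called in iteration " + iteration);
        return value.getValue() == 0;
    }
}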

I am also interested in providing my own log4j.properties file. I am running
with ./bin/flink run -m yarn-cluster on Amazon clusters.

Thanks,
Ana

