Hi Sateesh,

Note that there are three classes of log files when running Flink on EMR:
1. The output from the main class.

Since I typically run the job by sshing onto the master and using the CLI from there, I have control over where that output goes, e.g.:

nohup bin/flink run -m yarn-cluster -yn 48 /path/to/my-job.jar >> my.log 2>&1 &

And then:

tail -f my.log

2. Logging by the JobManager.

The JobManager log is available via the YARN Application Overview screen (see the Logs link in the attempt list near the bottom). When your tool fails (e.g., due to a missing command-line argument), the error output is available via the stderr link for that Step on the EMR Cluster > Steps tab.

3. Logging by each TaskManager.

I typically log into the slave to have a look at the TaskManager error/status output (e.g., in /var/log/hadoop-yarn/containers/application_1546471484145_0002/container_1546471484145_0002_01_000002/taskmanager.err). One common approach here is to grep the taskmanager.log files (on each slave), e.g.:

sudo find /var/log/hadoop-yarn/containers/application_1568579660214_0004/ -name "taskmanager.log" | sudo xargs grep "text of interest"

(A sketch of the log4j.properties that routes application logging into those taskmanager.log files is included at the end of this message.)

HTH,

— Ken

> On Jul 2, 2020, at 9:29 AM, mars <sk_ac...@yahoo.com> wrote:
>
> Hi,
>
> I am running my Flink jobs on EMR. I didn't include any log4j.properties as part of my JAR, and I am using slf4j (with the dependent jars included in the uber jar I created) for logging in my app.
>
> When I run the job, everything runs fine, except that I cannot find my application logs anywhere.
>
> I am running the Flink job/app with (-p 2). I see two task managers, but none of the app-specific logs can be found in their logs. We are using INFO level logging.
>
> I was hoping the logs would go to the default ConsoleAppender.
>
> In the master node Flink conf I found logback-console.xml (which sets the root level logging to INFO and uses a ConsoleAppender), and there is also a log4j.properties file which also sets the root level logging to INFO and uses a FileAppender.
>
> I also tried to access the logs using "yarn logs -applicationId <>", and I am getting:
>
> $ yarn logs -applicationId application_1593579475717_0001
> 20/07/01 21:16:32 INFO client.RMProxy: Connecting to ResourceManager at <>:8032
> /var/log/hadoop-yarn/apps/root/logs/application_1593579475717_0001 does not exist.
> Log aggregation has not completed or is not enabled.
>
> And YARN log aggregation is already enabled. When I checked /etc/hadoop/conf/yarn-site.xml:
>
> <property>
>   <name>yarn.log-aggregation-enable</name>
>   <value>true</value>
> </property>
>
> It might be the case that I can only see the logs through YARN once the application completes/finishes/fails.
>
> Thanks
> Sateesh
>
>
>
> --
> Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/

--------------------------
Ken Krugler
http://www.scaleunlimited.com
custom big data solutions & training
Hadoop, Cascading, Cassandra & Solr
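For reference, here is a minimal log4j.properties sketch along the lines of the default Flink/log4j 1.x configuration found in the Flink conf directory on EMR (the exact file on a given master may differ). Flink starts each YARN container with -Dlog.file pointing at taskmanager.log (or jobmanager.log) in that container's log directory, so slf4j/log4j calls from application code land in those per-container files rather than on the console:

# Sketch of a stock Flink log4j.properties (log4j 1.x); contents on a real cluster may differ.
log4j.rootLogger=INFO, file

# Send everything to the file Flink points at via -Dlog.file (taskmanager.log / jobmanager.log).
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.file=${log.file}
log4j.appender.file.append=false
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n

With that in place, the find/grep approach in item 3 above should also surface the application's INFO logging, provided the job was actually started with this log4j.properties rather than the logback-console.xml.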