I second Till's suggestion. Currently, in container environments
(Docker/K8s), we cannot send STDOUT/STDERR to the console and to log
files (taskmanager.out/err) simultaneously. For a better user
experience, we use "conf/log4j-console.properties" to send
STDOUT/STDERR only to the console. Users can then run "docker logs
<ContainerID>" or "kubectl logs <PodName>" to view the logs easily.
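
For illustration, a minimal sketch of such a console-only configuration
(log4j2 properties syntax, which Flink 1.11 uses by default; this is an
approximation, not the exact contents of the bundled file):

    rootLogger.level = INFO
    rootLogger.appenderRef.console.ref = ConsoleAppender

    appender.console.name = ConsoleAppender
    appender.console.type = CONSOLE
    appender.console.layout.type = PatternLayout
    appender.console.layout.pattern = %d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n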

Besides disabling the logging of TaskManagerStdoutFileHandler in
log4j-console.properties, you could also customize the image entrypoint
to redirect STDOUT/STDERR to separate files (taskmanager.out/err).
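
For example, a wrapper entrypoint along these lines (just a sketch; the
stock entrypoint path /docker-entrypoint.sh and the log directory are my
assumptions based on the official image, so adjust them to yours):

    #!/usr/bin/env bash
    # Wrap the stock Flink entrypoint and redirect STDOUT/STDERR into
    # taskmanager.out/err instead of the console.
    LOG_DIR="${FLINK_LOG_DIR:-/opt/flink/log}"
    mkdir -p "$LOG_DIR"
    exec /docker-entrypoint.sh "$@" \
      >>"$LOG_DIR/taskmanager.out" \
      2>>"$LOG_DIR/taskmanager.err"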


Best,
Yang

Till Rohrmann <trohrm...@apache.org> wrote on Thu, Oct 8, 2020 at 3:30 PM:

> The easiest way to suppress this error would be to disable the logging for
> TaskManagerStdoutFileHandler in your log4j.properties file.
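>
> For example, something along these lines (a sketch in log4j2 properties
> syntax, which Flink 1.11 uses by default; the logger id "suppress_stdout"
> is an arbitrary name I picked):
>
>     logger.suppress_stdout.name = org.apache.flink.runtime.rest.handler.taskmanager.TaskManagerStdoutFileHandler
>     logger.suppress_stdout.level = OFF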
>
> Cheers,
> Till
>
> On Wed, Oct 7, 2020 at 8:48 PM sidhant gupta <sidhan...@gmail.com> wrote:
>
>> Hi Till,
>>
>> I understand that the errors which appear in my logs are not stopping me
>> from running the job. I am running a Flink session cluster in ECS and have
>> also configured Graylog to collect the container logs, so getting the
>> Docker logs is not an issue either.
>> But is there a way to suppress this error, or any workaround?
>>
>> Thanks
>> Sidhant Gupta
>>
>> On Wed, Oct 7, 2020, 9:15 PM Till Rohrmann <trohrm...@apache.org> wrote:
>>
>>> Hi Sidhant,
>>>
>>> when using Flink's Docker image, the cluster won't create the out
>>> files. Instead, the components write directly to STDOUT, which is
>>> captured by Kubernetes and can be viewed using `kubectl logs POD_NAME`. The
>>> error which appears in your logs is not a problem; it is simply the REST
>>> handler trying to serve the out files.
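>>>
>>> For example (the pod and container names are placeholders):
>>>
>>>     kubectl logs -f <taskmanager-pod-name>
>>>     docker logs -f <container-id>    # the Docker/ECS equivalent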
>>>
>>> Cheers,
>>> Till
>>>
>>> On Wed, Oct 7, 2020 at 5:11 PM 大森林 <appleyu...@foxmail.com> wrote:
>>>
>>>> What's your running mode?
>>>> If your Flink cluster is in YARN mode, then the output you need has no
>>>> relation to $FLINK_HOME/logs/*.out.
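>>>>
>>>> In YARN mode you would fetch the output from the aggregated container
>>>> logs instead, e.g. (the application id is a placeholder):
>>>>
>>>>     yarn logs -applicationId <application_id>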
>>>>
>>>>
>>>> ------------------ Original Message ------------------
>>>> *From:* "sidhant gupta" <sidhan...@gmail.com>;
>>>> *Sent:* Wednesday, October 7, 2020, 11:33 PM
>>>> *To:* "大森林"<appleyu...@foxmail.com>;"user"<user@flink.apache.org>;
>>>> *Subject:* Re: The file STDOUT does not exist on the TaskExecutor
>>>>
>>>> Hi,
>>>>
>>>> I'm running a Flink cluster in ECS. There is a pipeline which creates the
>>>> job manager and then the task manager using the Docker image.
>>>>
>>>> I'm not sure we would want to restart the cluster in production.
>>>>
>>>> Is there any way we can make sure the .out files will be created without a
>>>> restart?
>>>>
>>>> I am able to see the logs in the Logs tab of the web UI, but not the
>>>> stdout logs, and I get the error mentioned below after running the job.
>>>>
>>>> Thanks
>>>> Sidhant Gupta
>>>>
>>>>
>>>> On Wed, Oct 7, 2020, 8:00 PM 大森林 <appleyu...@foxmail.com> wrote:
>>>>
>>>>> It's easy: just restart your Flink cluster (standalone mode).
>>>>>
>>>>> If you run Flink in YARN mode, then the result will show up in the
>>>>> $HADOOP/logs/*.out files.
>>>>>
>>>>> ------------------ Original Message ------------------
>>>>> *From:* "sidhant gupta" <sidhan...@gmail.com>;
>>>>> *Sent:* Wednesday, October 7, 2020, 9:52 PM
>>>>> *To:* "大森林"<appleyu...@foxmail.com>;
>>>>> *Cc:* "user"<user@flink.apache.org>;
>>>>> *Subject:* Re: The file STDOUT does not exist on the TaskExecutor
>>>>>
>>>>> ++ user
>>>>>
>>>>> On Wed, Oct 7, 2020, 6:47 PM sidhant gupta <sidhan...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> I checked in $FLINK_HOME/logs; the .out file was not there. Can you
>>>>>> suggest what the action item should be?
>>>>>>
>>>>>> Thanks
>>>>>> Sidhant Gupta
>>>>>>
>>>>>>
>>>>>> On Wed, Oct 7, 2020, 7:17 AM 大森林 <appleyu...@foxmail.com> wrote:
>>>>>>
>>>>>>>
>>>>>>> Please check whether the .out file is in $FLINK_HOME/logs.
>>>>>>>
>>>>>>> ------------------ Original Message ------------------
>>>>>>> *From:* "sidhant gupta" <sidhan...@gmail.com>;
>>>>>>> *Sent:* Wednesday, October 7, 2020, 1:52 AM
>>>>>>> *To:* "大森林"<appleyu...@foxmail.com>;
>>>>>>> *Subject:* Re: The file STDOUT does not exist on the TaskExecutor
>>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I am running the Docker container as-is, adding only
>>>>>>> conf/flink-conf.yaml.
>>>>>>> I am not sure whether the .out file got deleted. Do we need to expose
>>>>>>> some ports?
>>>>>>>
>>>>>>> Thanks
>>>>>>> Sidhant Gupta
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Oct 6, 2020, 8:51 PM 大森林 <appleyu...@foxmail.com> wrote:
>>>>>>>
>>>>>>>>
>>>>>>>> Hi, I guess you may have deleted the .out file in $FLINK_HOME/logs.
>>>>>>>> You can just use your default log settings.
>>>>>>>> ------------------ Original Message ------------------
>>>>>>>> *From:* "sidhant gupta" <sidhan...@gmail.com>;
>>>>>>>> *Sent:* Tuesday, October 6, 2020, 10:59 PM
>>>>>>>> *To:* "user"<user@flink.apache.org>;
>>>>>>>> *Subject:* The file STDOUT does not exist on the TaskExecutor
>>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> I am running a dockerized flink:1.11.0-scala_2.11 container in ECS.
>>>>>>>> I am getting the following error after the job runs:
>>>>>>>>
>>>>>>>> ERROR org.apache.flink.runtime.rest.handler.taskmanager.TaskManagerStdoutFileHandler [] - Unhandled exception.
>>>>>>>> org.apache.flink.util.FlinkException: The file STDOUT does not exist on the TaskExecutor.
>>>>>>>>     at org.apache.flink.runtime.taskexecutor.TaskExecutor.lambda$requestFileUploadByFilePath$25(TaskExecutor.java:1742) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
>>>>>>>>     at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604) ~[?:1.8.0_262]
>>>>>>>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_262]
>>>>>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_262]
>>>>>>>>     at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_262]
>>>>>>>>
>>>>>>>> I guess "file" needs to be added to log4j.properties in the Docker
>>>>>>>> container, e.g. log4j.rootLogger=INFO, file.
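>>>>>>>>
>>>>>>>> Something like the following is what I have in mind (a log4j1-style
>>>>>>>> sketch; the appender name "file" and the use of ${log.file} are my
>>>>>>>> assumptions):
>>>>>>>>
>>>>>>>>     log4j.rootLogger=INFO, file
>>>>>>>>     log4j.appender.file=org.apache.log4j.FileAppender
>>>>>>>>     log4j.appender.file.file=${log.file}
>>>>>>>>     log4j.appender.file.layout=org.apache.log4j.PatternLayout
>>>>>>>>     log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
>>>>>>>>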
>>>>>>>> Are there any other properties which need to be configured in any of
>>>>>>>> the other property files, or any jar that needs to be added to the
>>>>>>>> */opt/flink* path?
>>>>>>>> Thanks
>>>>>>>> Sidhant Gupta
>>>>>>>>
>>>>>>>>
