akshay654 opened a new issue #15914:
URL: https://github.com/apache/airflow/issues/15914
**Apache Airflow version**: 2.0.2
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): v1.18.14
**Environment**:
- **Cloud provider or hardware configuration**: AKS
**What happened**:
When deploying Airflow with the Helm chart, remote logging to an Azure File share fails for task logs.
The scheduler and DAG operator logs are written successfully, but we hit this issue with the task logs (DAG run logs).
The `file_task_handler.py` file has a function (`_init_file`) that tries to `chmod` the log file, which we suppose is not permitted on an Azure File share.
We have customized the chart templates to write the logs to a pre-provisioned PVC that is backed by the file share.
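For context: Azure File shares are mounted over SMB/CIFS, where ownership and permissions are fixed at mount time by the mount options, so a later `chmod()` from inside the container is rejected with `EPERM`. Below is a sketch of the kind of StorageClass we would expect such a share to use; the name and option values are illustrative assumptions, not copied from our cluster:

```yaml
# Sketch of a StorageClass for an Azure File share holding Airflow logs.
# All names and values are illustrative assumptions. dir_mode/file_mode/
# uid/gid fix permissions at mount time, which is why the chmod() in the
# traceback below fails on this kind of mount.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: airflow-logs-azurefile      # hypothetical name
provisioner: kubernetes.io/azure-file
mountOptions:
  - dir_mode=0777
  - file_mode=0777
  - uid=50000                       # the airflow user in the official image
  - gid=0
  - mfsymlinks                      # emulate symlinks, cf. the warning below
parameters:
  skuName: Standard_LRS
```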
Task-run debug output and traceback:
```
WARNING:root:OSError while attempting to symlink the latest log directory
[2021-05-17 13:36:39,234] {settings.py:210} DEBUG - Setting up DB connection pool (PID 6)
[2021-05-17 13:36:39,235] {settings.py:281} DEBUG - settings.prepare_engine_args(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=6
[2021-05-17 13:36:39,310] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7f65def8bd08> to pre execution callback
[2021-05-17 13:36:42,394] {cli_action_loggers.py:66} DEBUG - Calling callbacks: [<function default_action_log at 0x7f65def8bd08>]
[2021-05-17 13:36:42,438] {settings.py:210} DEBUG - Setting up DB connection pool (PID 6)
[2021-05-17 13:36:42,439] {settings.py:243} DEBUG - settings.prepare_engine_args(): Using NullPool
[2021-05-17 13:36:42,440] {dagbag.py:448} INFO - Filling up the DagBag from /opt/airflow/dags/example_bash_operator.py
[2021-05-17 13:36:42,613] {dagbag.py:287} DEBUG - Importing /opt/airflow/dags/example_bash_operator.py
[2021-05-17 13:36:42,749] {dagbag.py:413} DEBUG - Loaded DAG <DAG: example_bash_operator>
[2021-05-17 13:36:42,773] {plugins_manager.py:270} DEBUG - Loading plugins
[2021-05-17 13:36:42,773] {plugins_manager.py:207} DEBUG - Loading plugins from directory: /opt/airflow/plugins
[2021-05-17 13:36:42,774] {plugins_manager.py:184} DEBUG - Loading plugins from entrypoints
[2021-05-17 13:36:42,925] {plugins_manager.py:414} DEBUG - Integrate DAG plugins
[2021-05-17 13:36:43,133] {cli_action_loggers.py:84} DEBUG - Calling callbacks: []
[2021-05-17 13:36:43,135] {settings.py:292} DEBUG - Disposing DB connection pool (PID 6)
Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/__main__.py", line 40, in main
    args.func(args)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 89, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/commands/task_command.py", line 225, in task_run
    ti.init_run_context(raw=args.raw)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1987, in init_run_context
    self._set_context(self)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/log/logging_mixin.py", line 54, in _set_context
    set_context(self.log, context)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/log/logging_mixin.py", line 174, in set_context
    handler.set_context(value)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/log/file_task_handler.py", line 56, in set_context
    local_loc = self._init_file(ti)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/log/file_task_handler.py", line 258, in _init_file
    os.chmod(full_path, 0o666)
PermissionError: [Errno 1] Operation not permitted: '/opt/airflow/logs/example_bash_operator/also_run_this/2021-05-17T13:36:28.742664+00:00/1.log'
```
**What you expected to happen**:
Log files should be created successfully for task pods. Task pods run only temporarily and their logs share the pod's lifecycle, so we are trying to persist them to an Azure File share.
**How to reproduce it**:
1. Modify `_helpers.yaml` in the Helm chart templates repo to define the Airflow logs volume the same way the DAGs volume is defined, i.e. backed by a PVC.
2. Modify the pod template section at the end of `values.yaml` to enable persistence for logs, mirroring the setting already present for DAGs, but change the PVC claim to the one backed by the Azure File share (a sketch of the resulting values follows this list).
3. Deploy Airflow with the Helm chart using the modified deployment YAML templates and `values.yaml`.
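For illustration, the customization in steps 1–2 amounts to a `logs.persistence` block mirroring the chart's existing `dags.persistence` block. A minimal sketch of the added values, assuming a pre-provisioned claim (the claim name is hypothetical, and this block only works together with the template changes from step 1):

```yaml
# Hypothetical values.yaml addition mirroring dags.persistence; this is
# our customization, not a stock option of the chart we deployed.
logs:
  persistence:
    enabled: true
    # Pre-provisioned PVC backed by the Azure File share (name is ours):
    existingClaim: airflow-logs-pvc
```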
**Anything else we need to know**: We are using the Kubernetes executor.