Dubrzr opened a new issue #11206:
URL: https://github.com/apache/airflow/issues/11206


   
   **Apache Airflow version**: 1.10.12
   
   **Environment**: Docker image running Ubuntu 20.04
   
   - **OS** (e.g. from /etc/os-release): Ubuntu 20.04
   - **Kernel** (e.g. `uname -a`):  3.10.0-1127.19.1.el7.x86_64
   - **Install tools**: installed with pip
   
   **What happened**:
   
   Installed Airflow system-wide as root with the following command:
   
   ```
   pip install apache-airflow==1.10.12
   ```
   
   Then I created a user whose home directory lives in a custom location (here `/export/home/someuser`):
   
   ```
   mkdir -p /export/home
   export NEWUID=20000
   export NEWGID=20000
   export USER_HOME=/export/home/someuser
   /usr/sbin/groupadd -g $NEWGID  someuser
   /usr/sbin/useradd -u $NEWUID -d $USER_HOME -m -g $NEWGID someuser
   ```
   
   Then I switched to this new user and ran `initdb`, then the scheduler:
   
   ```
   su someuser
   airflow initdb
   airflow scheduler
   ```
   
   `initdb` runs fine and creates files in `/export/home/someuser/airflow`.
   When running `airflow scheduler`, the following error happens:
   
   ```
     ____________       _____________
    ____    |__( )_________  __/__  /________      __
   ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
   ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
    _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
   [2020-09-30 14:18:23,622] {__init__.py:50} INFO - Using executor 
SequentialExecutor
   [2020-09-30 14:18:23,647] {scheduler_job.py:1367} INFO - Starting the 
scheduler
   [2020-09-30 14:18:23,647] {scheduler_job.py:1375} INFO - Running execute 
loop for -1 seconds
   [2020-09-30 14:18:23,648] {scheduler_job.py:1376} INFO - Processing each 
file at most -1 times
   [2020-09-30 14:18:23,648] {scheduler_job.py:1379} INFO - Searching for files 
in /export/home/someuser/airflow/dags
   [2020-09-30 14:18:23,654] {scheduler_job.py:1381} INFO - There are 25 files 
in /export/home/someuser/airflow/dags
   [2020-09-30 14:18:23,654] {scheduler_job.py:1438} INFO - Resetting orphaned 
tasks for active dag runs
   [2020-09-30 14:18:23,686] {dag_processing.py:562} INFO - Launched 
DagFileProcessorManager with pid: 106
   [2020-09-30 14:18:23,694] {settings.py:55} INFO - Configured default 
timezone <Timezone [UTC]>
   [2020-09-30 14:18:23,706] {dag_processing.py:774} WARNING - Because we 
cannot use more than 1 thread (max_threads = 2) when using sqlite. So we set 
parallelism to 1.
   Process DagFileProcessor0-Process:
   Traceback (most recent call last):
     File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in 
_bootstrap
       self.run()
     File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
       self._target(*self._args, **self._kwargs)
     File 
"/usr/local/lib/python3.8/dist-packages/airflow/jobs/scheduler_job.py", line 
137, in _run_file_processor
       set_context(log, file_path)
     File 
"/usr/local/lib/python3.8/dist-packages/airflow/utils/log/logging_mixin.py", 
line 198, in set_context
       handler.set_context(value)
     File 
"/usr/local/lib/python3.8/dist-packages/airflow/utils/log/file_processor_handler.py",
 line 65, in set_context
       local_loc = self._init_file(filename)
     File 
"/usr/local/lib/python3.8/dist-packages/airflow/utils/log/file_processor_handler.py",
 line 141, in _init_file
       os.makedirs(directory)
     File "/usr/lib/python3.8/os.py", line 213, in makedirs
       makedirs(head, exist_ok=exist_ok)
     File "/usr/lib/python3.8/os.py", line 213, in makedirs
       makedirs(head, exist_ok=exist_ok)
     File "/usr/lib/python3.8/os.py", line 213, in makedirs
       makedirs(head, exist_ok=exist_ok)
     [Previous line repeated 3 more times]
     File "/usr/lib/python3.8/os.py", line 223, in makedirs
       mkdir(name, mode)
   PermissionError: [Errno 13] Permission denied: 
'/export/home/someuser/airflow/logs/scheduler/2020-09-30/../../../../../usr'
   
   ```
   
   The directory permissions are the following:
   
   ```
   namei -l /export/home/someuser/airflow/
   f: /export/home/someuser/airflow/
   drwxr-xr-x root     root     /
   drwxr-xr-x root     root     export
   drwxr-xr-x root     root     home
   drwxr-xr-x someuser someuser someuser
   drwxrwxr-x someuser someuser airflow
   ```
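   
   For what it's worth, the directory that `os.makedirs` fails on resolves, once the `..` components are collapsed, to a path directly under the root-owned `/export/home` shown in the `namei` output above. A quick check with plain Python (the path is copied verbatim from the traceback):
   
   ```
   import os
   
   # Path copied verbatim from the PermissionError in the traceback above
   failing_dir = ('/export/home/someuser/airflow/logs/scheduler/2020-09-30'
                  '/../../../../../usr')
   print(os.path.normpath(failing_dir))
   # -> /export/home/usr : creating it needs write access to the root-owned /export/home
   ```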
   
   **BUT:** when you follow the exact same steps with a home directory in `/home/someuser` instead of `/export/home/someuser`, it works fine, which makes no sense to me:
   
   
   ```
   export NEWUID=20000
   export NEWGID=20000
   export USER_HOME=/home/someuser
   /usr/sbin/groupadd -g $NEWGID  someuser
   /usr/sbin/useradd -u $NEWUID -d $USER_HOME -m -g $NEWGID someuser
   ```
   
   Then I switched to this new user and ran `initdb`, then the scheduler:
   
   ```
   su someuser
   airflow initdb
   airflow scheduler
   ```
   
   `initdb` runs fine and creates files in `/home/someuser/airflow`.
   When running `airflow scheduler`, no error occurs:
   
   ```
   
     ____________       _____________
    ____    |__( )_________  __/__  /________      __
   ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
   ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
    _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
   [2020-09-30 15:20:33,262] {__init__.py:50} INFO - Using executor 
SequentialExecutor
   [2020-09-30 15:20:33,267] {scheduler_job.py:1367} INFO - Starting the 
scheduler
   [2020-09-30 15:20:33,267] {scheduler_job.py:1375} INFO - Running execute 
loop for -1 seconds
   [2020-09-30 15:20:33,267] {scheduler_job.py:1376} INFO - Processing each 
file at most -1 times
   [2020-09-30 15:20:33,268] {scheduler_job.py:1379} INFO - Searching for files 
in /home/someuser/airflow/dags
   [2020-09-30 15:20:33,274] {scheduler_job.py:1381} INFO - There are 25 files 
in /home/someuser/airflow/dags
   [2020-09-30 15:20:33,274] {scheduler_job.py:1438} INFO - Resetting orphaned 
tasks for active dag runs
   [2020-09-30 15:20:33,298] {dag_processing.py:562} INFO - Launched 
DagFileProcessorManager with pid: 157
   [2020-09-30 15:20:33,303] {settings.py:55} INFO - Configured default 
timezone <Timezone [UTC]>
   [2020-09-30 15:20:33,316] {dag_processing.py:774} WARNING - Because we 
cannot use more than 1 thread (max_threads = 2) when using sqlite. So we set 
parallelism to 1.
   
   ```
   
   The directory permissions are the following:
   
   ```
   namei -l /home/someuser/airflow/
   f: /home/someuser/airflow/
   drwxr-xr-x root     root     /
   drwxr-xr-x root     root     home
   drwxr-xr-x someuser someuser someuser
   drwxrwxr-x someuser someuser airflow
   ```
   
   
   
   **What you expected to happen**:
   
   Something is probably wrong [here](https://github.com/apache/airflow/blob/v1-10-stable/airflow/utils/log/file_processor_handler.py#L135) when constructing relative paths: `_render_filename` takes as input `filename=/usr/local/lib/python3.8/dist-packages/airflow/example_dags/example_docker_swarm_operator.py`, and [here](https://github.com/apache/airflow/blob/v1-10-stable/airflow/utils/log/file_processor_handler.py#L87) `filename = os.path.relpath(filename, self.dag_dir)` takes `self.dag_dir=/home/airflow/airflow_data/dags` as the second argument, which results in `filename=../../../../../usr/local/lib/python3.8/dist-packages/airflow/example_dags/example_docker_swarm_operator.py`, which is probably not a correct path.
   
   And I really don't get why it works with a user in `/home`, but not with a user in `/export/home`.
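   
   Here is a minimal sketch (plain Python, outside Airflow) of what I suspect is going on; the dags folder, the `logs/scheduler/<date>` layout and the example DAG path are assumptions taken from the two runs above, not the actual Airflow code:
   
   ```
   import os
   
   # One of the example DAGs shipped inside site-packages (from the traceback above)
   example_dag = ('/usr/local/lib/python3.8/dist-packages/airflow/'
                  'example_dags/example_docker_swarm_operator.py')
   
   for dag_dir in ('/export/home/someuser/airflow/dags',
                   '/home/someuser/airflow/dags'):
       # Assumed log layout: <AIRFLOW_HOME>/logs/scheduler/<date>
       log_base = os.path.join(os.path.dirname(dag_dir), 'logs', 'scheduler', '2020-09-30')
       rel = os.path.relpath(example_dag, dag_dir)   # starts with a run of '..'
       target = os.path.normpath(os.path.dirname(os.path.join(log_base, rel)))
       print(dag_dir, '->', target)
   
   # /export/home/someuser/airflow/dags -> /export/home/usr/...   (root-owned, mkdir fails)
   # /home/someuser/airflow/dags        -> /home/someuser/usr/... (writable, mkdir succeeds)
   ```
   
   The number of `..` components equals the depth of the dags folder, so with a two-level home (`/home/someuser`) they collapse back into the user's own, writable home directory, while with `/export/home/someuser` they land in the root-owned `/export/home`, which would explain the `PermissionError`.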
   
   **How to reproduce it**:
   
   To reproduce, build the following Docker image:
   
   ```
   ARG BASE_IMAGE="ubuntu:20.04"
   FROM ${BASE_IMAGE} as min
   
   SHELL ["/bin/bash", "-o", "pipefail", "-e", "-u", "-x", "-c"]
   
   ARG AIRFLOW_VERSION="1.10.12"
   ENV AIRFLOW_VERSION=$AIRFLOW_VERSION
   
   # Make sure noninteractive debian install is used and language variables set
   ENV DEBIAN_FRONTEND=noninteractive LANGUAGE=C.UTF-8 LANG=C.UTF-8 
LC_ALL=C.UTF-8 \
       LC_CTYPE=C.UTF-8 LC_MESSAGES=C.UTF-8
   
   # Install basic and additional apt dependencies
   RUN apt-get update \
       && apt-get install -y --no-install-recommends \
              python3 \
              python3-pip \
              python3-dev \
              apt-utils \
              build-essential \
              freetds-bin \
              krb5-user \
              ldap-utils \
              libffi7 \
              libsasl2-2 \
              libsasl2-modules \
              libssl1.1 \
              locales  \
              lsb-release \
              sasl2-bin \
              sqlite3 \
              unixodbc \
       && apt-get autoremove -yqq --purge \
       && apt-get clean \
       && rm -rf /var/lib/apt/lists/*
   
   # Setup PIP
   # By default PIP install run without cache to make image smaller
   ARG PIP_NO_CACHE_DIR="true"
   ENV PIP_NO_CACHE_DIR=${PIP_NO_CACHE_DIR}
   RUN echo "Pip no cache dir: ${PIP_NO_CACHE_DIR}"
   
   RUN http_proxy=$http_proxy https_proxy=$http_proxy pip3 install 
apache-airflow==$AIRFLOW_VERSION
   
   ENTRYPOINT ["/bin/bash"]
   ```
   
   ```
   docker build -t airflow_1.10.12 .
   ```
   
   Then run the image:
   
   ```
   docker run  --name=airflow_test -it airflow_1.10.12
   ```
   
   Then execute the commands above inside the container.
   
   Thanks for your help.

