This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch update-2-3-0-doc
in repository https://gitbox.apache.org/repos/asf/airflow-site.git
commit 1852a16126da489007b9dd051a158c173525c929
Author: Ephraim Anierobi <splendidzig...@gmail.com>
AuthorDate: Wed May 4 19:20:56 2022 +0100

    Remove references to python 3.6 from doc
---
 .../2.3.0/_sources/start/local.rst.txt             |  4 +-
 .../_sources/upgrading-from-1-10/index.rst.txt     |  2 +-
 docs-archive/docker-stack/_sources/build.rst.txt   | 12 +--
 docs-archive/docker-stack/_static/_gen/js/docs.js  |  2 +-
 docs-archive/docker-stack/build-arg-ref.html       | 14 ++--
 docs-archive/docker-stack/build.html               | 96 +++++++++++-----------
 docs-archive/docker-stack/changelog.html           |  4 +-
 docs-archive/docker-stack/entrypoint.html          | 40 ++++-----
 docs-archive/docker-stack/index.html               | 10 +--
 docs-archive/docker-stack/searchindex.js           |  2 +-
 10 files changed, 93 insertions(+), 93 deletions(-)

diff --git a/docs-archive/apache-airflow/2.3.0/_sources/start/local.rst.txt b/docs-archive/apache-airflow/2.3.0/_sources/start/local.rst.txt
index 0c786369e..b617d095f 100644
--- a/docs-archive/apache-airflow/2.3.0/_sources/start/local.rst.txt
+++ b/docs-archive/apache-airflow/2.3.0/_sources/start/local.rst.txt
@@ -49,9 +49,9 @@ constraint files to enable reproducible installation, so using ``pip`` and const
     # Install Airflow using the constraints file
     AIRFLOW_VERSION=|version|
     PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
-    # For example: 3.6
+    # For example: 3.7
     CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
-    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-|version|/constraints-3.6.txt
+    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-|version|/constraints-3.7.txt
     pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"

     # The Standalone command will initialise the database, make a user,

diff --git a/docs-archive/apache-airflow/2.3.0/_sources/upgrading-from-1-10/index.rst.txt b/docs-archive/apache-airflow/2.3.0/_sources/upgrading-from-1-10/index.rst.txt
index c9cfd8d64..76bffbd2f 100644
--- a/docs-archive/apache-airflow/2.3.0/_sources/upgrading-from-1-10/index.rst.txt
+++ b/docs-archive/apache-airflow/2.3.0/_sources/upgrading-from-1-10/index.rst.txt
@@ -31,7 +31,7 @@ Step 1: Switch to Python 3
 ''''''''''''''''''''''''''

 Airflow 1.10 was the last release series to support Python 2. Airflow 2.0.0
-requires Python 3.6+ and has been tested with Python versions 3.6, 3.7 and 3.8.
+requires Python 3.7+ and has been tested with Python versions 3.6, 3.7 and 3.8.
 Python 3.9 support was added from Airflow 2.1.2. If you have a specific task that
 still requires Python 2 then you can use the :class:`~airflow.operators.python.PythonVirtualenvOperator`
 or the ``KubernetesPodOperator`` for this.

diff --git a/docs-archive/docker-stack/_sources/build.rst.txt b/docs-archive/docker-stack/_sources/build.rst.txt
index a28d162ce..808cbc202 100644
--- a/docs-archive/docker-stack/_sources/build.rst.txt
+++ b/docs-archive/docker-stack/_sources/build.rst.txt
@@ -550,16 +550,16 @@ Building from PyPI packages

 This is the basic way of building the custom images from sources.

-The following example builds the production image in version ``3.6`` with latest PyPI-released Airflow,
-with default set of Airflow extras and dependencies. The ``2.0.2`` constraints are used automatically.
+The following example builds the production image in version ``3.7`` with latest PyPI-released Airflow,
+with default set of Airflow extras and dependencies. The latest PyPI-released Airflow constraints are used automatically.

 .. exampleinclude:: docker-examples/customizing/stable-airflow.sh
     :language: bash
     :start-after: [START build]
     :end-before: [END build]

-The following example builds the production image in version ``3.7`` with default extras from ``2.0.2`` PyPI
-package. The ``2.0.2`` constraints are used automatically.
+The following example builds the production image in version ``3.7`` with default extras from ``2.3.0`` Airflow
+package. The ``2.3.0`` constraints are used automatically.

 .. exampleinclude:: docker-examples/customizing/pypi-selected-version.sh
     :language: bash
@@ -567,7 +567,7 @@ package. The ``2.0.2`` constraints are used automatically.
     :end-before: [END build]

 The following example builds the production image in version ``3.8`` with additional airflow extras
-(``mssql,hdfs``) from ``2.0.2`` PyPI package, and additional dependency (``oauth2client``).
+(``mssql,hdfs``) from ``2.3.0`` PyPI package, and additional dependency (``oauth2client``).

 .. exampleinclude:: docker-examples/customizing/pypi-extras-and-deps.sh
     :language: bash
@@ -593,7 +593,7 @@ have more complex dependencies to build.
 Building optimized images
 .........................

-The following example the production image in version ``3.6`` with additional airflow extras from ``2.0.2``
+The following example the production image in version ``3.7`` with additional airflow extras from ``2.0.2``
 PyPI package but it includes additional apt dev and runtime dependencies.
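The constraint mechanism that these hunks keep referring to (quickstart install and image builds alike) can be sketched end to end. This is a minimal illustrative shell sketch, not part of the commit: the Airflow version and the interpreter banner below are hard-coded assumptions, whereas in a real shell `PYTHON_VERSION` would come from `python --version` exactly as in the quickstart snippet above.

```shell
# Hedged sketch: deriving the per-Python-version constraints URL.
# Hard-coded example values; real usage derives them from the environment.
AIRFLOW_VERSION="2.3.0"
RAW_VERSION="Python 3.7.13"   # what `python --version` might print

# Keep only the major.minor part, as in the quickstart's cut pipeline.
PYTHON_VERSION="$(echo "${RAW_VERSION}" | cut -d " " -f 2 | cut -d "." -f 1-2)"

CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
echo "${PYTHON_VERSION}"
echo "${CONSTRAINT_URL}"

# The reproducible install step would then be:
#   pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
```

Pinning against such a constraints file is what makes the installs in this diff reproducible: the constraint file freezes every transitive dependency for that exact Airflow/Python pair.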
The dev dependencies are those that require ``build-essential`` and usually need to involve recompiling diff --git a/docs-archive/docker-stack/_static/_gen/js/docs.js b/docs-archive/docker-stack/_static/_gen/js/docs.js index 57efdbeca..90898d86d 100644 --- a/docs-archive/docker-stack/_static/_gen/js/docs.js +++ b/docs-archive/docker-stack/_static/_gen/js/docs.js @@ -1 +1 @@ -!function(r){var n={};function o(t){if(n[t])return n[t].exports;var e=n[t]={i:t,l:!1,exports:{}};return r[t].call(e.exports,e,e.exports,o),e.l=!0,e.exports}o.m=r,o.c=n,o.d=function(t,e,r){o.o(t,e)||Object.defineProperty(t,e,{enumerable:!0,get:r})},o.r=function(t){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(t,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(t,"__esModule",{value:!0})},o.t=function(e,t){if(1&t&&(e=o(e)),8&t)return e;if(4&t&&"object"==typ [...] \ No newline at end of file +!function(r){var n={};function o(t){if(n[t])return n[t].exports;var e=n[t]={i:t,l:!1,exports:{}};return r[t].call(e.exports,e,e.exports,o),e.l=!0,e.exports}o.m=r,o.c=n,o.d=function(t,e,r){o.o(t,e)||Object.defineProperty(t,e,{enumerable:!0,get:r})},o.r=function(t){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(t,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(t,"__esModule",{value:!0})},o.t=function(e,t){if(1&t&&(e=o(e)),8&t)return e;if(4&t&&"object"==typ [...] 
\ No newline at end of file diff --git a/docs-archive/docker-stack/build-arg-ref.html b/docs-archive/docker-stack/build-arg-ref.html index 6c5e90e67..5a75ba653 100644 --- a/docs-archive/docker-stack/build-arg-ref.html +++ b/docs-archive/docker-stack/build-arg-ref.html @@ -687,7 +687,7 @@ for examples of using those arguments.</p> <tbody> <tr class="row-even"><td><p><code class="docutils literal notranslate"><span class="pre">UPGRADE_TO_NEWER_DEPENDENCIES</span></code></p></td> <td><p><code class="docutils literal notranslate"><span class="pre">false</span></code></p></td> -<td><p>If set to a value different than "false" +<td><p>If set to a value different than “false” the dependencies are upgraded to newer versions. In CI it is set to build id to make sure subsequent builds are not @@ -806,7 +806,7 @@ be useful if you need to build Airflow in environments that require high levels <code class="docutils literal notranslate"><span class="pre">apache-airflow</span></code> for installation from PyPI. It can be GitHub repository URL including branch or tag to install from -that repository or "." to install from +that repository or “.” to install from local sources. Installing from sources requires appropriate values of the <code class="docutils literal notranslate"><span class="pre">AIRFLOW_SOURCES_FROM</span></code> and @@ -815,27 +815,27 @@ below)</p></td> </tr> <tr class="row-odd"><td><p><code class="docutils literal notranslate"><span class="pre">AIRFLOW_SOURCES_FROM</span></code></p></td> <td><p><code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code></p></td> -<td><p>Sources of Airflow. Set it to "." when +<td><p>Sources of Airflow. 
Set it to “.” when you install Airflow from local sources</p></td> </tr> <tr class="row-even"><td><p><code class="docutils literal notranslate"><span class="pre">AIRFLOW_SOURCES_TO</span></code></p></td> <td><p><code class="docutils literal notranslate"><span class="pre">/Dockerfile</span></code></p></td> <td><p>Target for Airflow sources. Set to -"/opt/airflow" when you install Airflow +“/opt/airflow” when you install Airflow from local sources.</p></td> </tr> <tr class="row-odd"><td><p><code class="docutils literal notranslate"><span class="pre">AIRFLOW_SOURCES_WWW_FROM</span></code></p></td> <td><p><code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code></p></td> <td><p>Sources of Airflow WWW files used for asset compilation. Set it to -"./airflow/www" when +“./airflow/www” when you install Airflow from local sources</p></td> </tr> <tr class="row-even"><td><p><code class="docutils literal notranslate"><span class="pre">AIRFLOW_SOURCES_WWW_TO</span></code></p></td> <td><p><code class="docutils literal notranslate"><span class="pre">/Dockerfile</span></code></p></td> <td><p>Target for Airflow files used for asset compilation. Set it to -"/opt/airflow/airflow/www" when +“/opt/airflow/airflow/www” when you install Airflow from local sources.</p></td> </tr> <tr class="row-odd"><td><p><code class="docutils literal notranslate"><span class="pre">AIRFLOW_VERSION_SPECIFICATION</span></code></p></td> @@ -858,7 +858,7 @@ installing from PyPI or GitHub repo.</p></td> source of the constraints with the specified URL or file. 
Note that the file has to be in Docker context so -it's best to place such file in +it’s best to place such file in one of the folders included in <code class="docutils literal notranslate"><span class="pre">.dockerignore</span></code> file.</p></td> </tr> diff --git a/docs-archive/docker-stack/build.html b/docs-archive/docker-stack/build.html index 2710a79aa..c1e43ad1a 100644 --- a/docs-archive/docker-stack/build.html +++ b/docs-archive/docker-stack/build.html @@ -639,7 +639,7 @@ starting your containers, but this is a bad idea for multiple reasons - starting and ending with the extra time needed to install those packages - which has to happen every time every container starts. The only viable way to deal with new dependencies and requirements in production is to build and use your own image. You should only use installing dependencies dynamically in case of -"hobbyist" and "quick start" scenarios when you want to iterate quickly to try things out and later +“hobbyist” and “quick start” scenarios when you want to iterate quickly to try things out and later replace it with your own images.</p> </div> <div class="section" id="how-to-build-your-own-image"> @@ -681,7 +681,7 @@ method your image will be deployed. This can be set for example as image name in <li><p>[Optional] Test the image. Airflow contains tool that allows you to test the image. This step however, requires locally checked out or extracted Airflow sources. If you happen to have the sources you can test the image by running this command (in airflow root folder). 
The output will tell you if the image -is "good-to-go".</p></li> +is “good-to-go”.</p></li> </ol> <div class="highlight-shell notranslate"><div class="highlight"><pre><span></span>./scripts/ci/tools/verify_docker_image.sh PROD my-image:0.0.1 </pre></div> @@ -690,16 +690,16 @@ is "good-to-go".</p></li> <li><p>Once you build the image locally you have usually several options to make them available for your deployment:</p></li> </ol> <ul class="simple"> -<li><p>For <code class="docutils literal notranslate"><span class="pre">docker-compose</span></code> deployment, if you've already built your image, and want to continue +<li><p>For <code class="docutils literal notranslate"><span class="pre">docker-compose</span></code> deployment, if you’ve already built your image, and want to continue building the image manually when needed with <code class="docutils literal notranslate"><span class="pre">docker</span> <span class="pre">build</span></code>, you can edit the -docker-compose.yaml and replace the "apache/airflow:<version>" image with the -image you've just built <code class="docutils literal notranslate"><span class="pre">my-image:0.0.1</span></code> - it will be used from your local Docker +docker-compose.yaml and replace the “apache/airflow:<version>” image with the +image you’ve just built <code class="docutils literal notranslate"><span class="pre">my-image:0.0.1</span></code> - it will be used from your local Docker Engine cache. You can also simply set <code class="docutils literal notranslate"><span class="pre">AIRFLOW_IMAGE_NAME</span></code> variable to point to your image and <code class="docutils literal notranslate"><span class="pre">docker-compose</span></code> will use it automatically without having to modify the file.</p></li> <li><p>Also for <code class="docutils literal notranslate"><span class="pre">docker-compose</span></code> deployment, you can delegate image building to the docker-compose. 
-To do that - open your <code class="docutils literal notranslate"><span class="pre">docker-compose.yaml</span></code> file and search for the phrase "In order to add custom dependencies". -Follow these instructions of commenting the "image" line and uncommenting the "build" line. +To do that - open your <code class="docutils literal notranslate"><span class="pre">docker-compose.yaml</span></code> file and search for the phrase “In order to add custom dependencies”. +Follow these instructions of commenting the “image” line and uncommenting the “build” line. This is a standard docker-compose feature and you can read about it in <a class="reference external" href="https://docs.docker.com/compose/reference/build/">Docker Compose build reference</a>. Run <code class="docutils literal notranslate"><span class="pre">docker-compose</span> <span class="pre">build</span></code> to build the images. Similarly as in the previous case, the @@ -813,7 +813,7 @@ how you want to build your image.</p> </tr> </thead> <tbody> -<tr class="row-even"><td><p>Uses familiar 'FROM ' pattern of image building</p></td> +<tr class="row-even"><td><p>Uses familiar ‘FROM ‘ pattern of image building</p></td> <td><p>Yes</p></td> <td><p>No</p></td> </tr> @@ -839,10 +839,10 @@ how you want to build your image.</p> </tr> </tbody> </table> -<p>TL;DR; If you have a need to build custom image, it is easier to start with "Extending" however if your +<p>TL;DR; If you have a need to build custom image, it is easier to start with “Extending” however if your dependencies require compilation step or when your require to build the image from security vetted -packages, switching to "Customizing" the image provides much more optimized images. In the example further -where we compare equivalent "Extending" and "Customizing" the image, similar images build by +packages, switching to “Customizing” the image provides much more optimized images. 
In the example further +where we compare equivalent “Extending” and “Customizing” the image, similar images build by Extending vs. Customization had shown 1.1GB vs 874MB image sizes respectively - with 20% improvement in size of the Customized image.</p> <div class="admonition note"> @@ -851,9 +851,9 @@ size of the Customized image.</p> optimized base image first using <code class="docutils literal notranslate"><span class="pre">customization</span></code> method (for example by your admin team) with all the heavy compilation required dependencies and you can publish it in your registry and let others <code class="docutils literal notranslate"><span class="pre">extend</span></code> your image using <code class="docutils literal notranslate"><span class="pre">FROM</span></code> and add their own lightweight dependencies. This reflects well -the split where typically "Casual" users will Extend the image and "Power-users" will customize it.</p> +the split where typically “Casual” users will Extend the image and “Power-users” will customize it.</p> </div> -<p>Airflow Summit 2020's <a class="reference external" href="https://youtu.be/wDr3Y7q2XoI">Production Docker Image</a> talk provides more +<p>Airflow Summit 2020’s <a class="reference external" href="https://youtu.be/wDr3Y7q2XoI">Production Docker Image</a> talk provides more details about the context, architecture and customization/extension methods for the Production Image.</p> </div> <div class="section" id="extending-the-image"> @@ -862,18 +862,18 @@ details about the context, architecture and customization/extension methods for compiling. The compilation framework of Linux (so called <code class="docutils literal notranslate"><span class="pre">build-essential</span></code>) is pretty big, and for the production images, size is really important factor to optimize for, so our Production Image does not contain <code class="docutils literal notranslate"><span class="pre">build-essential</span></code>. 
If you need compiler like gcc or g++ or make/cmake etc. - those -are not found in the image and it is recommended that you follow the "customize" route instead.</p> +are not found in the image and it is recommended that you follow the “customize” route instead.</p> <p>How to extend the image - it is something you are most likely familiar with - simply -build a new image using Dockerfile's <code class="docutils literal notranslate"><span class="pre">FROM</span></code> directive and add whatever you need. Then you can add your +build a new image using Dockerfile’s <code class="docutils literal notranslate"><span class="pre">FROM</span></code> directive and add whatever you need. Then you can add your Debian dependencies with <code class="docutils literal notranslate"><span class="pre">apt</span></code> or PyPI dependencies with <code class="docutils literal notranslate"><span class="pre">pip</span> <span class="pre">install</span></code> or any other stuff you need.</p> <p>You should be aware, about a few things:</p> <ul class="simple"> -<li><p>The production image of airflow uses "airflow" user, so if you want to add some of the tools +<li><p>The production image of airflow uses “airflow” user, so if you want to add some of the tools as <code class="docutils literal notranslate"><span class="pre">root</span></code> user, you need to switch to it with <code class="docutils literal notranslate"><span class="pre">USER</span></code> directive of the Dockerfile and switch back to <code class="docutils literal notranslate"><span class="pre">airflow</span></code> user when you are done. 
Also you should remember about following the <a class="reference external" href="https://docs.docker.com/develop/develop-images/dockerfile_best-practices/">best practices of Dockerfiles</a> to make sure your image is lean and small.</p></li> -<li><p>The PyPI dependencies in Apache Airflow are installed in the user library, of the "airflow" user, so +<li><p>The PyPI dependencies in Apache Airflow are installed in the user library, of the “airflow” user, so PIP packages are installed to <code class="docutils literal notranslate"><span class="pre">~/.local</span></code> folder as if the <code class="docutils literal notranslate"><span class="pre">--user</span></code> flag was specified when running PIP. Note also that using <code class="docutils literal notranslate"><span class="pre">--no-cache-dir</span></code> is a good idea that can help to make your image smaller.</p></li> </ul> @@ -885,7 +885,7 @@ variable to <code class="docutils literal notranslate"><span class="pre">true</s </div> <ul class="simple"> <li><p>If your apt, or PyPI dependencies require some of the <code class="docutils literal notranslate"><span class="pre">build-essential</span></code> or other packages that need -to compile your python dependencies, then your best choice is to follow the "Customize the image" route, +to compile your python dependencies, then your best choice is to follow the “Customize the image” route, because you can build a highly-optimized (for size) image this way. However it requires you to use the Dockerfile that is released as part of Apache Airflow sources (also available at <a class="reference external" href="https://github.com/apache/airflow/blob/main/Dockerfile">Dockerfile</a>)</p></li> @@ -1015,7 +1015,7 @@ running the container.</p> </div> </div> <p>The size of this image is ~ 1.1 GB when build. 
As you will see further, you can achieve 20% reduction in -size of the image in case you use "Customizing" rather than "Extending" the image.</p> +size of the image in case you use “Customizing” rather than “Extending” the image.</p> </div> <div class="section" id="example-when-you-want-to-embed-dags"> <h3>Example when you want to embed DAGs<a class="headerlink" href="#example-when-you-want-to-embed-dags" title="Permalink to this headline">¶</a></h3> @@ -1068,7 +1068,7 @@ size of the image in case you use "Customizing" rather than "Exte <p>BREAKING CHANGE! As of Airflow 2.3.0 you need to use <a class="reference external" href="https://docs.docker.com/develop/develop-images/build_enhancements/">Buildkit</a> to build customized Airflow Docker image. We are using new features of Building (and <code class="docutils literal notranslate"><span class="pre">dockerfile:1.4</span></code> syntax) -to make our image faster to build and "standalone" - i.e. not needing any extra files from +to make our image faster to build and “standalone” - i.e. not needing any extra files from Airflow in order to be build. As of Airflow 2.3.0, the <code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code> that is released with Airflow does not need any extra folders or files and can be copied and used from any folder. 
Previously you needed to copy Airflow sources together with the Dockerfile as some scripts were @@ -1104,17 +1104,17 @@ the arg (see <a class="reference internal" href="#using-docker-context-files"><s <p>Customizing the image is an optimized way of adding your own dependencies to the image - better suited to prepare highly optimized (for size) production images, especially when you have dependencies that require to be compiled before installing (such as <code class="docutils literal notranslate"><span class="pre">mpi4py</span></code>).</p> -<p>It also allows more sophisticated usages, needed by "Power-users" - for example using forked version +<p>It also allows more sophisticated usages, needed by “Power-users” - for example using forked version of Airflow, or building the images from security-vetted sources.</p> <p>The big advantage of this method is that it produces optimized image even if you need some compile-time dependencies that are not needed in the final image.</p> <p>The disadvantage it that building the image takes longer and it requires you to use the Dockerfile that is released as part of Apache Airflow sources.</p> <p>The disadvantage is that the pattern of building Docker images with <code class="docutils literal notranslate"><span class="pre">--build-arg</span></code> is less familiar -to developers of such images. However it is quite well-known to "power-users". That's why the +to developers of such images. However it is quite well-known to “power-users”. 
That’s why the customizing flow is better suited for those users who have more familiarity and have more custom requirements.</p> -<p>The image also usually builds much longer than the equivalent "Extended" image because instead of +<p>The image also usually builds much longer than the equivalent “Extended” image because instead of extending the layers that are already coming from the base image, it rebuilds the layers needed to add extra dependencies needed at early stages of image building.</p> <p>When customizing the image you can choose a number of options how you install Airflow:</p> @@ -1129,7 +1129,7 @@ want to release the custom Airflow version to PyPI.</p></li> particularly useful if you want to build Airflow in a highly-secure environment where all such packages must be vetted by your security team and stored in your private artifact registry. This also allows to build airflow image in an air-gaped environment.</p></li> -<li><p>Side note. Building <code class="docutils literal notranslate"><span class="pre">Airflow</span></code> in an <code class="docutils literal notranslate"><span class="pre">air-gaped</span></code> environment sounds pretty funny, doesn't it?</p></li> +<li><p>Side note. 
Building <code class="docutils literal notranslate"><span class="pre">Airflow</span></code> in an <code class="docutils literal notranslate"><span class="pre">air-gaped</span></code> environment sounds pretty funny, doesn’t it?</p></li> </ul> <p>You can also add a range of customizations while building the image:</p> <ul class="simple"> @@ -1154,7 +1154,7 @@ version of Airflow you use.</p> <p>You can also download any version of Airflow constraints and adapt it with your own set of constraints and manually set your own versions of dependencies in your own constraints and use the version of constraints that you manually prepared.</p> -<p>You can read more about constraints in <a class="reference external" href="/docs/apache-airflow/stable/installation/installing-from-pypi.html" title="(in apache-airflow v2.3.0.dev0)"><span>Installation from PyPI</span></a></p> +<p>You can read more about constraints in <a class="reference external" href="/docs/apache-airflow/stable/installation/installing-from-pypi.html" title="(in apache-airflow v2.4.0.dev0)"><span>Installation from PyPI</span></a></p> <p>Note that if you place <code class="docutils literal notranslate"><span class="pre">requirements.txt</span></code> in the <code class="docutils literal notranslate"><span class="pre">docker-context-files</span></code> folder, it will be used to install all requirements declared there. 
It is recommended that the file contains specified version of dependencies to add with <code class="docutils literal notranslate"><span class="pre">==</span></code> version specifier, to achieve @@ -1226,7 +1226,7 @@ docker run -it my-beautifulsoup4-airflow:0.0.1 python -c <span class="s1">'i </div> <ul class="simple"> <li><p>you can place <code class="docutils literal notranslate"><span class="pre">.whl</span></code> packages that you downloaded and install them with -<code class="docutils literal notranslate"><span class="pre">INSTALL_PACKAGES_FROM_CONTEXT</span></code> set to <code class="docutils literal notranslate"><span class="pre">true</span></code> . It's useful if you build the image in +<code class="docutils literal notranslate"><span class="pre">INSTALL_PACKAGES_FROM_CONTEXT</span></code> set to <code class="docutils literal notranslate"><span class="pre">true</span></code> . It’s useful if you build the image in restricted security environments (see: <a class="reference internal" href="#image-build-secure-environments"><span class="std std-ref">Build images in security restricted environments</span></a> for details):</p></li> </ul> <div class="example-block-wrapper docutils container"> @@ -1255,7 +1255,7 @@ pip download --dest docker-context-files <span class="se">\</span> in main directory without creating a dedicated folder, however this is a good practice to keep any files that you copy to the image context in a sub-folder. This makes it easier to separate things that are used on the host from those that are passed in Docker context. 
Of course, by default when you run -<code class="docutils literal notranslate"><span class="pre">docker</span> <span class="pre">build</span> <span class="pre">.</span></code> the whole folder is available as "Docker build context" and sent to the docker +<code class="docutils literal notranslate"><span class="pre">docker</span> <span class="pre">build</span> <span class="pre">.</span></code> the whole folder is available as “Docker build context” and sent to the docker engine, but the <code class="docutils literal notranslate"><span class="pre">DOCKER_CONTEXT_FILES</span></code> are always copied to the <code class="docutils literal notranslate"><span class="pre">build</span></code> segment of the image so copying all your local folder might unnecessarily increase time needed to build the image and your cache will be invalidated every time any of the files in your local folder change.</p> @@ -1269,7 +1269,7 @@ in order to enable <code class="docutils literal notranslate"><span class="pre"> the <code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code> that is released with Airflow does not need any extra folders or files and can be copied and used from any folder. Previously you needed to copy Airflow sources together with the Dockerfile as some scripts were needed to make it work. 
With Airflow 2.3.0, we are using <code class="docutils literal notranslate"><span class="pre">Buildkit</span></code> -features that enable us to make the <code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code> a completely standalone file that can be used "as-is".</p> +features that enable us to make the <code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code> a completely standalone file that can be used “as-is”.</p> </div> </div> <div class="section" id="examples-of-image-customizing"> @@ -1277,8 +1277,8 @@ features that enable us to make the <code class="docutils literal notranslate">< <div class="section" id="building-from-pypi-packages"> <span id="image-build-pypi"></span><h3>Building from PyPI packages<a class="headerlink" href="#building-from-pypi-packages" title="Permalink to this headline">¶</a></h3> <p>This is the basic way of building the custom images from sources.</p> -<p>The following example builds the production image in version <code class="docutils literal notranslate"><span class="pre">3.6</span></code> with latest PyPI-released Airflow, -with default set of Airflow extras and dependencies. The <code class="docutils literal notranslate"><span class="pre">2.0.2</span></code> constraints are used automatically.</p> +<p>The following example builds the production image in version <code class="docutils literal notranslate"><span class="pre">3.7</span></code> with latest PyPI-released Airflow, +with default set of Airflow extras and dependencies. 
The latest PyPI-released Airflow constraints are used automatically.</p> <div class="example-block-wrapper docutils container"> <p class="example-header"><span class="example-title">docs/docker-stack/docker-examples/customizing/stable-airflow.sh</span></p> <div class="highlight-bash notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span> <span class="nv">DOCKER_BUILDKIT</span><span class="o">=</span><span class="m">1</span> @@ -1288,11 +1288,11 @@ docker build . <span class="se">\</span> </pre></div> </div> </div> -<p>The following example builds the production image in version <code class="docutils literal notranslate"><span class="pre">3.7</span></code> with default extras from <code class="docutils literal notranslate"><span class="pre">2.0.2</span></code> PyPI -package. The <code class="docutils literal notranslate"><span class="pre">2.0.2</span></code> constraints are used automatically.</p> +<p>The following example builds the production image in version <code class="docutils literal notranslate"><span class="pre">3.7</span></code> with default extras from <code class="docutils literal notranslate"><span class="pre">2.3.0</span></code> Airflow +package. 
The <code class="docutils literal notranslate"><span class="pre">2.3.0</span></code> constraints are used automatically.</p> <div class="example-block-wrapper docutils container"> <p class="example-header"><span class="example-title">docs/docker-stack/docker-examples/customizing/pypi-selected-version.sh</span></p> -<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span> <span class="nv">AIRFLOW_VERSION</span><span class="o">=</span><span class="m">2</span>.2.4 +<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span> <span class="nv">AIRFLOW_VERSION</span><span class="o">=</span><span class="m">2</span>.3.0 <span class="nb">export</span> <span class="nv">DOCKER_BUILDKIT</span><span class="o">=</span><span class="m">1</span> docker build . <span class="se">\</span> @@ -1303,10 +1303,10 @@ docker build . <span class="se">\</span> </div> </div> <p>The following example builds the production image in version <code class="docutils literal notranslate"><span class="pre">3.8</span></code> with additional airflow extras -(<code class="docutils literal notranslate"><span class="pre">mssql,hdfs</span></code>) from <code class="docutils literal notranslate"><span class="pre">2.0.2</span></code> PyPI package, and additional dependency (<code class="docutils literal notranslate"><span class="pre">oauth2client</span></code>).</p> +(<code class="docutils literal notranslate"><span class="pre">mssql,hdfs</span></code>) from <code class="docutils literal notranslate"><span class="pre">2.3.0</span></code> PyPI package, and additional dependency (<code class="docutils literal notranslate"><span class="pre">oauth2client</span></code>).</p> <div class="example-block-wrapper docutils container"> <p class="example-header"><span class="example-title">docs/docker-stack/docker-examples/customizing/pypi-extras-and-deps.sh</span></p> -<div class="highlight-bash notranslate"><div 
class="highlight"><pre><span></span><span class="nb">export</span> <span class="nv">AIRFLOW_VERSION</span><span class="o">=</span><span class="m">2</span>.2.2 +<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span> <span class="nv">AIRFLOW_VERSION</span><span class="o">=</span><span class="m">2</span>.3.0 <span class="nb">export</span> <span class="nv">DEBIAN_VERSION</span><span class="o">=</span><span class="s2">"bullseye"</span> <span class="nb">export</span> <span class="nv">DOCKER_BUILDKIT</span><span class="o">=</span><span class="m">1</span> @@ -1338,19 +1338,19 @@ docker build . <span class="se">\</span> </pre></div> </div> </div> -<p>The above image is equivalent to the "extended" image from the previous chapter but its size is only -874 MB. Compared to the 1.1 GB of the "extended image", this is about 230 MB less, so you can achieve ~20% -improvement in size of the image by using "customization" vs. extension. The saving can increase in case you +<p>The above image is equivalent to the “extended” image from the previous chapter but its size is only +874 MB. Compared to the 1.1 GB of the “extended image”, this is about 230 MB less, so you can achieve ~20% +improvement in size of the image by using “customization” vs. extension.
The saving can increase in case you have more complex dependencies to build.</p> </div> <div class="section" id="building-optimized-images"> <span id="image-build-optimized"></span><h3>Building optimized images<a class="headerlink" href="#building-optimized-images" title="Permalink to this headline">¶</a></h3> -<p>The following example builds the production image in version <code class="docutils literal notranslate"><span class="pre">3.6</span></code> with additional airflow extras from <code class="docutils literal notranslate"><span class="pre">2.0.2</span></code> +<p>The following example builds the production image in version <code class="docutils literal notranslate"><span class="pre">3.7</span></code> with additional airflow extras from <code class="docutils literal notranslate"><span class="pre">2.0.2</span></code> PyPI package but it includes additional apt dev and runtime dependencies.</p> <p>The dev dependencies are those that require <code class="docutils literal notranslate"><span class="pre">build-essential</span></code> and usually need to involve recompiling of some python dependencies so those packages might require some additional DEV dependencies to be present during recompilation. Those packages are not needed at runtime, so we only install them for the -"build" time. They are not installed in the final image, thus producing much smaller images. +“build” time. They are not installed in the final image, thus producing much smaller images. In this case pandas requires recompilation so it also needs gcc and g++ as dev APT dependencies. The <code class="docutils literal notranslate"><span class="pre">jre-headless</span></code> does not require recompiling so it can be installed as the runtime APT dependency.</p> <div class="example-block-wrapper docutils container"> @@ -1398,8 +1398,8 @@ docker build .
<span class="se">\</span> <p>The following example builds the production image with default extras from the latest <code class="docutils literal notranslate"><span class="pre">v2-*-test</span></code> version and constraints are taken from the latest version of the <code class="docutils literal notranslate"><span class="pre">constraints-2-*</span></code> branch in GitHub (for example <code class="docutils literal notranslate"><span class="pre">v2-2-test</span></code> branch matches <code class="docutils literal notranslate"><span class="pre">constraints-2-2</span></code>). -Note that this command might fail occasionally as only the "released version" constraints when building a -version and "main" constraints when building main are guaranteed to work.</p> +Note that this command might fail occasionally as only the “released version” constraints when building a +version and “main” constraints when building main are guaranteed to work.</p> <div class="example-block-wrapper docutils container"> <p class="example-header"><span class="example-title">docs/docker-stack/docker-examples/customizing/github-v2-2-test.sh</span></p> <div class="highlight-bash notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span> <span class="nv">DEBIAN_VERSION</span><span class="o">=</span><span class="s2">"bullseye"</span> @@ -1459,7 +1459,7 @@ are not present in the <code class="docutils literal notranslate"><span class="p <p>Similar results could be achieved by modifying the Dockerfile manually (see below) and injecting the commands needed, but by specifying the customizations via build-args, you avoid the need of synchronizing the changes from future Airflow Dockerfiles. 
Those customizations should work with the -future version of Airflow's official <code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code> at most with minimal modifications of parameter +future version of Airflow’s official <code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code> at most with minimal modifications of parameter names (if any), so using the build command for your customizations makes your custom image more future-proof.</p> </div> @@ -1522,15 +1522,15 @@ first download such constraint file locally and then use <code class="docutils l but in the most likely scenario, those wheel files should be copied from an internal repository of such .whl files. Note that <code class="docutils literal notranslate"><span class="pre">AIRFLOW_VERSION_SPECIFICATION</span></code> is only there for reference; the apache airflow <code class="docutils literal notranslate"><span class="pre">.whl</span></code> file in the right version is part of the <code class="docutils literal notranslate"><span class="pre">.whl</span></code> files downloaded.</p> -<p>Note that 'pip download' will only work on a Linux host as some of the packages need to be compiled from +<p>Note that ‘pip download’ will only work on a Linux host as some of the packages need to be compiled from sources and you cannot install them providing the <code class="docutils literal notranslate"><span class="pre">--platform</span></code> switch. They also need to be downloaded using the same python version as the target image.</p> <p>The <code class="docutils literal notranslate"><span class="pre">pip</span> <span class="pre">download</span></code> might happen in a separate environment. The files can be committed to a separate binary repository and vetted/verified by the security team and used subsequently to build images of Airflow when needed on an air-gapped system.</p> <p>Example of preparing the constraint files and wheel files.
Note that the <code class="docutils literal notranslate"><span class="pre">mysql</span></code> dependency is removed -as <code class="docutils literal notranslate"><span class="pre">mysqlclient</span></code> is installed from Oracle's <code class="docutils literal notranslate"><span class="pre">apt</span></code> repository and if you want to add it, you need -to provide this library from your repository if you want to build the Airflow image in an "air-gapped" system.</p> +as <code class="docutils literal notranslate"><span class="pre">mysqlclient</span></code> is installed from Oracle’s <code class="docutils literal notranslate"><span class="pre">apt</span></code> repository and if you want to add it, you need +to provide this library from your repository if you want to build the Airflow image in an “air-gapped” system.</p> <div class="example-block-wrapper docutils container"> <p class="example-header"><span class="example-title">docs/docker-stack/docker-examples/restricted/restricted_environments.sh</span></p> <div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>mkdir -p docker-context-files @@ -1571,8 +1571,8 @@ client from the Microsoft repositories.</p></li> client from the Postgres repositories.</p></li> </ul> <p>Note that the solution we have for installing python packages from local packages only solves the problem -of "air-gapped" python installation. The Docker image also downloads <code class="docutils literal notranslate"><span class="pre">apt</span></code> dependencies and <code class="docutils literal notranslate"><span class="pre">node-modules</span></code>. -Those types of dependencies are however more likely to be available in your "air-gapped" system via transparent +of “air-gapped” python installation. The Docker image also downloads <code class="docutils literal notranslate"><span class="pre">apt</span></code> dependencies and <code class="docutils literal notranslate"><span class="pre">node-modules</span></code>.
+Those types of dependencies are however more likely to be available in your “air-gapped” system via transparent proxies and it should automatically reach out to your private registries, however in the future the solution might be applied to both of those installation steps.</p> <p>You can also use techniques described in the previous chapter to make <code class="docutils literal notranslate"><span class="pre">docker</span> <span class="pre">build</span></code> use your private diff --git a/docs-archive/docker-stack/changelog.html b/docs-archive/docker-stack/changelog.html index e0095bcd3..afbd33fc8 100644 --- a/docs-archive/docker-stack/changelog.html +++ b/docs-archive/docker-stack/changelog.html @@ -591,7 +591,7 @@ we try to avoid those).</p> <ul class="simple"> <li><p>2.3.0</p> <ul> -<li><p>Airflow 2.3 <code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code> is now better optimized for caching and "standalone" which means that you +<li><p>Airflow 2.3 <code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code> is now better optimized for caching and “standalone” which means that you can copy <strong>just</strong> the <code class="docutils literal notranslate"><span class="pre">Dockerfile</span></code> to any folder and start building custom images. This however requires <a class="reference external" href="https://docs.docker.com/develop/develop-images/build_enhancements/">Buildkit</a> to build the image because we started using features that are only available in <code class="docutils literal notranslate"><span class="pre">Buildkit</span></code>. @@ -722,7 +722,7 @@ containing the new signing key.</p> <p>There were no changes in the behaviour of the 2.0.2 image due to that. Detailed <a class="reference external" href="https://github.com/apache/airflow/issues/20911">issue here</a>.
Only 2.0.2 image was regenerated, as 2.0.1 and 2.0.0 versions are hardly used and it is unlikely someone -would like to extend those images. Extending 2.0.1 and 2.0.0 images will lead to failures of "missing key".</p> +would like to extend those images. Extending 2.0.1 and 2.0.0 images will lead to failures of “missing key”.</p> </li> <li><dl class="simple"> <dt>2.0.2</dt><dd><ul class="simple"> diff --git a/docs-archive/docker-stack/entrypoint.html b/docs-archive/docker-stack/entrypoint.html index ddd179a68..e98501137 100644 --- a/docs-archive/docker-stack/entrypoint.html +++ b/docs-archive/docker-stack/entrypoint.html @@ -580,7 +580,7 @@ <p>If you are using the default entrypoint of the production image, there are a few actions that are automatically performed when the container starts. In some cases, you can pass environment variables to the image to trigger some of that behaviour.</p> -<p>The variables that control the "execution" behaviour start with <code class="docutils literal notranslate"><span class="pre">_AIRFLOW</span></code> to distinguish them +<p>The variables that control the “execution” behaviour start with <code class="docutils literal notranslate"><span class="pre">_AIRFLOW</span></code> to distinguish them from the variables used to build the image starting with <code class="docutils literal notranslate"><span class="pre">AIRFLOW</span></code>.</p> <div class="section" id="allowing-arbitrary-user-to-run-the-container"> <span id="arbitrary-docker-user"></span><h2>Allowing arbitrary user to run the container<a class="headerlink" href="#allowing-arbitrary-user-to-run-the-container" title="Permalink to this headline">¶</a></h2> @@ -601,24 +601,24 @@ those formats (See <a class="reference external" href="https://docs.docker.com/e See <a class="reference external" href="https://docs.docker.com/compose/compose-file/compose-file-v3/#domainname-hostname-ipc-mac_address-privileged-read_only-shm_size-stdin_open-tty-user-working_dir">Docker compose 
reference</a> for details. In our Quickstart Guide using Docker-Compose, the UID can be passed via the <code class="docutils literal notranslate"><span class="pre">AIRFLOW_UID</span></code> variable as described in -<a class="reference external" href="/docs/apache-airflow/stable/start/docker.html#initializing-docker-compose-environment" title="(in apache-airflow v2.3.0.dev0)"><span class="xref std std-ref">Initializing docker compose environment</span></a>.</p> +<a class="reference external" href="/docs/apache-airflow/stable/start/docker.html#initializing-docker-compose-environment" title="(in apache-airflow v2.4.0.dev0)"><span class="xref std std-ref">Initializing docker compose environment</span></a>.</p> <p>The user can be any UID. In case the UID is different from the default <code class="docutils literal notranslate"><span class="pre">airflow</span></code> (UID=50000), the user will be automatically created when entering the container.</p> <p>In order to accommodate a number of external libraries and projects, Airflow will automatically create -such an arbitrary user in (<cite>/etc/passwd</cite>) and make its home directory point to <code class="docutils literal notranslate"><span class="pre">/home/airflow</span></code>. +such an arbitrary user in (<cite>/etc/passwd</cite>) and make its home directory point to <code class="docutils literal notranslate"><span class="pre">/home/airflow</span></code>.
Many 3rd-party libraries and packages require the home directory of the user to be present, because they need to write some cache information there, so such a dynamic creation of a user is necessary.</p> <p>Such an arbitrary user has to be able to write to certain directories that need write access, and since -it is not advised to allow write access to "other" for security reasons, the OpenShift +it is not advised to allow write access to “other” for security reasons, the OpenShift guidelines introduced the concept of making all such folders have the <code class="docutils literal notranslate"><span class="pre">0</span></code> (<code class="docutils literal notranslate"><span class="pre">root</span></code>) group id (GID). All the directories that need write access in the Airflow production image have GID set to 0 (and they are writable for the group). We are following that concept and all the directories that need write access follow that.</p> <p>The GID=0 is set as default for the <code class="docutils literal notranslate"><span class="pre">airflow</span></code> user, so any directories it creates have GID set to 0 by default. The entrypoint sets <code class="docutils literal notranslate"><span class="pre">umask</span></code> to be <code class="docutils literal notranslate"><span class="pre">0002</span></code> - this means that any directories created by -the user have also "group write" access for group <code class="docutils literal notranslate"><span class="pre">0</span></code> - they will be writable by other users with -<code class="docutils literal notranslate"><span class="pre">root</span></code> group.
Also whenever any "arbitrary" user creates a folder (for example in a mounted volume), that -folder will have a "group write" access and <code class="docutils literal notranslate"><span class="pre">GID=0</span></code>, so that execution with another, arbitrary user +the user have also “group write” access for group <code class="docutils literal notranslate"><span class="pre">0</span></code> - they will be writable by other users with +<code class="docutils literal notranslate"><span class="pre">root</span></code> group. Also whenever any “arbitrary” user creates a folder (for example in a mounted volume), that +folder will have a “group write” access and <code class="docutils literal notranslate"><span class="pre">GID=0</span></code>, so that execution with another, arbitrary user will still continue to work, even if such directory is mounted by another arbitrary user later.</p> <p>The <code class="docutils literal notranslate"><span class="pre">umask</span></code> setting however only works for runtime of the container - it is not used during building of the image. If you would like to extend the image and add your own packages, you should remember to add @@ -631,7 +631,7 @@ that need group access will also be writable for the group. 
This can be done for </pre></div> </div> </div></blockquote> -<p>You can read more about it in the "Support arbitrary user ids" chapter in the +<p>You can read more about it in the “Support arbitrary user ids” chapter in the <a class="reference external" href="https://docs.openshift.com/container-platform/4.7/openshift_images/create-images.html#images-create-guide-openshift_create-images">Openshift best practices</a>.</p> </div> <div class="section" id="waits-for-airflow-db-connection"> @@ -662,7 +662,7 @@ To disable check, set <code class="docutils literal notranslate"><span class="pr </div> <div class="section" id="executing-commands"> <span id="entrypoint-commands"></span><h2>Executing commands<a class="headerlink" href="#executing-commands" title="Permalink to this headline">¶</a></h2> -<p>If the first argument equals "bash" - you are dropped into a bash shell, or you can execute a bash command +<p>If the first argument equals “bash” - you are dropped into a bash shell, or you can execute a bash command if you specify extra arguments. For example:</p> <div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>docker run -it apache/airflow:2.3.0-python3.6 bash -c <span class="s2">"ls -la"</span> total <span class="m">16</span> @@ -678,12 +678,12 @@ you pass extra parameters. For example:</p> <span class="nb">test</span> </pre></div> </div> -<p>If the first argument equals "airflow" - the rest of the arguments are treated as an airflow command +<p>If the first argument equals “airflow” - the rest of the arguments are treated as an airflow command to execute.
Example:</p> <div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>docker run -it apache/airflow:2.3.0-python3.6 airflow webserver </pre></div> </div> -<p>If there are any other arguments - they are simply passed to the "airflow" command.</p> +<p>If there are any other arguments - they are simply passed to the “airflow” command.</p> <div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>> docker run -it apache/airflow:2.3.0-python3.6 <span class="nb">help</span> usage: airflow <span class="o">[</span>-h<span class="o">]</span> GROUP_OR_COMMAND ... @@ -724,10 +724,10 @@ to execute. Example:</p> </div> <div class="section" id="execute-custom-code-before-the-airflow-entrypoint"> <h2>Execute custom code before the Airflow entrypoint<a class="headerlink" href="#execute-custom-code-before-the-airflow-entrypoint" title="Permalink to this headline">¶</a></h2> -<p>If you want to execute some custom code before Airflow's entrypoint you can do so by using -a custom script and calling Airflow's entrypoint as the +<p>If you want to execute some custom code before Airflow’s entrypoint you can do so by using +a custom script and calling Airflow’s entrypoint as the last <code class="docutils literal notranslate"><span class="pre">exec</span></code> instruction in your custom one.
However, you have to remember to use <code class="docutils literal notranslate"><span class="pre">dumb-init</span></code> in the same -way as it is used with Airflow's entrypoint, otherwise you might have problems with proper signal +way as it is used with Airflow’s entrypoint, otherwise you might have problems with proper signal propagation (See the next chapter).</p> <div class="highlight-Dockerfile notranslate"><div class="highlight"><pre><span></span><span class="k">FROM</span><span class="w"> </span><span class="s">airflow:2.3.0</span> <span class="k">COPY</span><span class="w"> </span>my_entrypoint.sh / @@ -743,13 +743,13 @@ execution (A bit useless example but should give the reader an example of how yo <span class="nb">exec</span> /entrypoint <span class="s2">"</span><span class="si">${</span><span class="p">@</span><span class="si">}</span><span class="s2">"</span> </pre></div> </div> -<p>Make sure Airflow's entrypoint is run with <code class="docutils literal notranslate"><span class="pre">exec</span> <span class="pre">/entrypoint</span> <span class="pre">"${@}"</span></code> as the last command in your +<p>Make sure Airflow’s entrypoint is run with <code class="docutils literal notranslate"><span class="pre">exec</span> <span class="pre">/entrypoint</span> <span class="pre">"${@}"</span></code> as the last command in your custom entrypoint. This way signals will be properly propagated and arguments will be passed to the entrypoint as usual (you can use <code class="docutils literal notranslate"><span class="pre">shift</span></code> as above if you need to pass some extra arguments).
Note that passing secret values this way or storing secrets inside the image is a bad idea from a security point of view - as both the image and the parameters to run the image with are accessible to anyone who has access to logs of your Kubernetes or image registry.</p> -<p>Also be aware that code executed before Airflow's entrypoint should not create any files or +<p>Also be aware that code executed before Airflow’s entrypoint should not create any files or directories inside the container and everything might not work the same way when it is executed. Before the Airflow entrypoint is executed, the following functionalities are not available:</p> <ul class="simple"> @@ -779,7 +779,7 @@ docker run -it my-image:0.0.1 bash -c <span class="s2">"/my_after_entrypoin </div> <div class="section" id="signal-propagation"> <h2>Signal propagation<a class="headerlink" href="#signal-propagation" title="Permalink to this headline">¶</a></h2> -<p>Airflow uses <code class="docutils literal notranslate"><span class="pre">dumb-init</span></code> to run as "init" in the entrypoint. This is in order to propagate +<p>Airflow uses <code class="docutils literal notranslate"><span class="pre">dumb-init</span></code> to run as “init” in the entrypoint. This is in order to propagate signals and reap child processes properly. This means that the process that you run does not have to install signal handlers to work properly and be killed when the container is gracefully terminated. The behaviour of signal propagation is configured by the <code class="docutils literal notranslate"><span class="pre">DUMB_INIT_SETSID</span></code> variable which is set to
For example, in the case of Celery, the main -process will put the worker in "offline" mode, and will wait +process will put the worker in “offline” mode, and will wait until all running tasks complete, and only then it will terminate all processes.</p> -<p>For Airflow's Celery worker, you should set the variable to 0 +<p>For Airflow’s Celery worker, you should set the variable to 0 and use the <code class="docutils literal notranslate"><span class="pre">["celery",</span> <span class="pre">"worker"]</span></code> command. If you are running it through the <code class="docutils literal notranslate"><span class="pre">["bash",</span> <span class="pre">"-c"]</span></code> command, you need to start the worker via <code class="docutils literal notranslate"><span class="pre">exec</span> <span class="pre">airflow</span> <span class="pre">celery</span> <span class="pre">worker</span></code> @@ -848,7 +848,7 @@ comes to concurrency.</p> production, it is only useful if you would like to run a quick test with the production image.
You need to pass at least a password to create such a user via <code class="docutils literal notranslate"><span class="pre">_AIRFLOW_WWW_USER_PASSWORD</span></code> or <span class="target" id="index-6"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">_AIRFLOW_WWW_USER_PASSWORD_CMD</span></code>. Similarly to other <code class="docutils literal notranslate"><span class="pre">*_CMD</span></code> variables, the content of -the <code class="docutils literal notranslate"><span class="pre">*_CMD</span></code> will be evaluated as a shell command and its output will be set as the password.</p> +the <code class="docutils literal notranslate"><span class="pre">*_CMD</span></code> will be evaluated as a shell command and its output will be set as the password.</p> <p>User creation will fail if none of the <code class="docutils literal notranslate"><span class="pre">PASSWORD</span></code> variables are set - there is no default for the password for security reasons.</p> <table class="docutils align-default"> diff --git a/docs-archive/docker-stack/index.html b/docs-archive/docker-stack/index.html index b511e2afa..1a3950525 100644 --- a/docs-archive/docker-stack/index.html +++ b/docs-archive/docker-stack/index.html @@ -564,11 +564,11 @@ for all the supported Python versions.</p> <li><p><code class="code docutils literal notranslate"><span class="pre">apache/airflow:2.3.0</span></code> - the versioned Airflow image with the default Python version (3.7 currently)</p></li> <li><p><code class="code docutils literal notranslate"><span class="pre">apache/airflow:2.3.0-pythonX.Y</span></code> - the versioned Airflow image with a specific Python version</p></li> </ul> -<p>Those are "reference" images. They contain the most common set of extras, dependencies and providers that are -often used by the users and they are good to "try-things-out" when you want to just take Airflow for a spin.</p> +<p>Those are “reference” images.
They contain the most common set of extras, dependencies and providers that are +often used by the users and they are good to “try-things-out” when you want to just take Airflow for a spin.</p> <p>The Apache Airflow image provided as a convenience package is optimized for size, and it provides just a bare minimal set of the extras and dependencies installed, and in most cases -you want to either extend or customize the image. You can see all possible extras in <a class="reference external" href="/docs/apache-airflow/stable/extra-packages-ref.html" title="(in apache-airflow v2.3.0.dev0)"><span>Reference for package extras</span></a>. +you want to either extend or customize the image. You can see all possible extras in <a class="reference external" href="/docs/apache-airflow/stable/extra-packages-ref.html" title="(in apache-airflow v2.4.0.dev0)"><span>Reference for package extras</span></a>. The set of extras used in the Airflow Production image is available in the <a class="reference external" href="https://github.com/apache/airflow/blob/2c6c7fdb2308de98e142618836bdf414df9768c8/Dockerfile#L37">Dockerfile</a>.</p> <p>However, Airflow has more than 60 community-managed providers (installable via extras) and some of the @@ -582,14 +582,14 @@ for details.</p> </div> <div class="section" id="usage"> <h1>Usage<a class="headerlink" href="#usage" title="Permalink to this headline">¶</a></h1> -<p>The <span class="target" id="index-0"></span><a class="reference external"
href="/docs/apache-airflow/stable/cli-and-env-variables-ref.html#envvar-AIRFLOW_HOME" title="(in apache-airflow v2.4.0.dev0)"><code class="xref std std-envvar docutils literal notranslate"><span class="pre">AIRFLOW_HOME</span></code></a> is set by default to <code class="docutils literal notranslate"><span class="pre">/opt/airflow/</span></code> - this means that DAGs are by default in the <code class="docutils literal notranslate"><span class="pre">/opt/airflow/dags</span></code> folder and logs are in the <code class="docutils literal notranslate"><span class="pre">/opt/airflow/logs</span></code> folder.</p> <p>The working directory is <code class="docutils literal notranslate"><span class="pre">/opt/airflow</span></code> by default.</p> <p>If no <span class="target" id="index-1"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">AIRFLOW__DATABASE__SQL_ALCHEMY_CONN</span></code> variable is set then an SQLite database is created in <code class="docutils literal notranslate"><span class="pre">${AIRFLOW_HOME}/airflow.db</span></code>.</p> <p>For example commands that start Airflow, see: <a class="reference internal" href="entrypoint.html#entrypoint-commands"><span class="std std-ref">Executing commands</span></a>.</p> <p>Airflow requires many components to function as it is a distributed application.
You may therefore also be interested -in launching Airflow in the Docker Compose environment, see: <a class="reference external" href="/docs/apache-airflow/stable/start/index.html" title="(in apache-airflow v2.3.0.dev0)"><span>Quick Start</span></a>.</p> +in launching Airflow in the Docker Compose environment, see: <a class="reference external" href="/docs/apache-airflow/stable/start/index.html" title="(in apache-airflow v2.4.0.dev0)"><span>Quick Start</span></a>.</p> <p>You can use this image in <a class="reference external" href="/docs/helm-chart/stable/index.html" title="(in helm-chart v1.6.0-dev)"><span class="xref std std-doc">Helm Chart</span></a> as well.</p> </div> diff --git a/docs-archive/docker-stack/searchindex.js b/docs-archive/docker-stack/searchindex.js index e8ea07905..443c3e59e 100644 --- a/docs-archive/docker-stack/searchindex.js +++ b/docs-archive/docker-stack/searchindex.js @@ -1 +1 @@ -Search.setIndex({docnames:["build","build-arg-ref","changelog","entrypoint","index","recipes"],envversion:{"sphinx.domains.c":2,"sphinx.domains.changeset":1,"sphinx.domains.citation":1,"sphinx.domains.cpp":5,"sphinx.domains.index":1,"sphinx.domains.javascript":2,"sphinx.domains.math":2,"sphinx.domains.python":3,"sphinx.domains.rst":2,"sphinx.domains.std":2,"sphinx.ext.intersphinx":1,"sphinx.ext.viewcode":1,sphinx:56},filenames:["build.rst","build-arg-ref.rst","changelog.rst","entrypoint. [...] \ No newline at end of file +Search.setIndex({docnames:["build","build-arg-ref","changelog","entrypoint","index","recipes"],envversion:{"sphinx.domains.c":2,"sphinx.domains.changeset":1,"sphinx.domains.citation":1,"sphinx.domains.cpp":5,"sphinx.domains.index":1,"sphinx.domains.javascript":2,"sphinx.domains.math":2,"sphinx.domains.python":3,"sphinx.domains.rst":2,"sphinx.domains.std":2,"sphinx.ext.intersphinx":1,"sphinx.ext.viewcode":1,sphinx:56},filenames:["build.rst","build-arg-ref.rst","changelog.rst","entrypoint. [...] \ No newline at end of file