potiuk commented on a change in pull request #18282:
URL: https://github.com/apache/airflow/pull/18282#discussion_r710157759



##########
File path: docs/apache-airflow/installation/index.rst
##########
@@ -0,0 +1,335 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Installation
+------------
+
+.. contents:: :local:
+
+.. toctree::
+    :maxdepth: 1
+    :caption: Installation
+    :hidden:
+
+    Prerequisites <prerequisites>
+    Dependencies <dependencies>
+    Supported versions <supported-versions>
+    Installing from sources <installing-from-sources>
+    Installing from PyPI <installing-from-pypi>
+    Setting up the database <setting-up-the-database>
+    Upgrading to Airflow 2 <upgrading-to-2>
+    Upgrade Check <upgrade-check>
+
+This page describes the installation options that you might consider when deciding how to install Airflow.
+Airflow consists of many components, often distributed among many physical or virtual machines, so installation of Airflow can be quite complex, depending on the options you choose.
+
+You should also check out the :doc:`Prerequisites <prerequisites>` that must be fulfilled when installing Airflow,
+as well as :doc:`Supported versions <supported-versions>` to learn about the policies for supporting
+Airflow, Python and Kubernetes.
+
+Airflow requires additional :doc:`Dependencies <dependencies>` to be installed, which can be done via extras and providers.
+
+When you install Airflow, you need to :doc:`set up the database <setting-up-the-database>`, which must be kept up to date when Airflow is upgraded.
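As a minimal sketch of what database setup can look like (the connection string, credentials and database names below are hypothetical examples, not recommendations):

```shell
# Airflow reads configuration from AIRFLOW__<SECTION>__<KEY> environment
# variables, so the metadata database connection can be set without
# editing airflow.cfg (values here are hypothetical examples):
export AIRFLOW__CORE__SQL_ALCHEMY_CONN="postgresql+psycopg2://airflow_user:airflow_pass@localhost:5432/airflow_db"
echo "${AIRFLOW__CORE__SQL_ALCHEMY_CONN}"
# Initialize the schema once, and run the migration command again after
# every Airflow upgrade to keep the schema up to date:
#   airflow db init
#   airflow db upgrade
```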
+
+If you came here to upgrade Airflow from version 1.10 (which has already reached end-of-life), follow
+:doc:`Upgrading to Airflow 2 <upgrading-to-2>`.
+
+Using released sources
+''''''''''''''''''''''
+
+More details: :doc:`installing-from-sources`
+
+**When this option works best**
+
+This option is best if you expect to build all your software from sources.
+Apache Airflow is one of the projects that belong to the `Apache Software Foundation <https://www.apache.org/>`__.
+It is a requirement for all ASF projects that they can be installed using official sources released via `Official Apache Mirrors <http://ws.apache.org/mirrors.cgi/>`__.
+You can easily `verify the integrity and provenance of the software <https://www.apache.org/dyn/closer.cgi#verify>`__.
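A sketch of the checksum part of the verification, demonstrated on a stand-in file so the steps are runnable anywhere; for a real release you would download the source tarball and its ``.sha512`` file from the official Apache mirrors (the file name below is a made-up example):

```shell
# Create a stand-in file that plays the role of a downloaded release:
echo "release contents" > apache-airflow-example-source.tar.gz
# Compute the checksum file (normally you download it from the mirrors):
sha512sum apache-airflow-example-source.tar.gz > apache-airflow-example-source.tar.gz.sha512
# Verify the download against the checksum file; prints "...: OK" on success:
sha512sum -c apache-airflow-example-source.tar.gz.sha512
# For PGP signatures you would additionally import the project KEYS file
# and verify the .asc signature, for example:
#   gpg --verify apache-airflow-<version>-source.tar.gz.asc
```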
+
+**Intended users**
+
+Users who are familiar with installing and building software from sources and who care about the integrity and provenance
+of the software they use, down to the lowest level possible.
+
+**What are you expected to handle**
+
+You are expected to build and install Airflow and its components on your own.
+You should develop and handle the deployment for all components of Airflow.
+
+You are responsible for setting up the database, creating and managing the database schema with ``airflow db`` commands,
+automated startup and recovery, and maintenance, cleanup and upgrades of Airflow and the Airflow Providers.
+
+**What Apache Airflow Community provides for that method**
+
+You have `instructions <https://github.com/apache/airflow/blob/main/INSTALL>`__ on how to build the software, but due to the variety
+of environments and tools you might want to use, you should expect problems specific to your deployment and environment
+that you will have to diagnose and solve.
+
+**Where to ask for help**
+
+The ``#development`` Slack channel for building the software.
+
+The ``#troubleshooting`` Slack channel for quick general troubleshooting questions, or
+`GitHub discussions <https://github.com/apache/airflow/discussions>`__ if you are looking for a longer discussion and have more information to share.
+
+If you can provide a description of a reproducible problem with the Airflow software, you can open an issue at `GitHub issues <https://github.com/apache/airflow/issues>`__.
+
+Using PyPI
+'''''''''''
+
+More details: :doc:`/installation/installing-from-pypi`
+
+**When this option works best**
+
+This installation method is useful when you are not familiar with containers and Docker, want to install
+Apache Airflow on physical or virtual machines, and are used to installing and running software using a custom
+deployment mechanism.
+
+The only officially supported mechanism of installation is via ``pip`` using the constraint mechanism. The constraint
+files are managed by Apache Airflow release managers to make sure that you can repeatably install Airflow from PyPI with all Providers and
+required dependencies.
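A sketch of how the constraint mechanism is typically used (the version numbers below are example assumptions; pick the Airflow version you want and the Python version you run):

```shell
# Example (assumed) versions; substitute your own:
AIRFLOW_VERSION="2.1.4"
PYTHON_VERSION="3.8"
# Constraint files are published per Airflow version and Python version:
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
echo "${CONSTRAINT_URL}"
# The install command itself would then be:
#   pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
```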
+
+You can also verify the integrity and provenance of the packages downloaded from PyPI as described on the installation
+page, but the software you download from PyPI is pre-built for you, so you can install it without building.
+
+**Intended users**
+
+Users who are familiar with installing and configuring Python applications, managing Python
+environments and dependencies, and running software with their custom deployment mechanisms.
+
+**What are you expected to handle**
+
+You are expected to install Airflow - all components of it - on your own.
+You should develop and handle the deployment for all components of Airflow.
+
+You are responsible for setting up the database, creating and managing the database schema with ``airflow db`` commands,
+automated startup and recovery, and maintenance, cleanup and upgrades of Airflow and the Airflow Providers.
+
+**What Apache Airflow Community provides for that method**
+
+You have `instructions <https://airflow.apache.org/docs/apache-airflow/stable/installation.html>`__
+on how to install the software, but due to the variety of environments and tools you might want to use, you should expect
+problems specific to your deployment and environment that you will have to diagnose and solve.
+
+**Where to ask for help**
+
+The ``#troubleshooting`` channel on Airflow Slack for quick general
+troubleshooting questions, or `GitHub discussions <https://github.com/apache/airflow/discussions>`__
+if you are looking for a longer discussion and have more information to share.
+
+If you can provide a description of a reproducible problem with the Airflow software, you can open
+an issue at `GitHub issues <https://github.com/apache/airflow/issues>`__.
+
+
+Using Production Docker Images
+''''''''''''''''''''''''''''''
+
+More details: :doc:`docker-stack:index`
+
+**When this option works best**
+
+This installation method is useful when you are familiar with the Container/Docker stack. It provides the capability of
+running Airflow components in isolation from other software running on the same physical or virtual machines, with easy
+maintenance of dependencies.
+
+The images are built by Apache Airflow release managers and they use officially released packages from PyPI and official constraint files
+- the same ones that are used for installing Airflow from PyPI.
+
+**Intended users**
+
+Users who are familiar with Containers and Docker stack and understand how to 
build their own container images.
+
+Users who understand how to install providers and dependencies from PyPI with 
constraints if they want to extend or customize the image.
+
+Users who know how to create deployments using Docker by linking together multiple Docker containers and maintaining such deployments.
+
+**What are you expected to handle**
+
+You are expected to be able to customize or extend Container/Docker images if you want to
+add extra dependencies. You are expected to put together a deployment composed of several containers
+(for example using docker-compose) and to make sure that they are linked together.
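Extending the image can be as simple as a Dockerfile that starts from the official image and installs extra dependencies. A hypothetical sketch (the image tag and provider package below are example assumptions):

```shell
# Write a minimal Dockerfile extending the official image with one
# extra provider package (tag and package name are assumed examples):
cat > Dockerfile <<'EOF'
FROM apache/airflow:2.1.4
RUN pip install --no-cache-dir apache-airflow-providers-amazon
EOF
cat Dockerfile
# You would then build and deploy your custom image, for example:
#   docker build -t my-company/airflow:2.1.4-custom .
```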
+
+You are responsible for setting up the database, creating and managing the database schema with ``airflow db`` commands,
+automated startup and recovery, and maintenance, cleanup and upgrades of Airflow and the Airflow Providers.
+
+With the Official Airflow Docker Images, upgrades of Airflow and the Airflow Providers which
+are part of the reference image are handled by the community - you need to make sure to pick up
+those changes when they are released, by upgrading the base image. However, you are responsible for creating a
+pipeline for building your own custom images with your own added dependencies and Providers, and you need to
+repeat the customization and image-building steps when a new version of the Airflow image is released.
+
+You might also want to upgrade your image with new Providers when they are released
+independently of the Airflow image releases.
+
+**What Apache Airflow Community provides for that method**
+
+You have instructions: :doc:`docker-stack:build` on how to build and customize 
your image.
+
+You also have a :doc:`quick start guide </start/index>` where you can see an example of the Quick Start docker-compose file which
+you can use to start Airflow quickly for local testing and development. However, this is just an inspiration.
+Do not expect this docker-compose file to be ready for production installation; you need to get familiar
+with docker-compose and its capabilities and build your own production-ready deployment.
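A hypothetical sketch of how the quick-start docker-compose file is typically fetched for local testing (the URL pattern and version below are assumptions; check the quick start guide for the exact location):

```shell
# Assumed example version; substitute the Airflow version you want:
AIRFLOW_VERSION="2.1.4"
# The example docker-compose file is published alongside the docs:
COMPOSE_URL="https://airflow.apache.org/docs/apache-airflow/${AIRFLOW_VERSION}/docker-compose.yaml"
echo "${COMPOSE_URL}"
# You would then download it and bring the stack up locally:
#   curl -LfO "${COMPOSE_URL}"
#   docker-compose up airflow-init
#   docker-compose up -d
```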
+
+Docker-compose is only one of the available options of deployments of 
containers. You can use

Review comment:
       Rephrased.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
