[jira] [Created] (AIRFLOW-6748) Tasks fail to run when alternate AIRFLOW_HOME is defined.
Evan Hlavaty created AIRFLOW-6748:
-

Summary: Tasks fail to run when alternate AIRFLOW_HOME is defined.
Key: AIRFLOW-6748
URL: https://issues.apache.org/jira/browse/AIRFLOW-6748
Project: Apache Airflow
Issue Type: Bug
Components: configuration, DagRun
Affects Versions: 1.10.7
Environment: ubuntu 16.04
Reporter: Evan Hlavaty

When an alternate AIRFLOW_HOME is defined via an environment variable and a DAG is triggered, either on a schedule or manually, the tasks fail to execute. This occurs because the alternate AIRFLOW_HOME is ignored and a default airflow.cfg is generated at execution time in ~/airflow, which contains incorrect settings and causes the DAG to fail.

*My setup:*
DaskExecutor
Postgres metadata DB
S3 remote logging

*What I see in my dask worker log:*
SequentialExecutor being used
Failed to write logs to SQLite

*Workaround:*
Copy the alternate $AIRFLOW_HOME/airflow.cfg to ~/airflow/airflow.cfg. This ensures a default airflow.cfg with incorrect settings is not generated in ~/airflow, and my DAG then runs successfully.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
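The workaround above can be sketched as a small shell snippet. The paths here are illustrative stand-ins: scratch directories replace the real alternate $AIRFLOW_HOME and ~/airflow so the sketch runs anywhere, and the [core] contents are a placeholder, not a full config.

```shell
#!/bin/sh
# Scratch directory standing in for the alternate AIRFLOW_HOME.
AIRFLOW_HOME="$(mktemp -d)"
export AIRFLOW_HOME
printf '[core]\nexecutor = DaskExecutor\n' > "$AIRFLOW_HOME/airflow.cfg"

# Scratch directory standing in for the default ~/airflow location.
DEFAULT_HOME="$(mktemp -d)"

# The workaround from the report: copy the real config over the default
# location so a freshly generated default airflow.cfg cannot shadow it.
cp "$AIRFLOW_HOME/airflow.cfg" "$DEFAULT_HOME/airflow.cfg"
```

In a real deployment the copy would target ~/airflow/airflow.cfg directly; the scratch directories are only there to keep the sketch self-contained.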
[jira] [Closed] (AIRFLOW-6633) S3 Logging Configurations Are Ignored For Local
[ https://issues.apache.org/jira/browse/AIRFLOW-6633?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Evan Hlavaty closed AIRFLOW-6633.
-
Resolution: Not A Problem

Finally found logs in the dask worker and found out the S3 URL needed https:// prepended to it.

> S3 Logging Configurations Are Ignored For Local
> ---
>
> Key: AIRFLOW-6633
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6633
> Project: Apache Airflow
> Issue Type: Bug
> Components: logging
> Affects Versions: 1.10.7
> Environment: Ubuntu 16.04 LTS
> Reporter: Evan Hlavaty
> Priority: Major
>
> When using the following config settings and an S3 connection ID created in
> the Admin UI, local logs are still being used and no logs are uploaded to S3.
> No errors are ever thrown to indicate whether the connection settings are
> working.
>
> [core]
> # Airflow can store logs remotely in AWS S3. Users must supply a remote
> # location URL (starting with either 's3://...') and an Airflow connection
> # id that provides access to the storage location.
> remote_logging = True
> remote_base_log_folder = s3://my-bucket/path/to/logs
> remote_log_conn_id = MyS3Conn
> encrypt_s3_logs = False
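The fix noted in the resolution, prepending https:// to an endpoint URL that lacks a scheme, can be illustrated with a small helper. This is a sketch of the idea only: `normalize_endpoint` is a hypothetical name, not an Airflow or boto API.

```python
def normalize_endpoint(url: str) -> str:
    """Prepend https:// when the URL has no scheme (the fix from the resolution)."""
    if "://" not in url:
        return "https://" + url
    return url

# A bare host passes through with a scheme attached; URLs that already
# carry a scheme are returned unchanged.
print(normalize_endpoint("s3.us-east-1.amazonaws.com"))
print(normalize_endpoint("https://s3.us-east-1.amazonaws.com"))
```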
[jira] [Updated] (AIRFLOW-6633) S3 Logging Configurations Are Ignored For Local
[ https://issues.apache.org/jira/browse/AIRFLOW-6633?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Evan Hlavaty updated AIRFLOW-6633:
--
Description:
When using the following config settings and an S3 connection ID created in the Admin UI, local logs are still being used and no logs are uploaded to S3. No errors are ever thrown to indicate whether the connection settings are working.

[core]
# Airflow can store logs remotely in AWS S3. Users must supply a remote
# location URL (starting with either 's3://...') and an Airflow connection
# id that provides access to the storage location.
remote_logging = True
remote_base_log_folder = s3://my-bucket/path/to/logs
remote_log_conn_id = MyS3Conn
encrypt_s3_logs = False

was:
When using the following config settings and an S3 connection ID created in the Admin UI, local logs are still being used and no logs are uploaded to S3. No errors are ever thrown to indicate whether the connection settings are working.

[core]
# Airflow can store logs remotely in AWS S3. Users must supply a remote
# location URL (starting with either 's3://...') and an Airflow connection
# id that provides access to the storage location.
remote_logging = True
remote_base_log_folder = s3://my-bucket/path/to/logs
remote_log_conn_id = MyS3Conn
# Use server-side encryption for logs stored in S3
encrypt_s3_logs = False

> S3 Logging Configurations Are Ignored For Local
> ---
>
> Key: AIRFLOW-6633
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6633
> Project: Apache Airflow
> Issue Type: Bug
> Components: logging
> Affects Versions: 1.10.7
> Environment: Ubuntu 16.04 LTS
> Reporter: Evan Hlavaty
> Priority: Major
>
> When using the following config settings and an S3 connection ID created in
> the Admin UI, local logs are still being used and no logs are uploaded to S3.
> No errors are ever thrown to indicate whether the connection settings are
> working.
>
> [core]
> # Airflow can store logs remotely in AWS S3. Users must supply a remote
> # location URL (starting with either 's3://...') and an Airflow connection
> # id that provides access to the storage location.
> remote_logging = True
> remote_base_log_folder = s3://my-bucket/path/to/logs
> remote_log_conn_id = MyS3Conn
> encrypt_s3_logs = False
[jira] [Created] (AIRFLOW-6633) S3 Logging Configurations Are Ignored For Local
Evan Hlavaty created AIRFLOW-6633:
-

Summary: S3 Logging Configurations Are Ignored For Local
Key: AIRFLOW-6633
URL: https://issues.apache.org/jira/browse/AIRFLOW-6633
Project: Apache Airflow
Issue Type: Bug
Components: logging
Affects Versions: 1.10.7
Environment: Ubuntu 16.04 LTS
Reporter: Evan Hlavaty

When using the following config settings and an S3 connection ID created in the Admin UI, local logs are still being used and no logs are uploaded to S3. No errors are ever thrown to indicate whether the connection settings are working.

[core]
# Airflow can store logs remotely in AWS S3. Users must supply a remote
# location URL (starting with either 's3://...') and an Airflow connection
# id that provides access to the storage location.
remote_logging = True
remote_base_log_folder = s3://my-bucket/path/to/logs
remote_log_conn_id = MyS3Conn
# Use server-side encryption for logs stored in S3
encrypt_s3_logs = False
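Airflow reads airflow.cfg with Python's configparser (via a subclass), so one quick sanity check is to confirm the [core] values above parse the way the config reader will see them. The snippet below uses the exact settings from the report; the check only validates parsing, not that remote logging actually works.

```python
import configparser

# The [core] snippet from the report, verbatim.
cfg = configparser.ConfigParser()
cfg.read_string("""
[core]
remote_logging = True
remote_base_log_folder = s3://my-bucket/path/to/logs
remote_log_conn_id = MyS3Conn
encrypt_s3_logs = False
""")

# remote_logging must parse as a boolean True, not the string "True".
print(cfg.getboolean("core", "remote_logging"))   # True
print(cfg.get("core", "remote_base_log_folder"))  # s3://my-bucket/path/to/logs
```

A typo such as `remote_logging = Ture` would raise ValueError from getboolean here, which is cheaper to catch than a silent fallback to local logging.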