Hey,
My airflow_logging_settings.py is attached (it's the same as the
airflow_local_settings.py you mention - I think I saw both an
'airflow_local_settings.py' and an 'airflow_logging_settings.py' referenced
in two separate places, although it's pretty much the same thing, just
slightly different filenames. I think one is mentioned in the updating.md
file and the other is what the file is actually called in the github repo.
Am guessing it does not matter so long as I reference the correct file in
airflow.cfg).
I had not renamed DEFAULT_LOGGING_CONFIG to LOGGING_CONFIG in my
airflow_logging_settings.py. I've done that now and uncommented the
logging_config_class line in my airflow.cfg.
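So the dict at the bottom of the attached file is now declared as:

LOGGING_CONFIG = {
    ...
}

rather than DEFAULT_LOGGING_CONFIG. But I still got the error below: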
andrew_maguire@airflow-server:~$ airflow list_dags
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 16, in <module>
    from airflow import configuration
  File "/usr/local/lib/python2.7/dist-packages/airflow/__init__.py", line 31, in <module>
    from airflow import settings
  File "/usr/local/lib/python2.7/dist-packages/airflow/settings.py", line 148, in <module>
    configure_logging()
  File "/usr/local/lib/python2.7/dist-packages/airflow/logging_config.py", line 47, in configure_logging
    'Unable to load custom logging from {}'.format(logging_class_path)
ImportError: Unable to load custom logging from plugins.airflow_logging_settings.LOGGING_CONFIG
So I don't think it even got as far as actually loading my config.
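
As a sanity check (assuming the airflow CLI resolves the dotted path
against more or less the same sys.path as a plain python session started
from ~/airflow - that's my assumption, not something I've verified), a
direct import like this should show whether the module itself is loadable,
independent of airflow:

import importlib

# run from ~/airflow; 'plugins' needs to be importable from here
module = importlib.import_module('plugins.airflow_logging_settings')
print(hasattr(module, 'LOGGING_CONFIG'))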
Happy to help debug or test or be of any use.
Cheers,
Andy
On Tue, Oct 17, 2017 at 11:43 AM Driesprong, Fokko <[email protected]>
wrote:
> Hi Andy,
>
> Thanks for reaching out. We are debugging the new logging, and input from
> the community is highly appreciated.
>
> If you are using Python 2, you'll need to put an empty __init__.py in each
> of the directories, e.g. ~/airflow/plugins/__init__.py; it needs to be empty.
> Could you share your airflow_local_settings.py? If there are any GCS
> credentials, please remove them. Please also check that you've renamed
> the DEFAULT_LOGGING_CONFIG variable to LOGGING_CONFIG; this might not be
> evident from the updating.md.
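>
> For reference, a sketch of the layout I'd expect (assuming AIRFLOW_HOME is
> ~/airflow and the dotted path plugins.airflow_logging_settings.LOGGING_CONFIG):
>
> ~/airflow/
>     airflow.cfg
>     plugins/
>         __init__.py                  # empty; makes 'plugins' a package on Python 2
>         airflow_logging_settings.py  # must define LOGGING_CONFIG = {...}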
>
> Cheers, Fokko
>
> 2017-10-17 12:00 GMT+02:00 Andrew Maguire <[email protected]>:
>
> > Hi,
> >
> > I've updated to 1.9 but am having trouble setting the
> logging_config_class
> > class path in the airflow.cfg file.
> >
> > Currently I have the below files in {AIRFLOW_HOME}/plugins:
> >
> > [image: screenshot of the files in {AIRFLOW_HOME}/plugins]
> >
> > where airflow_logging_settings.py is just a copy of this file:
> > https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py
> > but with the GCS stuff uncommented and a line added for GCS_LOG_FOLDER to
> > be pulled from airflow.cfg just like BASE_LOG_FOLDER.
> >
> > Then in my {AIRFLOW_HOME}/airflow.cfg file I have the following lines to
> > set up the log stuff:
> >
> > # The folder where airflow should store its log files
> > # This path must be absolute
> > base_log_folder = {AIRFLOW_HOME}/logs
> >
> > gcs_log_folder = gs://pmc-airflow/logs
> >
> > # Airflow can store logs remotely in AWS S3 or Google Cloud Storage. Users
> > # must supply an Airflow connection id that provides access to the storage
> > # location.
> > remote_log_conn_id = my_gcp_connection
> > encrypt_s3_logs = False
> >
> > # Logging level
> > logging_level = INFO
> >
> > # Logging class
> > # Specify the class that will specify the logging configuration
> > # This class has to be on the python classpath
> > logging_config_class = plugins.airflow_logging_settings.LOGGING_CONFIG
> >
> > # Log format
> > log_format = [%%(asctime)s] {{%%(filename)s:%%(lineno)d}} %%(levelname)s - %%(message)s
> > simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s
> >
> >
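> > For what it's worth, my reading of the traceback is that airflow resolves
> > that value roughly like this (a simplified sketch of the mechanism, not
> > the exact airflow code):
> >
> > from importlib import import_module
> >
> > dotted_path = 'plugins.airflow_logging_settings.LOGGING_CONFIG'
> > # split on the last dot into a module path and an attribute name
> > module_path, attr_name = dotted_path.rsplit('.', 1)
> > LOGGING_CONFIG = getattr(import_module(module_path), attr_name)
> >
> > so the module part presumably has to be importable from somewhere on
> > sys.path.
> >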
> > However, if I run "airflow list_dags" I now get:
> >
> > andrew_maguire@airflow-server:~/airflow$ airflow list_dags
> > Traceback (most recent call last):
> >   File "/usr/local/bin/airflow", line 16, in <module>
> >     from airflow import configuration
> >   File "/usr/local/lib/python2.7/dist-packages/airflow/__init__.py", line 31, in <module>
> >     from airflow import settings
> >   File "/usr/local/lib/python2.7/dist-packages/airflow/settings.py", line 148, in <module>
> >     configure_logging()
> >   File "/usr/local/lib/python2.7/dist-packages/airflow/logging_config.py", line 47, in configure_logging
> >     'Unable to load custom logging from {}'.format(logging_class_path)
> > ImportError: Unable to load custom logging from plugins.airflow_logging_settings.LOGGING_CONFIG
> >
> > If I go back into my airflow.cfg and comment out the line:
> >
> > logging_config_class = plugins.airflow_logging_settings.LOGGING_CONFIG
> >
> > Things work again, but then I'm only doing local logging.
> >
> > So I'm sure I'm doing something wrong in how I'm setting that line in the
> > airflow.cfg file.
> >
> > So what I did was:
> >
> > 1. Create a folder {AIRFLOW_HOME}/plugins - I did this as updating.md
> > (https://github.com/apache/incubator-airflow/blob/master/UPDATING.md)
> > mentions that "The logging configuration file that contains the
> > configuration needs to be on the PYTHONPATH, for example in ~/airflow/dags
> > or ~/airflow/plugins. These directories are loaded by default".
> >
> > 2. Copy this file
> > (https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py)
> > into {AIRFLOW_HOME}/plugins with the GCS changes I mentioned (sketched
> > just after these steps).
> >
> > 3. Create an __init__.py file in {AIRFLOW_HOME}/plugins - do I need to
> > put anything in particular in here?
> >
> > 4. Update {AIRFLOW_HOME}/airflow.cfg as above.
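> >
> > For step 2, the GCS changes relative to the template are small - a sketch
> > of just the changed bits:
> >
> > # added near the top, mirroring how BASE_LOG_FOLDER is read
> > GCS_LOG_FOLDER = conf.get('core', 'GCS_LOG_FOLDER')
> >
> > # and in the 'handlers' dict, the gcs.task handler uncommented:
> > 'gcs.task': {
> >     'class': 'airflow.utils.log.gcs_task_handler.GCSTaskHandler',
> >     'formatter': 'airflow.task',
> >     'base_log_folder': os.path.expanduser(BASE_LOG_FOLDER),
> >     'gcs_log_folder': GCS_LOG_FOLDER,
> >     'filename_template': FILENAME_TEMPLATE,
> > },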
> >
> > Can someone help me figure out where I went wrong?
> >
> > I'm hesitant to change or add anything to the PYTHONPATH as I'm not 100%
> > sure what I'm doing, so I was hoping to just drop the logging config file
> > somewhere it would automatically be picked up. I'm also not really sure
> > about python packages and class paths etc., so I'm kinda feeling my way
> > through it, but not confident.
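> >
> > If it helps with debugging, one thing I can do is dump what python itself
> > sees on the path (no airflow involved, just to rule path issues in or out;
> > my assumption is the plugins parent dir needs to show up in this list for
> > the dotted path to resolve):
> >
> > import sys
> >
> > # one entry per line; run from the same place the airflow CLI runs
> > for p in sys.path:
> >     print(p)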
> >
> > Cheers
> >
> > Andy
> >
> >
> >
>
# -*- coding: utf-8 -*-
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from airflow import configuration as conf
# TODO: Logging format and level should be configured
# in this file instead of from airflow.cfg. Currently
# there are other log format and level configurations in
# settings.py and cli.py. Please see AIRFLOW-1455.
LOG_LEVEL = conf.get('core', 'LOGGING_LEVEL').upper()
LOG_FORMAT = conf.get('core', 'log_format')
BASE_LOG_FOLDER = conf.get('core', 'BASE_LOG_FOLDER')
GCS_LOG_FOLDER = conf.get('core', 'GCS_LOG_FOLDER')
FILENAME_TEMPLATE = '{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log'
LOGGING_CONFIG = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'airflow.task': {
            'format': LOG_FORMAT,
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'formatter': 'airflow.task',
            'stream': 'ext://sys.stdout'
        },
        'file.task': {
            'class': 'airflow.utils.log.file_task_handler.FileTaskHandler',
            'formatter': 'airflow.task',
            'base_log_folder': os.path.expanduser(BASE_LOG_FOLDER),
            'filename_template': FILENAME_TEMPLATE,
        },
        # When using s3 or gcs, provide a customized LOGGING_CONFIG
        # in airflow_local_settings within your PYTHONPATH, see UPDATING.md
        # for details
        # 's3.task': {
        #     'class': 'airflow.utils.log.s3_task_handler.S3TaskHandler',
        #     'formatter': 'airflow.task',
        #     'base_log_folder': os.path.expanduser(BASE_LOG_FOLDER),
        #     's3_log_folder': S3_LOG_FOLDER,
        #     'filename_template': FILENAME_TEMPLATE,
        # },
        'gcs.task': {
            'class': 'airflow.utils.log.gcs_task_handler.GCSTaskHandler',
            'formatter': 'airflow.task',
            'base_log_folder': os.path.expanduser(BASE_LOG_FOLDER),
            'gcs_log_folder': GCS_LOG_FOLDER,
            'filename_template': FILENAME_TEMPLATE,
        },
    },
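    # NOTE: both task loggers below still reference the local 'file.task'
    # handler; my (unverified) reading of UPDATING.md is that they would
    # need to point at 'gcs.task' for remote GCS logging to actually be used.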
    'loggers': {
        'airflow.task': {
            'handlers': ['file.task'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
        'airflow.task_runner': {
            'handlers': ['file.task'],
            'level': LOG_LEVEL,
            'propagate': True,
        },
        'airflow': {
            'handlers': ['console'],
            'level': LOG_LEVEL,
            'propagate': False,
        },
    }
}