Taragolis commented on code in PR #34964:
URL: https://github.com/apache/airflow/pull/34964#discussion_r1369857493


##########
docs/apache-airflow/administration-and-deployment/logging-monitoring/advanced-logging-configuration.rst:
##########
@@ -89,3 +94,63 @@ See :doc:`../modules_management` for details on how Python and Airflow manage mo
 .. note::
 
    You can override the way both standard logs of the components and "task" logs are handled.
+
+
+Custom logger for Operators, Hooks and Tasks
+--------------------------------------------
+
+You can create custom logging handlers and apply them to specific Operators, Hooks and tasks. By default, the Operators
+and Hooks loggers are child of the ``airflow.task`` logger: They follow respectively the naming convention
+``airflow.task.operators.<package>.<module_name>`` and ``airflow.task.hooks.<package>.<module_name>``. After
+:doc:`creating a custom logging class </administration-and-deployment/logging-monitoring/advanced-logging-configuration>`,
+you can assign specific loggers to them.
+
+Example of custom logging for the ``SQLExecuteQueryOperator`` and the ``HttpHook``:
+
+    .. code-block:: python
+
+      from copy import deepcopy
+      from pydantic.utils import deep_update

Review Comment:
   Seems like this import is unused.
   
   Also, why do we need `pydantic` here at all?
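   If a recursive merge is really wanted, a small stdlib-only helper would avoid the `pydantic` dependency entirely. A minimal sketch (the helper name and the sample config keys are illustrative, not Airflow API):

```python
from copy import deepcopy


def deep_update(base: dict, updates: dict) -> dict:
    """Recursively merge ``updates`` into ``base`` in place and return it."""
    for key, value in updates.items():
        if isinstance(value, dict) and isinstance(base.get(key), dict):
            deep_update(base[key], value)
        else:
            base[key] = value
    return base


# Illustrative stand-in for Airflow's DEFAULT_LOGGING_CONFIG.
defaults = {"loggers": {"airflow.task": {"level": "INFO", "handlers": ["task"]}}}

# Merge an override without mutating the defaults.
config = deep_update(
    deepcopy(defaults),
    {"loggers": {"airflow.task": {"level": "DEBUG"}}},
)
```

   Here `config` ends up with `level` overridden to `DEBUG` while the sibling `handlers` key and the original `defaults` are preserved.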



##########
docs/apache-airflow/administration-and-deployment/logging-monitoring/advanced-logging-configuration.rst:
##########
@@ -89,3 +94,63 @@ See :doc:`../modules_management` for details on how Python and Airflow manage mo
 .. note::
 
    You can override the way both standard logs of the components and "task" logs are handled.
+
+
+Custom logger for Operators, Hooks and Tasks
+--------------------------------------------
+
+You can create custom logging handlers and apply them to specific Operators, Hooks and tasks. By default, the Operators
+and Hooks loggers are child of the ``airflow.task`` logger: They follow respectively the naming convention
+``airflow.task.operators.<package>.<module_name>`` and ``airflow.task.hooks.<package>.<module_name>``. After
+:doc:`creating a custom logging class </administration-and-deployment/logging-monitoring/advanced-logging-configuration>`,
+you can assign specific loggers to them.
+
+Example of custom logging for the ``SQLExecuteQueryOperator`` and the ``HttpHook``:
+
+    .. code-block:: python
+
+      from copy import deepcopy
+      from pydantic.utils import deep_update
+      from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG
+
+      LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
+      LOGGING_CONFIG.deep_update(

Review Comment:
   This would raise
   
   ```console
   AttributeError: 'dict' object has no attribute 'deep_update'
   ```
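   Since a plain ``dict`` has no ``deep_update`` method, the doc snippet would need either a merge helper or explicit nested assignment. A minimal sketch of the latter (the stand-in config, logger name, and handler are illustrative, not the real `DEFAULT_LOGGING_CONFIG`):

```python
from copy import deepcopy

# Illustrative stand-in for
# airflow.config_templates.airflow_local_settings.DEFAULT_LOGGING_CONFIG.
DEFAULT_LOGGING_CONFIG = {
    "handlers": {"console": {"class": "logging.StreamHandler"}},
    "loggers": {"airflow.task": {"handlers": ["console"], "level": "INFO"}},
}

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

# A plain dict has no .deep_update(); set the nested keys explicitly instead.
LOGGING_CONFIG["loggers"]["airflow.task.hooks.airflow.providers.http.hooks.http"] = {
    "handlers": ["console"],
    "level": "DEBUG",
    "propagate": True,
}
```

   Because the copy is deep, the added logger does not leak back into the defaults.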



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
