robinedwards opened a new issue #18118:
URL: https://github.com/apache/airflow/issues/18118


   ### Apache Airflow version
   
   2.1.3 (latest released)
   
   ### Operating System
   
   Debian GNU/Linux 10 (buster)
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow-providers-amazon @ file:///root/.cache/pypoetry/artifacts/7f/f7/23/fc7fd3543aa486275ef0385c29063ff0dc391b0fc95dc5aa6cab2cf4e5/apache_airflow_providers_amazon-2.2.0-py3-none-any.whl
   apache-airflow-providers-celery @ file:///root/.cache/pypoetry/artifacts/14/80/39/0d9d57205da1d24189ac9c18eb3477664ed2c2618c1467c9809b9a2fbf/apache_airflow_providers_celery-2.0.0-py3-none-any.whl
   apache-airflow-providers-ftp @ file:///root/.cache/pypoetry/artifacts/a5/13/da/bf14abc40193a1ee1b82bbd800e3ac230427d7684b9d40998ac3684bef/apache_airflow_providers_ftp-2.0.1-py3-none-any.whl
   apache-airflow-providers-http @ file:///root/.cache/pypoetry/artifacts/fc/d7/d2/73c89ef847bbae1704fa403d7e92dba1feead757aae141613980db40ff/apache_airflow_providers_http-2.0.0-py3-none-any.whl
   apache-airflow-providers-imap @ file:///root/.cache/pypoetry/artifacts/af/5d/de/21c10bfc7ac076a415dcc3fc909317547e77e38c005487552cf40ddd97/apache_airflow_providers_imap-2.0.1-py3-none-any.whl
   apache-airflow-providers-postgres @ file:///root/.cache/pypoetry/artifacts/50/27/e0/9b0d8f4c0abf59967bb87a04a93d73896d9a4558994185dd8bc43bb67f/apache_airflow_providers_postgres-2.2.0-py3-none-any.whl
   apache-airflow-providers-redis @ file:///root/.cache/pypoetry/artifacts/7d/95/03/5d2a65ace88ae9a9ce9134b927b1e9639c8680c13a31e58425deae55d1/apache_airflow_providers_redis-2.0.1-py3-none-any.whl
   apache-airflow-providers-sqlite @ file:///root/.cache/pypoetry/artifacts/ec/e6/a3/e0d81fef662ccf79609e7d2c4e4440839a464771fd2a002d252c9a401d/apache_airflow_providers_sqlite-2.0.1-py3-none-any.whl
   ```
   
   ### Deployment
   
   Other Docker-based deployment
   
   ### Deployment details
   
   We are using the Sentry integration
   
   ### What happened
   
   An exception raised inside LocalTaskJob's mini scheduler was handled incorrectly by the Sentry integration's `enrich_errors` method. The decorator assumes it is applied to a method whose first argument is a TaskInstance, but here it wraps a method of LocalTaskJob, so the error-handling path itself raises an AttributeError.
   
   ```
   TypeError: cannot pickle 'dict_keys' object
     File "airflow/sentry.py", line 166, in wrapper
       return func(task_instance, *args, **kwargs)
     File "airflow/jobs/local_task_job.py", line 241, in _run_mini_scheduler_on_child_tasks
       partial_dag = task.dag.partial_subset(
     File "airflow/models/dag.py", line 1487, in partial_subset
       dag.task_dict = {
     File "airflow/models/dag.py", line 1488, in <dictcomp>
       t.task_id: copy.deepcopy(t, {id(t.dag): dag})  # type: ignore
     File "copy.py", line 153, in deepcopy
       y = copier(memo)
     File "airflow/models/baseoperator.py", line 970, in __deepcopy__
       setattr(result, k, copy.deepcopy(v, memo))
     File "copy.py", line 161, in deepcopy
       rv = reductor(4)
   
   AttributeError: 'LocalTaskJob' object has no attribute 'task'
     File "airflow", line 8, in <module>
       sys.exit(main())
     File "airflow/__main__.py", line 40, in main
       args.func(args)
     File "airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "airflow/utils/cli.py", line 91, in wrapper
       return f(*args, **kwargs)
     File "airflow/cli/commands/task_command.py", line 238, in task_run
       _run_task_by_selected_method(args, dag, ti)
     File "airflow/cli/commands/task_command.py", line 64, in _run_task_by_selected_method
       _run_task_by_local_task_job(args, ti)
     File "airflow/cli/commands/task_command.py", line 121, in _run_task_by_local_task_job
       run_job.run()
     File "airflow/jobs/base_job.py", line 245, in run
       self._execute()
     File "airflow/jobs/local_task_job.py", line 128, in _execute
       self.handle_task_exit(return_code)
     File "airflow/jobs/local_task_job.py", line 166, in handle_task_exit
       self._run_mini_scheduler_on_child_tasks()
     File "airflow/utils/session.py", line 70, in wrapper
       return func(*args, session=session, **kwargs)
     File "airflow/sentry.py", line 168, in wrapper
       self.add_tagging(task_instance)
     File "airflow/sentry.py", line 119, in add_tagging
       task = task_instance.task
   ```
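
   The failure mode can be sketched with a minimal stand-in for the decorator. This is illustrative only, not the actual `airflow.sentry` code; the names mirror the traceback above.

```python
import functools

# Minimal stand-in for the Sentry integration's enrich_errors decorator
# (an assumption for illustration, not the real implementation).
def enrich_errors(func):
    @functools.wraps(func)
    def wrapper(task_instance, *args, **kwargs):
        try:
            return func(task_instance, *args, **kwargs)
        except Exception:
            # The decorator assumes its first argument is a TaskInstance
            # and dereferences .task while tagging the event for Sentry.
            _ = task_instance.task  # AttributeError when given a LocalTaskJob
            raise
    return wrapper

class LocalTaskJob:
    # The mini scheduler is a method of the *job*, which has no .task
    # attribute, so the decorator's assumption does not hold here.
    @enrich_errors
    def _run_mini_scheduler_on_child_tasks(self):
        raise TypeError("cannot pickle 'dict_keys' object")

try:
    LocalTaskJob()._run_mini_scheduler_on_child_tasks()
except AttributeError as exc:
    # The original TypeError is shadowed by a second exception raised
    # inside the error-handling path, matching the traceback above.
    print(exc)  # 'LocalTaskJob' object has no attribute 'task'
```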
   
   ### What you expected to happen
   
   The error should be handled correctly and passed on to Sentry without raising another exception inside the error-handling code.
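
   One possible hardening, sketched under the assumption that the decorator may wrap callables whose first argument is not a TaskInstance (this is an illustration, not a proposed patch):

```python
import functools

# Hypothetical defensive variant of the decorator; the structure and the
# add_tagging placeholder are assumptions, not Airflow's actual fix.
def enrich_errors(func):
    @functools.wraps(func)
    def wrapper(obj, *args, **kwargs):
        try:
            return func(obj, *args, **kwargs)
        except Exception:
            # Only tag when the first argument actually carries a task;
            # otherwise re-raise the original error untouched.
            task = getattr(obj, "task", None)
            if task is not None:
                pass  # Sentry tagging would happen here
            raise
    return wrapper

class LocalTaskJob:
    @enrich_errors
    def _run_mini_scheduler_on_child_tasks(self):
        raise TypeError("cannot pickle 'dict_keys' object")

try:
    LocalTaskJob()._run_mini_scheduler_on_child_tasks()
except TypeError as exc:
    # The original exception now reaches the caller unmasked.
    print(exc)  # cannot pickle 'dict_keys' object
```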
   
   ### How to reproduce
   
   In this case we were trying to backfill a task for a DAG that, at that point, had a compilation error. This is quite an edge case, yes :-)
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
