brouberol opened a new issue, #48418:
URL: https://github.com/apache/airflow/issues/48418

   ### Apache Airflow version
   
   2.10.5
   
   ### If "Other Airflow 2 version" selected, which one?
   
   `apache-airflow==2.10.5+astro.2`
   
   ### What happened?
   
   [Operator extra links](https://airflow.apache.org/docs/apache-airflow/stable/howto/define-extra-link.html) don't seem to appear when the associated operator is executed within mapped tasks, even when running Airflow 2.10.5, which includes the fix for https://github.com/apache/airflow/issues/43757.
   
   ### What you think should happen instead?
   
   Operator extra links should appear on mapped tasks, the same way they do on regular (non-mapped) tasks.
   
   ### How to reproduce
   
   I have been able to reproduce this in 2 different environments:
   - an `astro` dev environment
   - `breeze start-airflow` with a local checkout of `airflow`, on the `2.10.5` git tag
   
   ## Astro
   ```shell
   $ pip install astro
   $ mkdir astro-repro
   $ cd astro-repro
   $ astro dev init
   ```
   I then define the following file under `plugins/grafana_link.py`:
   ```python
   import logging
   
   from airflow.models.baseoperator import BaseOperator
   from airflow.models.baseoperatorlink import BaseOperatorLink
   from airflow.models.taskinstancekey import TaskInstanceKey
   from airflow.plugins_manager import AirflowPlugin
   from airflow.operators.python import PythonOperator
   
   logger = logging.getLogger(__name__)
   
   
   class GrafanaDashboardKubernetesPodResourcesLink(BaseOperatorLink):
       """Generate a link to a Grafana dashboard displaying the operator's 
managed pod resource usage"""
   
       name = "GrafanaDashboardKubernetesPodResources"
   
       operators = [PythonOperator]
   
       def get_link(self, operator: BaseOperator, *, ti_key: TaskInstanceKey) -> str:
           logger.info("Running GrafanaDashboardKubernetesPodResourcesLink.get_log")
           return "https://grafana.wikimedia.org"
   
   
   class AirflowWikimediaDumpOperatorExtraLinkPlugin(AirflowPlugin):
       name = "pod_resources_grafana_dashboard_link"
       operator_extra_links = [
           GrafanaDashboardKubernetesPodResourcesLink(),
       ]
   ```
   
   I then define the following dag, under `dags/extra_link_mapped_dag.py`:
   ```python
   from airflow import DAG
   from airflow.operators.python import PythonOperator
   
   
   def addition(x, y):
       return x + y
   
   
   with DAG(dag_id="test_extra_link_mapped", schedule=None, catchup=False):
       PythonOperator.partial(
           task_id="simple_addition", python_callable=addition
       ).expand_kwargs([{"op_kwargs": {"x": 1, "y": 2}}, {"op_kwargs": {"x": 3, 
"y": 4}}])
   ```
   
   With that setup, I can see the 2 mapped task instances execute. However, their individual task instance pages do not show the expected extra link.
   
   
![Image](https://github.com/user-attachments/assets/b2f4393e-cf4d-4445-a347-9f08e0a02644)
   
   Looking at the scheduler logs, I'm not seeing the expected `Running GrafanaDashboardKubernetesPodResourcesLink.get_log` INFO log line.
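   For whoever picks this up: to help rule out a UI-only rendering issue, the extra links can also be queried through the stable REST API (`GET /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/links`). A small sketch building that URL for the reproduction DAG above — the run id is hypothetical, and I'm not sure whether this endpoint accepts a map index for mapped task instances in 2.10, so this is the unmapped form:

   ```python
from urllib.parse import quote


def extra_links_url(base: str, dag_id: str, run_id: str, task_id: str) -> str:
    """Build the stable REST API URL that lists a task instance's extra links."""
    # Percent-encode each path segment (run ids contain ':' and '+').
    return (
        f"{base.rstrip('/')}/api/v1/dags/{quote(dag_id, safe='')}"
        f"/dagRuns/{quote(run_id, safe='')}/taskInstances/{quote(task_id, safe='')}/links"
    )


# Hypothetical manual run id for the DAG defined above
url = extra_links_url(
    "http://localhost:8080",
    "test_extra_link_mapped",
    "manual__2024-01-01T00:00:00+00:00",
    "simple_addition",
)
print(url)
   ```

   Hitting that URL (with the webserver's auth) returns a JSON object mapping link names to URLs, so an empty mapping there would point at the link not being attached server-side rather than a front-end problem.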
   
   ## Breeze
   
   I'm observing the exact same behavior when running `breeze start-airflow` with a local checkout of airflow, on tag `2.10.5`.
   
   ### Operating System
   
   macOS (although Airflow is running in Docker)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   I'm able to reproduce this issue every time.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

