GitHub user apmechev edited a comment on the discussion: "XCom not found" error thrown by EMR Sensors after Airflow 3.0

Hi, thanks for the reply!
We're using the default XCom backend (database). Below are all the XComs for a task that raised the error. The `_link_*` XComs were pushed with an empty value. I'm not sure where they're being pushed from, since those keys don't appear anywhere in our code, though they do seem to originate from [here](https://github.com/apache/airflow/blob/bf34806f083ef075a1c23a323a77f2a022bba5cb/task-sdk/src/airflow/sdk/bases/operatorlink.py#L56).
LMK if I can provide any more context!

```
airflow=> select * from xcom where dag_run_id=640179 and task_id='ycp-reporting.check_steps';
 dag_run_id |          task_id          |         key          | value |           timestamp           |           dag_id           |                run_id                | map_index
------------+---------------------------+----------------------+-------+-------------------------------+----------------------------+--------------------------------------+-----------
     640179 | ycp-reporting.check_steps | _link_EmrClusterLink | ""    | 2025-06-23 09:17:45.268368+00 | ycp-reporting-pipeline-dev | scheduled__2025-06-23T09:00:00+00:00 |        -1
     640179 | ycp-reporting.check_steps | _link_EmrLogsLink    | ""    | 2025-06-23 09:17:45.318799+00 | ycp-reporting-pipeline-dev | scheduled__2025-06-23T09:00:00+00:00 |        -1
```
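For what it's worth, here's a minimal stand-in (plain Python, no Airflow; `XComNotFound` and `xcom_store` are hypothetical stand-ins for the database backend, not Airflow's actual API) showing the lookup pattern that matches the error: the table only holds the two `_link_*` keys, so a pull for `emr_logs` misses:

```python
# Hypothetical stand-in for the default (database) XCom backend lookup.
# The keys/values mirror the rows in the psql output above; the class and
# function names are illustrative, not Airflow's real implementation.
class XComNotFound(KeyError):
    """Stand-in for the 404 surfaced by the execution API client."""


# What the xcom table holds for this task instance (from the query above).
xcom_store = {
    "_link_EmrClusterLink": "",
    "_link_EmrLogsLink": "",
}


def xcom_pull(key: str) -> str:
    """Return the stored value, or fail like the 'XCom not found' log."""
    try:
        return xcom_store[key]
    except KeyError:
        raise XComNotFound(f"XCom with key={key!r} not found") from None


# The sensor's pull of 'emr_logs' misses, matching the Datadog log below,
# while the _link_* keys resolve (to empty strings).
```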
Related Error log:
<details>
<summary>Datadog Error log</summary>

```
{
  "run_id": "scheduled__2025-06-23T09:00:00+00:00",
  "status_code": 404,
  "level": "error",
  "map_index": -1,
  "task_id": "ycp-reporting.check_steps",
  "dag_id": "ycp-reporting-pipeline-dev",
  "detail": {
    "detail": {
      "reason": "not_found",
      "message": "XCom with key='emr_logs' map_index=-1 not found for task 'ycp-reporting.check_steps' in DAG run 'scheduled__2025-06-23T09:00:00+00:00' of 'ycp-reporting-pipeline-dev'"
    }
  },
  "event": "XCom not found",
  "key": "emr_logs",
  "timestamp": "2025-06-23T09:17:45.301904Z",
  "logger": "airflow.sdk.api.client"
}
```
</details>

<details>
<summary>Airflow Config.tpl</summary>

```
dags_folder = /s3-sync/dags
executor = {{ AIRFLOW_EXECUTOR }} --KubernetesExecutor
auth_manager = airflow.providers.fab.auth_manager.fab_auth_manager.FabAuthManager
plugins_folder = /usr/local/airflow/plugins
fernet_key = {{ FERNET_KEY }}
donot_pickle = False
dagbag_import_timeout = 300
sensitive_var_conn_names = _TOKEN,_API_KEY,SENTRY_DSN,PASSWORD,SECRET
execution_api_server_url = http://airflow.airflow.svc.cluster.local/execution/
[database]
sql_alchemy_conn = postgresql+psycopg2://{{ POSTGRES_USER }}:{{ POSTGRES_PASSWORD }}@{{ POSTGRES_HOST }}:{{ POSTGRES_PORT }}/{{ POSTGRES_DB }}
sql_alchemy_pool_size = 20
sql_alchemy_pool_recycle = 3600
[logging]
base_log_folder = /usr/local/airflow/logs
logging_level = WARNING
logging_config_class = json_logging.LOGGING_CONFIG
[metrics]
statsd_on = True
[traces]
[secrets]
[api]
secret_key = {{ JWT_SECRET_KEY }}
base_url = {{ BASE_URL }}
port = 5000
worker_timeout = 300
[workers]
[api_auth]
jwt_secret = {{ JWT_SECRET_KEY }}
[execution_api]
[lineage]
[operators]
[webserver]
instance_name = {{ ENVIRONMENT }}
[email]
email_backend = airflow.providers.amazon.aws.utils.emailer.send_email
email_conn_id = aws_default
html_content_template = /usr/local/airflow/email_templates/error_alert.html
from_email = [email protected]
[smtp]
[sentry]
[scheduler]
scheduler_heartbeat_sec = 20
[triggerer]
[kerberos]
[sensors]
[dag_processor]
refresh_interval = 311
[aws]
[aws_batch_executor]
[aws_ecs_executor]
[aws_auth_manager]
[local_kubernetes_executor]
[kubernetes_executor]
logs_task_metadata = True
pod_template_file = {{ AIRFLOW_HOME }}/pod-template.yml
worker_container_repository = ---.dkr.ecr.---.amazonaws.com/airflow
worker_container_tag = {{ RELEASE_NUMBER }}
namespace = airflow
worker_pods_creation_batch_size = 6
in_cluster = {{ RUNNING_IN_AWS }}
config_file = {{ AIRFLOW_HOME }}/kube_config
delete_option_kwargs = {"grace_period_seconds": 60}
[common.io]
[fab]
[standard]
[usage_data_collection]
enabled = False
```

</details>

GitHub link: https://github.com/apache/airflow/discussions/51652#discussioncomment-13549402
