shokosanma commented on issue #26960:
URL: https://github.com/apache/airflow/issues/26960#issuecomment-1272940150
Sorry, let me post more detail. My DAG looks like this:
```python
from airflow.utils.dates import days_ago
from airflow.decorators import dag
from airflow.sensors.python import PythonSensor
import time

args = {
    # Airflow above 1.8.0 requires a fixed start_date
    "start_date": days_ago(1),
    "catchup": None,
    "provide_context": True,
    "queue": "",
}


@dag(
    dag_id="test_sensor",
    default_args=args,
    schedule_interval=None,
    max_active_runs=2,
    tags=["test"],
)
def test():
    def bo():
        for i in range(10):
            time.sleep(1)
            print(i)
        return False

    ob = PythonSensor(
        task_id="ob",
        poke_interval=60,
        mode="reschedule",
        python_callable=bo,
        timeout=3600 * 6,
    )
    ob


d = test()
```
When the task is in the running/up_for_reschedule state, checking the log sends a GET request to
http://localhost:8080/get_logs_with_metadata?dag_id=test_sensor&task_id=ob&map_index=-1&execution_date=2022-10-10T07%3A27%3A13.805132%2B00%3A00&try_number=1&metadata=null
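For reference, the request above can be reconstructed outside the browser. A minimal sketch (the endpoint, DAG id, task id, and execution date are taken from the URL above; the webserver address is assumed to be localhost:8080) that builds the identical URL with the standard library:

```python
from urllib.parse import urlencode

# Rebuild the log-fetch URL the browser sends while the task is running.
# All parameter values are copied from the observed request above.
base = "http://localhost:8080/get_logs_with_metadata"
params = {
    "dag_id": "test_sensor",
    "task_id": "ob",
    "map_index": "-1",
    "execution_date": "2022-10-10T07:27:13.805132+00:00",
    "try_number": "1",
    "metadata": "null",
}

# urlencode percent-encodes ':' and '+' in the execution_date,
# reproducing the exact query string seen in the browser.
url = f"{base}?{urlencode(params)}"
print(url)
```

Fetching this URL directly (e.g. with `requests.get(url)` against a running webserver) can help confirm whether the endpoint itself still returns logs when the browser stops polling.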

When the task is in the up_for_reschedule state and is manually set to failed, or changes to the failed state for some unknown reason, the browser no longer sends the request above.

The console:

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]