bhavaniravi opened a new issue #9367: URL: https://github.com/apache/airflow/issues/9367
**Description**

Need a way to access the logs of tasks that ran at a particular point in time but are no longer part of the DAG.

**Use case / motivation**

I have a use case where tasks are generated based on the number of entries created in a DB after a timestamp. The DAG looks something like this:

```python
for study in read_db():
    PythonOperator(
        task_id="python_task_" + study["pk"],
        python_callable=process_data,  # `execute_callable` is not a PythonOperator argument
        op_kwargs={"study": study},
        dag=dag,
    )
```

Each task has a unique id based on the primary key of the entry in the DB. Once an item is processed, it is marked as successful and won't be processed again, so its task disappears from the DAG on the next parse. Trying to access the logs of these removed tasks produces the following error:

<img width="1419" alt="image" src="https://user-images.githubusercontent.com/10116000/84975691-ad801900-b143-11ea-85a8-6f715b0216df.png">

<img width="1140" alt="image" src="https://user-images.githubusercontent.com/10116000/84975644-92ada480-b143-11ea-9061-ea3fcb1e21a5.png">

Even though these tasks are no longer part of the DAG at that point in time, they are still displayed on the `task instance` page, but the error pops up when trying to access their logs.
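The mismatch described above can be sketched without Airflow at all. This is a hypothetical pure-Python illustration, not the reporter's real code: `read_db`, the row shape, and the `processed` flag are all assumptions. It only shows why task-instance records can outlive the task ids in the DAG when ids are derived from DB rows.

```python
# Hypothetical sketch: task ids are derived from unprocessed DB rows, so once
# a row is marked processed, its task id vanishes from the DAG on re-parse
# while old task-instance records (and their logs) still reference it.

def read_db(rows):
    """Stand-in for the real DB query: only unprocessed rows are returned."""
    return [r for r in rows if not r["processed"]]

def dag_task_ids(rows):
    """Task ids that would exist in the DAG at parse time."""
    return {"python_task_" + r["pk"] for r in read_db(rows)}

rows = [
    {"pk": "101", "processed": False},
    {"pk": "102", "processed": False},
]

recorded_instances = dag_task_ids(rows)  # ids the scheduler ran and logged

rows[0]["processed"] = True              # row 101 gets marked as done
current = dag_task_ids(rows)             # DAG re-parsed afterwards

orphaned = recorded_instances - current  # instances with logs but no task
print(sorted(orphaned))
```

Under these assumptions, `python_task_101` ends up as a task instance with logs on disk but no matching task in the parsed DAG, which matches the error shown in the screenshots.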