If you want to keep the rest of your history, you can:

1. turn the DAG off
2. delete its bad task instances and delete the bad DAG run (a rough sketch of this step is below)
3. turn the DAG on
4. let it backfill, or hit the play button to trigger it manually, depending on your needs

Unfortunately this does not preserve the history of the task you are
working with, but it's far better than dropping the whole database.
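
In case it's useful, here is a rough sketch of step 2 done through
Airflow's own ORM session instead of the UI. The dag_id and execution
date below are placeholders, and you'll want to double-check which rows
you are about to remove before committing anything:

    # sketch only: remove the bad task instances and the bad DAG run for
    # one execution date, leaving the rest of the history untouched
    from datetime import datetime

    from airflow import settings
    from airflow.models import DagRun, TaskInstance

    DAG_ID = "example_dag"            # placeholder DAG id
    EXEC_DATE = datetime(2018, 2, 5)  # placeholder execution date of the bad run

    session = settings.Session()

    # delete the bad task instances for that run
    session.query(TaskInstance).filter(
        TaskInstance.dag_id == DAG_ID,
        TaskInstance.execution_date == EXEC_DATE,
    ).delete(synchronize_session=False)

    # delete the bad DAG run itself
    session.query(DagRun).filter(
        DagRun.dag_id == DAG_ID,
        DagRun.execution_date == EXEC_DATE,
    ).delete(synchronize_session=False)

    session.commit()
    session.close()

Pausing and unpausing (steps 1 and 3) can be done with the on/off toggle
in the UI or with the "airflow pause" / "airflow unpause" CLI commands.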

Best,

Trent Robbins
Strategic Consultant for Open Source Software
Tau Informatics LLC
desk: 415-404-9452
cell: 513-233-5651
tr...@tauinformatics.com
https://www.linkedin.com/in/trentrobbins

On Wed, Feb 7, 2018 at 2:57 PM, Ananth Durai <vanant...@gmail.com> wrote:

> We can't do that, unfortunately. Airflow schedules tasks based on the
> current state in the DB. If you would like to preserve the history, one
> option would be to add instrumentation in airflow_local_settings.py.
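>
> For example (just a sketch, and only one possible shape for that
> instrumentation): the cluster policy hook in airflow_local_settings.py
> is applied to every task as DAGs are parsed, so you could use it to
> attach callbacks that copy task outcomes somewhere outside the metadata
> DB, where a later clear cannot erase them. record_state below is a
> made-up helper, not something Airflow ships:
>
>     # airflow_local_settings.py
>     import logging
>
>     log = logging.getLogger(__name__)
>
>     def record_state(context):
>         # hypothetical helper: log (or ship to an external store) the
>         # final state of each task instance
>         ti = context["task_instance"]
>         log.info("%s.%s %s -> %s", ti.dag_id, ti.task_id,
>                  ti.execution_date, ti.state)
>
>     def policy(task):
>         # Airflow calls this for every task as DAGs are parsed; note it
>         # would overwrite any callbacks the DAG already sets
>         task.on_success_callback = record_state
>         task.on_failure_callback = record_state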
>
> Regards,
> Ananth.P,
>
> On 5 February 2018 at 13:09, David Capwell <dcapw...@gmail.com> wrote:
>
> > When a production issue happens it's common that we clear the history to
> > get Airflow to run the task again.  This is problematic since it throws
> > away the history, making it harder to find out what really happened.
> >
> > Is there any way to rerun a task without deleting from the DB?
> >
>
