I have an application that uses libevent timers to trigger periodic
database operations.

Here is a general overview of how the timer is used (a simplified code
sketch follows the list):
1. A timer is set to trigger at a specific time.
2. The process continues to run other operations.
3. The timer triggers at the right time, performs the DB operations, and
re-arms itself to trigger after a specific number of seconds (within the
next 5 minutes).
4. The process continues to run other operations.
5. The timer triggers, but earlier than the number of seconds set in step 3.
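
In case it helps, below is a simplified sketch of how the timer is set up
and re-armed. The names (do_db_operations, g_timer), the initial delay,
and the fixed 300-second re-arm interval are placeholders rather than the
actual code; the real interval is computed at run time.

#include <event2/event.h>
#include <sys/time.h>

static struct event *g_timer;   /* the periodic DB timer */

/* Placeholder for the real DB work, which can take a while. */
static void do_db_operations(void) { /* ... */ }

static void timer_cb(evutil_socket_t fd, short events, void *arg)
{
    (void)fd; (void)events; (void)arg;

    do_db_operations();

    /* Re-arm the timer to fire again after the next interval (step 3).
     * The real interval is computed at run time, but it is always
     * within the next 5 minutes. */
    struct timeval next = { 300, 0 };
    evtimer_add(g_timer, &next);
}

int main(void)
{
    struct event_base *base = event_base_new();

    /* One-shot timer; it is re-armed from its own callback. */
    g_timer = evtimer_new(base, timer_cb, NULL);

    /* Initial arming (step 1); the delay here is just an example. */
    struct timeval first = { 60, 0 };
    evtimer_add(g_timer, &first);

    event_base_dispatch(base);

    event_free(g_timer);
    event_base_free(base);
    return 0;
}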

The first few timer triggers look correct. After that, the timer initially
fires only a few seconds earlier than the interval set in step 3, but the
error keeps growing. The magnitude of the error, i.e. how early the timer
fires, seems to depend on how long the DB operations inside the timer's
event handler take. Eventually the timer fires at unpredictable times, and
the DB operations end up running at odd times.

Any idea why this might be happening?

Thanks,
Elan.
