Hey,

My app has about 800MB of data that I need to delete, so I set up a
cron job that runs every minute and executes the following code:

while TrackingResult.all().count(1) > 0:
    db.delete(db.GqlQuery("SELECT __key__ FROM TrackingResult").fetch(100))

TrackingResult is my entity model holding the 800MB of data I want
gone. The problem is, this has been running for two days now and only
~350MB has been deleted. Worse, each day, after about 200MB of data is
deleted, all of my CPU quota is used up within a few hours and I have
to wait until the next quota reset to keep deleting.

I have no idea why this is taking so long, or whether there is a more
efficient way to do it. I do have about 15 indexes, so maybe those are
adding to the cost? I really wish there were a way to "truncate" the
table...

Does anybody have any help/advice on what I may be doing wrong? The
entity is an Expando model with no more than 10-12 properties, one of
which is a reference property (but all references were already
deleted).
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com
To unsubscribe from this group, send email to 
google-appengine+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en
-~----------~----~----~----~------~----~------~--~---
