Doing it over the remote API means you are going to transfer all your data,
plus transmission overhead, over the wire. You are probably better off doing
something like this on the server side through an admin-protected handler.
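As a rough illustration, the loop such a handler could run is sketched below in plain Python. Here `fetch_keys` and `delete_keys` are stand-ins for App Engine calls (a keys-only query's `fetch()` and `db.delete()` respectively), and the 500 batch size reflects the per-call limit on `db.delete` — the function names and wiring are my own, not from the original post:

```python
def purge_all(fetch_keys, delete_keys, batch_size=500):
    """Repeatedly fetch a batch of keys and delete them until none remain.

    fetch_keys(n) stands in for something like
    ModelClass.all(keys_only=True).fetch(n), and delete_keys(keys)
    stands in for db.delete(keys), which accepts at most 500 keys
    per call. Returns the total number of keys deleted.
    """
    total = 0
    while True:
        keys = fetch_keys(batch_size)
        if not keys:
            return total
        delete_keys(keys)
        total += len(keys)
```

Fetching keys only (rather than full entities) avoids pulling entity bodies you are about to throw away.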

Also, if you happen to know the keys of your data (e.g. because you used
key_name), your deletes will be a lot more efficient if you give db.delete a
list of keys instead of full entities.
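The batching side of that idea can be sketched in plain Python; `delete_fn` below stands in for `db.delete`, which takes at most 500 keys per call (the helper names and batch size are illustrative, not from this thread):

```python
def chunked(seq, size):
    """Yield successive slices of seq, each at most `size` long."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

def delete_by_keys(keys, delete_fn, batch_size=500):
    """Delete entities by key in batches.

    delete_fn(keys) stands in for db.delete(keys); batching keeps each
    call under the 500-key limit. Returns the number of keys deleted.
    """
    deleted = 0
    for batch in chunked(keys, batch_size):
        delete_fn(batch)
        deleted += len(batch)
    return deleted
```

Because no entities are fetched at all, this skips the read cost entirely and only pays for the deletes.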

On Sat, Apr 25, 2009 at 2:41 PM, Sri <sri.pan...@gmail.com> wrote:

>
> Hi,
>
>    Is there a way to completely erase the production data store?
>
> Currently I am using a script like this via the remote api:
>
> def delete_all_objects(obj_class):
>    num_del = 300
>    while True:
>        try:
>            objs = obj_class.all().fetch(1000)
>            num_objs = len(objs)
>            if num_objs == 0:
>                return
>            print "Deleting %d/%d objects of class %s" % (
>                min(num_del, num_objs), num_objs, str(obj_class))
>            db.delete(objs[:num_del])
>        except Timeout:
>            print "Timeout error - continuing ..."
>
> But with 30000 entities in the data store and another 3 million (yep,
> that's right) coming, doing a clear this way is extremely slow.
>
> Any ideas?
>
> cheers
> Sri
>


-- 

Alkis

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com
To unsubscribe from this group, send email to 
google-appengine+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en
-~----------~----~----~----~------~----~------~--~---