Hey Eli,
Thanks for the additional info. I'll have to watch for similar
issues, as we are getting ready to test loading and deleting lots of
data again.
Just a note: we use self-sizing batches to help speed up the
deletes. We start with batch sizes of 100, then keep stepping it up by
15% until we hit a timeout.
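A minimal sketch of that self-sizing idea in plain Python, with the actual db.delete() call injected so the stepping logic stands on its own (the function name `delete_in_batches` and the injected `delete_fn` are mine, not from this thread):

```python
def delete_in_batches(keys, delete_fn, start=100, growth=1.15):
    """Delete `keys` in self-sizing batches.

    `delete_fn(batch)` stands in for db.delete(); it should raise on a
    timeout. Each successful batch grows the next one by 15%; a failure
    drops the batch size back to the starting floor and retries.
    """
    size = start
    i = 0
    while i < len(keys):
        batch = keys[i:i + size]
        try:
            delete_fn(batch)
            i += len(batch)
            size = int(size * growth)  # step up 15% on success
        except Exception:
            size = start  # back off to the floor after a timeout
    return size
```

On App Engine this would be called with `delete_fn=db.delete` over keys fetched with a keys-only query.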
Do you mean on the Main page of the Dashboard for my app? No, there was no
indication of throttling there.
Mainly, I would get timeouts from appengine_console.py when running this
command:
result = db.GqlQuery("Select __key__ from MyBigModel").fetch(1)
while fetching from another Model would work fine.
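When a probe like that times out intermittently, a small retry wrapper helps; a generic sketch (on App Engine the exception to catch would be `db.Timeout`, passed in here as a parameter rather than hard-coded):

```python
import time

def retry(fn, retries=3, delay=1.0, exc=Exception):
    """Call fn(), retrying on `exc` with a doubling delay between tries."""
    for attempt in range(retries):
        try:
            return fn()
        except exc:
            if attempt == retries - 1:
                raise  # out of attempts; let the caller see the error
            time.sleep(delay)
            delay *= 2
```

Usage against the datastore would look something like `retry(lambda: db.GqlQuery("Select __key__ from MyBigModel").fetch(1), exc=db.Timeout)`.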
Hi Eli,
Did you happen to look at the App Engine Console after the datastore
was "napping"? A few months ago when clearing a test datastore we hit
a similar thing. When we looked at the App Engine Console it said the
app was being temporarily throttled.
Was just curious if that is what you experienced.
As a side note.. once I hit about 250,000 entities deleted it seems the
datastore took a nap on me..
So, now I'm waiting for it to finish whatever it is doing underneath before
I can continue deleting.
Took a nap = db.delete() or .fetch(1) from the model times out.
Though, I can .fetch() from my other models just fine.
I am currently going through the process of deleting 500,000 entities from
my datastore.
Here are the different stats I have so far
db.delete() for:
100 entities = 2,179 API_CPU
200 entities = 4,345 API_CPU
500 entities = 10,845 API_CPU
So.. it doesn't seem like you get better per-entity API_CPU with larger batches.
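Working the per-entity numbers out explicitly shows how flat the cost is, roughly 21.7 API_CPU per entity at every batch size:

```python
stats = {100: 2179, 200: 4345, 500: 10845}  # entities -> total API_CPU
per_entity = {n: cost / n for n, cost in stats.items()}
# all three land between 21.6 and 21.8 API_CPU per entity
```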
Thanks, Andrew. But after the change, it still costs me 2130 cpu_ms /
2112 api_cpu_ms :)
On Sat, Feb 20, 2010 at 5:22 PM, Andrew Chilton wrote:
> On 20 February 2010 21:21, kang wrote:
> > I'm going to clear the datastore. I use the following code:
> > old_date = datetime.datetime(2009,10,1)
>
On 20 February 2010 21:21, kang wrote:
> I'm going to clear the datastore. I use the following code:
> old_date = datetime.datetime(2009,10,1)
> old_updates = SomeUpdate.all().filter("updated <",old_date).fetch(20)
> db.delete(old_updates)
> it costs me nearly 1982 cpu_ms / 1945 api_cpu_ms
I'm going to clear the datastore. I use the following code:
old_date = datetime.datetime(2009,10,1)
old_updates = SomeUpdate.all().filter("updated <",old_date).fetch(20)
db.delete(old_updates)
it costs me nearly 1982 cpu_ms / 1945 api_cpu_ms every time. Is that normal?
--
Stay hungry, Stay foolish
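For a cleanup like the one above, the usual pattern is to loop in small pages until the query comes back empty, fetching keys only so entity bodies are never pulled. A sketch with the datastore calls injected so the loop stands alone (the names `purge`, `fetch_batch`, and `delete_batch` are mine):

```python
def purge(fetch_batch, delete_batch, page=20):
    """Repeatedly fetch up to `page` matching items and delete them
    until the query returns nothing. Returns the total deleted."""
    total = 0
    while True:
        batch = fetch_batch(page)
        if not batch:
            return total
        delete_batch(batch)
        total += len(batch)
```

On App Engine, `fetch_batch` might be `lambda n: SomeUpdate.all(keys_only=True).filter("updated <", old_date).fetch(n)` and `delete_batch` would be `db.delete`; a keys-only fetch is cheaper than pulling full entities, though the delete cost itself stays the same.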