Hey Dan and Boson,

Thanks very much for your time, guys.

When you call:
CategoryRows = db.GqlQuery(QueryString)
It's fast.

It's the loop where you try to access the returned entities that
is slow. Doesn't it follow that CategoryRows has the keys in it,
and each next() in the Python loop is fetching the actual entity?
Every 20th next() call also takes extra time, as if it's fetching
20 at a time from somewhere.
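Here's a minimal pure-Python sketch of that behavior (the names and the batch size of 20 are my assumptions for illustration, not the actual App Engine internals): the query object is lazy, and the real cost is paid in round trips during iteration.

```python
BATCH_SIZE = 20          # assumed batch size, matching the every-20th-call stall
calls = {"count": 0}     # counts simulated datastore round trips

def fake_datastore_fetch(offset, limit):
    """Stand-in for a round trip to the datastore (the expensive part)."""
    calls["count"] += 1
    return [{"key": i} for i in range(offset, offset + limit)]

def lazy_results(total):
    """Yields entities one at a time; a round-trip cost is paid only on
    every BATCH_SIZE-th next() call, when a fresh batch is pulled."""
    fetched = 0
    while fetched < total:
        batch = fake_datastore_fetch(fetched, min(BATCH_SIZE, total - fetched))
        fetched += len(batch)
        for entity in batch:
            yield entity  # cheap, except right after a new batch was fetched

rows = lazy_results(100)   # "fast" -- nothing has been fetched yet
entities = list(rows)      # the slow part: 5 round trips of 20 each
```

If this model is right, it explains why the GqlQuery call itself returns instantly while the loop stalls periodically.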

I don't think the problem is anything specific to BigTable. The thing
is just about 10 times too slow. It needs to be able to load about
1,000 records at most to make the returned record count a
non-issue, and about 50 properties to make the property
count a non-issue. If it could return 1,000 entities, or
50,000 properties, in 0.5 seconds, the problem would go away.

You can cache them for the next user. We just need the query to go
off at full cost once.
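That fetch-once-and-cache idea can be sketched like this. On App Engine you'd use the `google.appengine.api.memcache` service (`memcache.get`/`memcache.set`); here a plain dict stands in so the sketch is self-contained, and the names and cache key are illustrative assumptions.

```python
cache = {}  # stand-in for memcache; swap for memcache.get/memcache.set on GAE

def expensive_query():
    """Stand-in for running the GQL query and materializing all rows."""
    return [{"item": i} for i in range(1000)]

def get_category_rows(cache_key="category_rows"):
    """Pay the datastore cost once; later users get the cached copy."""
    rows = cache.get(cache_key)
    if rows is None:
        rows = expensive_query()   # the one slow trip
        cache[cache_key] = rows    # subsequent requests hit the cache
    return rows
```

The first request eats the 3 seconds; every request after that should be nearly free until the cache entry expires or is invalidated.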

Right now it takes 3 seconds, which gets billed as '30 processor
seconds', and the thing flips out.

To the smart guys at Google: putting a couple hundred inventory
lines in front of a user seems like something App Engine should
be able to do.

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com
To unsubscribe from this group, send email to 
google-appengine+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en
-~----------~----~----~----~------~----~------~--~---