Hello All,

I am facing a weird problem while doing some processing through App
Engine's RemoteAPI. Here is the scenario:

* I have more than 600k entities of a single kind in the datastore. I am
using a sharded counter, which gave me this count.
* I am trying to iterate over ALL of those entities using the code
snippet given below.
* The problem is that this code does not read all 600k entities from the
datastore. The loop finishes after about 125k records, when it should
have processed all 600k. I think the issue is with the '__key__'
property of App Engine's datastore.

Can anyone help me debug this problem?

Thanks,
Ankur



--------------------
import traceback

def download_data(my_datastore_table):
    KIND = my_datastore_table  # the db.Model class whose entities we iterate over
    batchsize = 200
    cnt = 0

    # First batch, ordered by key so we can page with a '__key__ >' filter.
    results = KIND.all().order('__key__').fetch(batchsize)
    while results:
        num_fetched = len(results)
        last_key = results[-1].key()
        try:
            doSomethingWithResults(results)  # my processing function
            cnt = cnt + num_fetched
            print "%s records processed" % cnt
        except:
            traceback.print_exc()

        # Next batch: entities whose key is strictly greater than the last one seen.
        results = (KIND.all()
                       .filter('__key__ >', last_key)
                       .order('__key__')
                       .fetch(batchsize))
--------------------
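
In case it helps with debugging: below is a rough keys-only cross-check I am
considering, to see how many entities of this kind the datastore actually
returns, independent of the sharded counter. It uses the same key-ordered
paging idea; count_entities is just an illustrative helper I have not run
against the full dataset yet, and MyKind stands in for my model class.

--------------------
def count_entities(model_class, batchsize=1000):
    # Page through keys only (cheaper than fetching full entities), ordered by key.
    total = 0
    keys = model_class.all(keys_only=True).order('__key__').fetch(batchsize)
    while keys:
        total += len(keys)
        last_key = keys[-1]
        keys = (model_class.all(keys_only=True)
                           .filter('__key__ >', last_key)
                           .order('__key__')
                           .fetch(batchsize))
    return total

# e.g. print "actual datastore count: %d" % count_entities(MyKind)
--------------------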
