Martin,

Thanks for the info, but I explicitly created all the necessary indexes.
When debugging this issue I felt there might be a problem in building the
indexes: if I just open and save the missing record, that record is then also
included in the result set. So I think the record update has something to do
with the query execution. Any clues?

Adhi

On Nov 9, 10:28 am, Jason Smith <j...@proven-corporation.com> wrote:
> I have the same problem, which I wrote about on Stack Overflow but
> received no response.
>
> http://stackoverflow.com/questions/1691792/query-gqlquery-order-restr...
>
> My models require the property in question and I manually confirmed
> that they are all present, so it is not an issue of queries not
> returning entities with missing properties. I am stuck with this
> problem, and currently I am working around it by fetching all the data
> and sorting in memory. Fortunately I can get away with that, as it's a
> small data set and an infrequent query.
>
> On Nov 7, 12:53 am, Adhi <adhi.ramanat...@orangescape.com> wrote:
> > Yes, I've tried using order by as well, but it gives a different
> > result set. When using order by I got only 842 records, but without
> > order by I got 1251, whereas my actual record count should be >1260.
> > And when I change the fetch size I get a different count.
> >
> > Here is my code...
> > def get_serialized_data(entityClass, params):
> >     query = entityClass.all()
> >     query.order('__key__')
> >     for filterColumn, filterValue in params.iteritems():
> >         query.filter(filterColumn, filterValue)
> >     limit = 400
> >     offset = 0
> >     totalLimit = 800
> >     lastRecordKey = None
> >     n = 0
> >     entities = query.fetch(limit, offset)
> >     while entities and offset <= (totalLimit - limit):
> >         lastRecordKey = entities[-1].key()
> >         n += len(entities)
> >         # My serialization code here
> >         offset += limit
> >         if len(entities) == limit:
> >             entities = query.fetch(limit, offset)
> >         else:
> >             entities = None
> >     entities = None
> >     return (n >= totalLimit, lastRecordKey)
> >
> > def download_data():
> >     params = {'ApplicationId': applicationId, 'Deleted': False,
> >               'SheetMetadataId': 'Sheet003'}
> >     (moreRecords, lastRecordKey) = get_serialized_data(PrimaryData, params)
> >     while moreRecords:
> >         params['__key__ >'] = lastRecordKey
> >         (moreRecords, lastRecordKey) = get_serialized_data(PrimaryData, params)
> >
> > download_data()
> >
> > Each batch fetches 800 records; if I use q.fetch(800) it gives a
> > Timeout, so I've used an offset.
> > As per the documentation in
> > http://code.google.com/appengine/articles/remote_api.html they haven't
> > specified adding an order by for __key__, so I thought it was implicit.
> > That's why I initially tried without order by.
> > Am I doing anything wrong?
> >
> > Now I'm trying to delete and recreate the indexes because of this
> > problem, but it is still in the deleting state.
> >
> > Adhi
> >
> > On Nov 6, 7:13 pm, Eli Jones <eli.jo...@gmail.com> wrote:
> > > Always post a full code snippet.
> > > Aren't you supposed to use Order By when paging by key?
> > >
> > > On 11/6/09, Adhi <adhi.ramanat...@orangescape.com> wrote:
> > > > Hi,
> > > > Sometimes I am not getting the complete result set from GQL even
> > > > though the missing records satisfy the condition. I have proper
> > > > indexes. Total records for that query will be around 1300.
> > > > So, I'm not fetching the records in a single fetch; I'm using
> > > > __key__ > last_record_key to get them in batches.
> > > >
> > > > Why this anomaly? Is there anything I am missing here?
> > > >
> > > > Adhi
> > >
> > > --
> > > Sent from my mobile device

--
You received this message because you are subscribed to the Google Groups
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com
To unsubscribe from this group, send email to
google-appengine+unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/google-appengine?hl=en
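For readers hitting the same symptom: the pattern Eli hints at (always `order('__key__')` and filter `__key__ > last_key` between batches) can be contrasted with offset-based paging in a small stand-alone sketch. This is plain Python, not datastore code — `unstable_fetch`, `fetch_after`, and the record list are invented here to simulate the behaviour, under the assumption that without an explicit order by the backend's result order is not guaranteed to be stable across fetches.

```python
import random

RECORDS = list(range(1300))  # stand-in keys for ~1300 entities

def unstable_fetch(limit, offset):
    """Simulates a backend with no guaranteed result order."""
    rows = RECORDS[:]
    random.shuffle(rows)
    return rows[offset:offset + limit]

def page_by_offset(limit=400):
    """Offset paging over an unstable order: rows can repeat or be skipped."""
    seen, offset = [], 0
    while True:
        batch = unstable_fetch(limit, offset)
        if not batch:
            break
        seen.extend(batch)
        offset += limit
    return seen

def fetch_after(last_key, limit):
    """Key-ordered batch: like WHERE __key__ > last_key ORDER BY __key__."""
    rows = sorted(RECORDS)
    if last_key is not None:
        rows = [r for r in rows if r > last_key]
    return rows[:limit]

def page_by_key(limit=400):
    """Resume each batch from the last key seen, in key order."""
    seen, last_key = [], None
    while True:
        batch = fetch_after(last_key, limit)
        if not batch:
            break
        seen.extend(batch)
        last_key = batch[-1]
    return seen

random.seed(42)
offset_result = page_by_offset()
key_result = page_by_key()
# Offset paging typically returns some rows twice and misses others,
# which is consistent with the varying counts (842 vs 1251) reported above.
# Key-ordered paging returns every row exactly once, in key order:
assert key_result == sorted(RECORDS)
```

Note that Adhi's code mixes both styles: key filtering *between* 800-record batches, but plain offsets *within* a batch, so any instability inside a batch still loses or duplicates rows.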