Actually, even just fetching one entity by key will frequently cause a timeout. My logs are full of these...
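DatastoreTimeoutException is documented as a transient error that may succeed if the operation is retried, so a common workaround for flaky single-entity fetches is a small retry loop with backoff. Below is a minimal, self-contained sketch: `TransientTimeout` is a stand-in for the SDK's `DatastoreTimeoutException` (so the example compiles without the App Engine jars), and the attempt count and backoff are arbitrary choices, not App Engine recommendations.

```java
import java.util.concurrent.Callable;

public class RetryUtil {
    /** Stand-in for com.google.appengine.api.datastore.DatastoreTimeoutException,
        assumed here so the sketch compiles without the App Engine SDK. */
    public static class TransientTimeout extends RuntimeException {}

    /** Run op, retrying on a transient timeout up to maxAttempts times,
        with a short linear backoff between attempts. */
    public static <T> T withRetries(Callable<T> op, int maxAttempts) throws Exception {
        for (int attempt = 1; ; attempt++) {
            try {
                return op.call();            // success: return the result
            } catch (TransientTimeout e) {
                if (attempt >= maxAttempts) throw e;  // out of attempts: give up
                Thread.sleep(50L * attempt);          // back off, then retry
            }
        }
    }
}
```

In a real handler you would catch the SDK's own timeout exception instead; keep `maxAttempts` small so a genuinely overloaded request still fails fast within the request deadline.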
On Apr 30, 2:59 pm, Sylvain <sylvain.viv...@gmail.com> wrote:
> For my app, I never fetch more than 250 entities, because I've seen
> that if this value is bigger you raise too many datastore timeouts.
> But even with 250 entities (with a very basic Kind) sometimes I get a timeout.
>
> One "funny" thing is that you can fetch up to 1000 entities (whatever
> the Kind, number of attributes, ...), but in fact it doesn't work -> timeout.
>
> On 30 avr, 17:45, barabaka <oleg.g...@gmail.com> wrote:
> >
> > Well, I've read a lot of posts about the Google datastore and the problems
> > with batch operations, the relational approach to arranging data in Bigtable,
> > etc., but I always thought the problem wasn't the datastore itself but
> > the way people use it. Now I can see from my own experience that it
> > behaves in an unpredictable way. I deployed a test Java app that
> > tries to clear 500 (a guaranteed amount!) entries per request. All
> > entries are in the same entity group, and the delete is executed as a batch
> > in a single transaction. All operations use the low-level API,
> > so no framework overhead is involved. Here is the sample code and the logs:
> >
> > Code (cut):
> > =============
> > Query q = new Query(World.class.getSimpleName()); // create query
> > Iterator<Entity> i = datastoreService.prepare(q).asIterator();
> > int idx = 0;
> > while (i.hasNext() && idx < 500) {
> >     keys.add(i.next().getKey());
> >     idx++;
> > }
> >
> > // delete keys in batch
> > Transaction t = datastoreService.beginTransaction();
> > datastoreService.delete(keys);
> > t.commit();
> > ==============
> >
> > 1st request (all goes well, 500 entries removed)
> > -------------------------------------------------
> > I 04-30 07:52AM 02.091 org.itvn.controller.TvnController clearDbBySize: Reading 500 entity keys...
> > I 04-30 07:52AM 03.832 org.itvn.controller.TvnController clearDbBySize: Removing keys by groups, total groups: 1
> > I 04-30 07:52AM 03.832 org.itvn.controller.TvnController clearDbBySize: Trying to remove 500 entities...
> > I 04-30 07:52AM 07.873 org.itvn.controller.TvnController clearDbBySize: Removed 500 entities.
> >
> > 2nd request - timeout exception on the READ operation (i.hasNext())
> > -------------------------------------------------
> > I 04-30 07:52AM 22.719 org.itvn.controller.TvnController clearDbBySize: Reading 500 entity keys...
> > W 04-30 07:52AM 26.551 Nested in org.springframework.web.util.NestedServletException: Request processing failed; nested exception is com.google.appengine.api.datastore.Datas
> > W 04-30 07:52AM 26.552 /clear_db/500 com.google.appengine.api.datastore.DatastoreTimeoutException: datastore timeout: operation took too long. at com.google.appengine.api.d
> > C 04-30 07:52AM 26.555 Uncaught exception from servlet com.google.appengine.api.datastore.DatastoreTimeoutException: datastore timeout: operation took too long. at com.goog
> >
> > Here we go: the first request executes fine, and the next one (only a few
> > seconds later) fails! Note that this is only a test application, with
> > no load at all. Am I doing something wrong? What's the RELIABLE way to
> > read/remove 500 entities? Is it a problem with the quantity (500)? If so,
> > how many entities can be read without a timeout? Can someone give a
> > reasonable answer to this? If you need more details about the app, I can
> > share this test case publicly.
> >
> > Oleg

You received this message because you are subscribed to the Google Groups "Google App Engine" group.
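Since Sylvain reports far fewer timeouts when staying at or below 250 entities, one mitigation for the 500-key delete above is to split the key list into smaller chunks and issue one delete call per chunk. Here is a minimal, SDK-independent partitioning helper; the batch size used in practice (e.g. 100) is an assumption, not a documented App Engine limit.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchUtil {
    /** Split items into consecutive chunks of at most batchSize elements,
        preserving order; the last chunk may be smaller. */
    public static <T> List<List<T>> partition(List<T> items, int batchSize) {
        if (batchSize <= 0) throw new IllegalArgumentException("batchSize must be positive");
        List<List<T>> batches = new ArrayList<>();
        for (int start = 0; start < items.size(); start += batchSize) {
            int end = Math.min(start + batchSize, items.size());
            // copy the view so each batch is independent of the source list
            batches.add(new ArrayList<>(items.subList(start, end)));
        }
        return batches;
    }
}
```

Each chunk would then be deleted in its own call (or its own transaction). Note this trades away the single-transaction atomicity of the original code: a mid-way timeout leaves some chunks deleted and others not, so the caller must be prepared to re-run.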