On Friday, August 17, 2012 5:52:33 PM UTC+2, Jeff Schnitzer wrote:
> It won't fit into the free quota (just to rebuild the download once an hour will be hundreds of thousands of read ops per day)
Group articles and bundle them together as mini-blobs before creating the large downloadable blob. Recalculate only the mini-blobs that have changes, e.g. once every 10 minutes, and rebuild the big one every hour. I'm not sure how naturally grouped the data is, but even an artificial grouping of, say, 500 articles per blob, grouped by id, should work. If there's a way to put often-changed articles together, you'll save even more.

Richard
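A minimal sketch of the bucketing idea in plain Python (the names, data shapes, and the 500-articles-per-bucket grouping are illustrative assumptions, not App Engine APIs):

```python
# Sketch: bucket articles into mini-blobs of BUCKET_SIZE ids each,
# rebuild only the dirty buckets, then concatenate all cached
# mini-blobs into the big downloadable blob.

BUCKET_SIZE = 500  # articles per mini-blob (arbitrary choice)

def bucket_of(article_id):
    """Artificial grouping by id: 0..499 -> bucket 0, 500..999 -> 1, ..."""
    return article_id // BUCKET_SIZE

def rebuild_dirty_buckets(articles, bucket_cache, changed_ids):
    """Recompute only the mini-blobs whose articles changed.

    articles:     {article_id: text}
    bucket_cache: {bucket_index: serialized mini-blob}
    changed_ids:  ids modified since the last run (e.g. last 10 minutes)
    """
    dirty = {bucket_of(a) for a in changed_ids}
    for b in dirty:
        members = sorted(i for i in articles if bucket_of(i) == b)
        bucket_cache[b] = "\n".join(articles[i] for i in members)
    return dirty  # which buckets were rebuilt, for read-op accounting

def build_big_blob(bucket_cache):
    """Hourly job: concatenate the cached mini-blobs in order.

    Reads only the cached blobs, not the individual articles.
    """
    return "\n".join(bucket_cache[b] for b in sorted(bucket_cache))
```

The point is that the hourly big-blob rebuild touches only the (few hundred) cached mini-blobs instead of every article, so datastore read ops scale with the number of buckets plus the number of changed articles, not the total article count.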