Task Queues, that's what I'm using now. Updating a Word document of
about ten pages takes roughly 1500 ms on average (at least for me
=) ). And this has a big advantage: you can take control of retries
if Docs is temporarily unavailable.
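The retry behavior you get from the queue can be sketched in plain Python. This is only a model of what the queue does for you (the names here are hypothetical, not the actual App Engine taskqueue API): when the task handler raises, the queue re-runs it later, so a temporary Docs outage just means another attempt.

```python
import time


def run_task_with_retries(doc_id, fetch, max_retries=5, base_delay=0.0):
    """Model of the queue's retry loop: re-run the handler until it
    succeeds or the retry limit is hit, backing off between attempts.
    (App Engine does this for you when a task handler raises.)"""
    for attempt in range(max_retries + 1):
        try:
            return fetch(doc_id)  # the task handler body
        except Exception:
            if attempt == max_retries:
                raise
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff


# Example: a fetch that fails twice (Docs temporarily unavailable),
# then succeeds on the third attempt.
calls = {"n": 0}


def flaky_fetch(doc_id):
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("Docs temporarily unavailable")
    return "updated %s" % doc_id
```

With a user-facing request you would have had to fail outright; inside a task, the raise simply schedules a retry.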

Another useful option would be to cache some commonly used data in the
datastore. For example, I have a downloads section containing some
documents stored in Google Docs. Hourly (it could be every five
minutes or every three days, whatever you want), a cron job gets the
titles, the URLs of the documents, etc. and saves them in the
datastore. Then I don't have to deal with Docs when building responses
to a user; I just use the locally cached data.
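A sketch of that cron-refreshed cache, using an in-memory dict in place of the datastore so it stays self-contained (real code would use a db.Model entity and the gdata client; all names and sample data below are made up for illustration):

```python
import time

# Stands in for the datastore: maps doc id -> cached metadata.
_doc_cache = {}


def fetch_docs_metadata():
    """Placeholder for the gdata call that lists the documents; the
    real cron handler would hit the Google Docs API here."""
    return [
        {"id": "doc1", "title": "User guide", "url": "http://example/doc1"},
        {"id": "doc2", "title": "Changelog", "url": "http://example/doc2"},
    ]


def refresh_cache():
    """Cron handler body: pull titles/URLs from Docs and save them
    locally, so user-facing requests never depend on Docs being up."""
    for doc in fetch_docs_metadata():
        _doc_cache[doc["id"]] = {
            "title": doc["title"],
            "url": doc["url"],
            "fetched_at": time.time(),
        }


def downloads_page():
    """User-facing handler: build the response from the cache only."""
    return sorted((d["title"], d["url"]) for d in _doc_cache.values())
```

The worst case is serving data that is one cron interval stale, which for a downloads listing is usually fine.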

On 17 feb, 02:41, Calvin <calvin.r...@gmail.com> wrote:
> I think he means that importing and retrieving the converted document using
> the gdata api may not always be possible within the 30 second limit of a
> user-facing app engine request.
>
> If that's the case it would be a good idea to do the conversion using a task
> queue, which has a much higher limit.
>
> Another thing to keep in mind is that Google Docs has a size limit on word
> documents that it will import.

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com.