Perhaps the page is protected and requires a User-Agent header, etc.
In any case, check the Logs section in the Dashboard for errors.

On 17 Feb, 23:18, theone <maliha...@gmail.com> wrote:
> Actually, in my case using the Google Docs API is very complicated and
> tiring, because I just want to get the content in order to search within
> files. I won't make any modifications to the documents. I think that
> using an external service like https://secure.convert-doc.com/converters/doc-to-txt.html
> might be an appropriate solution.
>
> I tried using urlfetch:
>
> file = self.request.get("file")
> form_fields = {
>     "input_file": file,
>     "output_type": "txt",
>     "output_method": "1",
>     ".cgifields": "output_method"
> }
> form_data = urllib.urlencode(form_fields)
> result = urlfetch.fetch(url="https://secure.convert-doc.com/convert-file",
>                         payload=form_fields,
>                         method=urlfetch.POST,
>                         headers={'Content-Type': 'multipart/form-data'})
> print result.content
>
> but it gave an Internal Server Error. I don't know what is wrong with it.
> The service seems to be working normally in a browser, but I could not
> get it to work via urlfetch.
>
> On 17 February, 17:12, Ernesto Karim Oltra <ernestoka...@gmail.com>
> wrote:
>
> > Task queues, that's what I'm using now. Updating a Word document of
> > about ten pages takes roughly 1500 ms on average (at least for me
> > =) ). And this has a big advantage: you can take control of retries
> > if Docs is temporarily unavailable.
>
> > Another useful option would be to store a cache in the datastore with
> > some commonly used data. For example, I have a downloads section
> > containing some documents stored in Google Docs. Hourly (it could be
> > every five minutes or every three days, whatever you want) a cron job
> > gets the titles, the URL of each document, etc. and saves them in the
> > datastore. Then I don't have to deal with Docs when building responses
> > to a user; I only use the locally cached data.
>
> > On 17 feb, 02:41, Calvin <calvin.r...@gmail.com> wrote:
>
> > > I think he means that importing and retrieving the converted document
> > > using the gdata API may not always be possible within the 30-second
> > > limit of a user-facing App Engine request.
>
> > > If that's the case, it would be a good idea to do the conversion
> > > using a task queue, which has a much higher limit.
>
> > > Another thing to keep in mind is that Google Docs has a size limit
> > > on Word documents that it will import.

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com.
To unsubscribe from this group, send email to 
google-appengine+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en.
