For a question about performance improvements in the Google Spreadsheets
API, I think the best place to ask is the Google Docs Data APIs group:

http://groups.google.com/group/Google-Docs-Data-APIs

You might also want to file an issue in the bug and feature request list for
Google Data APIs:

http://code.google.com/p/gdata-issues/issues/list

Cheers,

Jeff

On Wed, Aug 12, 2009 at 10:06 PM, condor <takayuki.a...@gmail.com> wrote:

>
> Hi, Jeff.
>
> I hope that the execution time of the Spreadsheets API can become as
> short as when specifying a row_id, even when I specify the following
> parameters:
>
> > > >  row_query = gdata.spreadsheet.service.ListQuery()
> > > >  row_query.start_index = str(1)
> > > >  row_query.max_results = str(1)
>
> Please tell me whether you can resolve this issue.
>
>
> On Jul 30, 2:06 PM, condor <takayuki.a...@gmail.com> wrote:
> > Hi, Jeff.
> >
> > Thank you for your answer.
> >
> > >
> > > http://spreadsheets.google.com/feeds/list/{key}/{worksheetId}/private/full/{rowId}
> >
> > When I tried specifying a row_id, it succeeded within a few seconds.
> >
> > I also tried specifying the "sq" parameter to keep the feed size small,
> > and that succeeded too.
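> >
> > In case it helps, this is roughly what I tried (the column name "age"
> > and the variables client, key, and worksheet_id are placeholders for
> > my own values):
> >
> >  import gdata.spreadsheet.service
> >
> >  # Structured query: return only rows whose "age" column is over 25.
> >  row_query = gdata.spreadsheet.service.ListQuery()
> >  row_query.sq = 'age > 25'
> >  feed = client.GetListFeed(key, worksheet_id, query=row_query)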
> >
> > On Jul 30, 8:42 AM, "Jeff S (Google)" <j...@google.com> wrote:
> >
> > > Hi condor,
> >
> > > The maximum deadline for urlfetch is 10 seconds, and it might be
> > > possible that the Spreadsheets API is taking longer than that to
> > > reply even if the expected feed size is small. Have you tried
> > > fetching just a single row entry (instead of a feed query)?
> >
> > >
> > > http://spreadsheets.google.com/feeds/list/{key}/{worksheetId}/private/full/{rowId}
> >
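> > > With the Python client, that single-entry fetch might look roughly
> > > like this (client is an authenticated SpreadsheetsService, and key,
> > > worksheet_id, and row_id are placeholders for your own values):
> >
> > >  # Fetch one row entry directly instead of querying the whole feed.
> > >  entry = client.GetListFeed(key, wksht_id=worksheet_id, row_id=row_id)
> >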
> > > Please let me know how it goes. Receiving timeouts from these APIs is
> > > frustrating and I'd like to be able to work through this issue with the
> > > Spreadsheets API.
> >
> > > Thank you,
> >
> > > Jeff
> >
> > > On Tue, Jul 28, 2009 at 8:11 PM, condor <takayuki.a...@gmail.com> wrote:
> >
> > > > Hi,
> > > > I have the same situation too.
> >
> > > > When I called the Spreadsheets API with the GetListFeed method of
> > > > gdata.spreadsheet.service (Python client library 2.0.1), it raised
> > > > DownloadError: ApplicationError: 5.
> >
> > > >  File "/base/python_lib/versions/1/google/appengine/api/urlfetch.py", line 241, in fetch
> > > >    return rpc.get_result()
> > > >  File "/base/python_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 442, in get_result
> > > >    return self.__get_result_hook(self)
> > > >  File "/base/python_lib/versions/1/google/appengine/api/urlfetch.py", line 331, in _get_fetch_result
> > > >    raise DownloadError(str(err))
> > > > DownloadError: ApplicationError: 5
> >
> > > > I don't think the reason is a timeout, since the same error occurred
> > > > even after I set the deadline parameter.
> > > > (On the SDK, setting that parameter does solve the problem.)
> >
> > > >    return HttpResponse(urlfetch.Fetch(url=str(url), payload=data_str,
> > > >      method=method, headers=all_headers, follow_redirects=False,
> > > >      deadline=60))
> >
> > > > (The code above is from gdata.alt.appengine.py, line 145.)
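> >
> > > > In other words, it is equivalent to calling urlfetch directly with
> > > > the deadline keyword (url here is a placeholder for the request URL):
> >
> > > >  from google.appengine.api import urlfetch
> >
> > > >  # deadline raises the urlfetch timeout, in seconds.
> > > >  result = urlfetch.fetch(url, deadline=60)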
> >
> > > > The spreadsheet has many rows, but I am sure the download size is
> > > > not large, because I set the following parameters:
> >
> > > >  row_query = gdata.spreadsheet.service.ListQuery()
> > > >  row_query.start_index = str(1)
> > > >  row_query.max_results = str(1)
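> >
> > > > The query is then passed to GetListFeed, roughly like this (client,
> > > > key, and worksheet_id stand for my authenticated SpreadsheetsService
> > > > and the spreadsheet's identifiers):
> >
> > > >  # Request only the first row of the list feed.
> > > >  feed = client.GetListFeed(key, worksheet_id, query=row_query)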
> >
> > > > Please tell me the reason for this error on GAE.
> >
>
