es=[])
info.put()
return
I have no idea why this occurs, but it looks very similar to this
case.
http://www.mail-archive.com/google-appengine@googlegroups.com/msg09235.html
Does anyone know what is happening?
edvakf
This issue is now reported.
http://code.google.com/p/googleappengine/issues/detail?id=1975
edvakf
You received this message because you are subscribed to the Google Groups
"Google App Engine" group.
;).content
logging.debug('number: ' + str(i) + ', text length: ' + str(len(raw_data)))
==
On the dashboard it looks like this.
http://img.f.hatena.ne.jp/images/fotolife/e/edvakf/20090
Each time, the "Bandwidth" counter increases by 0.01 (of 1.00 GBytes), which
is around 10 MB.
http://img.f.hatena.ne.jp/images/fotolife/e/edvakf/20090814/20090814002414.png
So I think it's counting 1MB*10 times (not 180KB*10 times).
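The arithmetic behind that guess, written out as a quick sanity check (a sketch; 1 GByte = 1024 MB assumed):

```python
# Rough check of the quota arithmetic (assumption: 1 GByte = 1024 MB):
tick_mb = 0.01 * 1.00 * 1024   # one dashboard tick of 0.01 GByte: ~10.24 MB
per_fetch_mb = tick_mb / 10    # the tick covers ten fetches: ~1 MB each
actual_mb = 180 / 1024.0       # the file is really ~180 KB: ~0.18 MB

# per_fetch_mb is far larger than actual_mb, consistent with each
# fetch being billed as roughly 1 MB rather than 180 KB.
print(per_fetch_mb, actual_mb)
```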
If you could tell me where I can file this as a bug, it'd be
appreciated.
edvakf
Thank you, Nick.
> the content-encoding header should be omitted
That's what I ended up doing in the end.
I have another question. If the .gz file was fetched with
Content-Encoding: gzip, is the URL Fetch quota counted by the gzipped
file size, or by the expanded (original) file size?
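For what it's worth, the difference can be large; a quick illustration with the standard gzip module (the payload here is hypothetical, not the actual file):

```python
import gzip

# Hypothetical payload: repetitive text, which compresses very well.
expanded = b"some repetitive log line\n" * 8000   # ~195 KB expanded
compressed = gzip.compress(expanded)

# If the quota were counted on the expanded size, each fetch would cost
# len(expanded) bytes; if counted on the wire size, only len(compressed).
print(len(expanded), len(compressed))
```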
I found the part of the code where the gzip file is magically unzipped:
/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/api/urlfetch_stub.py
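I haven't traced that stub line by line, but the transparent expansion presumably amounts to something like this sketch (maybe_gunzip is my name, not the SDK's):

```python
import gzip
import io

def maybe_gunzip(headers, body):
    """Sketch of what a fetch stub could do: if the response carries
    Content-Encoding: gzip, hand back the expanded bytes instead of
    the raw compressed body."""
    if headers.get("Content-Encoding", "").lower() == "gzip":
        return gzip.GzipFile(fileobj=io.BytesIO(body)).read()
    return body
```

This would explain the symptom: the application never sees the compressed bytes, only the already-expanded ones.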
=
I'm trying to load a remote gzip file into the Python environment, but
the downloaded file is already unzipped. Hence it gets chopped (or I get
a ResponseTooLargeError).
Here is my code.
from google.appengine.ext impo
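To show the shape of the problem, here is a self-contained sketch of fetching a .gz payload and expanding it by hand (load_remote_gz, fetch_fn, and fake_fetch are my names; on App Engine fetch_fn would be urlfetch.fetch):

```python
import gzip
import io

def load_remote_gz(fetch_fn, url):
    """Fetch a .gz file and expand it in application code.

    fetch_fn stands in for urlfetch.fetch and must return the raw,
    still-compressed bytes; if the server sends Content-Encoding: gzip,
    the SDK may expand the body before this code ever sees it.
    """
    raw = fetch_fn(url)
    return gzip.GzipFile(fileobj=io.BytesIO(raw)).read()

# Usage with a fake fetcher that returns canned gzip bytes:
def fake_fetch(url):
    return gzip.compress(b"payload from " + url.encode())

print(load_remote_gz(fake_fetch, "http://example.com/data.gz"))
```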