While working on a migration tool from Blobstore to Google Cloud Storage, I 
was surprised by a series of consistent, frequent InternalErrors while 
fetching from Blobstore. The errors happen after a few reads from 
particular byte ranges, but I don't know whether they are genuine internal 
failures or a limit imposed internally to keep an app from consuming 
resources too fast. I was unable to find anything in the documentation 
describing any limits in that regard.

The code fetching blobs is the following. It fails repeatedly at some 
particular byte ranges across task retries; eventually it works, but 
sometimes it doesn't:

from google.appengine.ext import blobstore

while True:
  # fetch_data reads the inclusive byte range [start, start + length - 1]
  data = blobstore.fetch_data(blob_key, start, start + length - 1)
  if not data:
    break  # past the end of the blob
  fetch_count += 1
  start += length
  write_to_gcs(data, gcs_handle)
  if fetch_count >= 16:  # magic number, keeps the task under its deadline
    reschedule_this_task(gcs_handle, start)
    return
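
For what it's worth, a plain retry-with-backoff around the fetch would be 
the obvious mitigation, but the failures on those ranges are consistent 
enough that I doubt backoff alone is the answer. A minimal sketch of what I 
mean (hypothetical helper; the attempt count and backoff values are 
arbitrary):

import time
from google.appengine.ext import blobstore

def fetch_with_retries(blob_key, start, end, attempts=5):
  # Hypothetical sketch: retry transient InternalErrors with exponential
  # backoff, re-raising once the attempts are exhausted.
  for attempt in range(attempts):
    try:
      return blobstore.fetch_data(blob_key, start, end)
    except blobstore.InternalError:
      if attempt == attempts - 1:
        raise
      time.sleep(2 ** attempt)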

Length here is about 512 * 1024, to stay below the maximum single-fetch 
size Blobstore allows. The strange thing is: if I download the same blob 
and re-upload it to another appid, the code works without any issues.
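
For reference, I believe the per-fetch cap is exposed as 
blobstore.MAX_BLOB_FETCH_SIZE (a little over 1,000,000 bytes), so the 
chunk size should be comfortably within it:

from google.appengine.ext import blobstore

length = 512 * 1024
# Sanity check against the documented per-fetch cap; fetch_data raises
# BlobFetchSizeTooLargeError if a single fetch exceeds it.
assert length <= blobstore.MAX_BLOB_FETCH_SIZE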

Has anyone had similar errors in the past? Could this be a problem with the 
Blobstore of the production app?

Kind regards,
