I successfully downloaded the local datastore into 3 separate CSV
files: Item.csv (~300kb), Type.csv (30kb) and Picture.csv (~300mb).
Uploading the two small files was also successful.
However, the last file triggered the following error:

$ bulkloader.py --restore --kind=Picture --filename=Picture.csv \
    --url=http://<app>.appspot.com/remote_api <app>/
[INFO    ] Logging to bulkloader-log-20090904.120446
[INFO    ] Throttling transfers:
[INFO    ] Bandwidth: 250000 bytes/second
[INFO    ] HTTP connections: 8/second
[INFO    ] Entities inserted/fetched/modified: 20/second
[INFO    ] Opening database: bulkloader-progress-20090904.120446.sql3
[INFO    ] Connecting to <app>.appspot.com/remote_api
Please enter login credentials for <app>.appspot.com
Email: folke...@gmail.com
Password for folke...@gmail.com:
[INFO    ] Starting import; maximum 10 entities per post
[ERROR   ] [Thread-2] WorkerThread:
Traceback (most recent call last):
  File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/
GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/
google/appengine/tools/adaptive_thread_pool.py", line 150, in
WorkOnItems
    status, instruction = item.PerformWork(self.__thread_pool)
  File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/
GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/
google/appengine/tools/bulkloader.py", line 675, in PerformWork
    transfer_time = self._TransferItem(thread_pool)
  File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/
GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/
google/appengine/tools/bulkloader.py", line 832, in _TransferItem
    self.request_manager.PostEntities(self.content)
  File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/
GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/
google/appengine/tools/bulkloader.py", line 1239, in PostEntities
    datastore.Put(entities)
  File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/
GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/
google/appengine/api/datastore.py", line 169, in Put
    apiproxy_stub_map.MakeSyncCall('datastore_v3', 'Put', req, resp)
  File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/
GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/
google/appengine/api/apiproxy_stub_map.py", line 72, in MakeSyncCall
    apiproxy.MakeSyncCall(service, call, request, response)
  File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/
GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/
google/appengine/api/apiproxy_stub_map.py", line 255, in MakeSyncCall
    rpc.CheckSuccess()
  File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/
GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/
google/appengine/api/apiproxy_rpc.py", line 111, in CheckSuccess
    raise self.exception
RequestTooLargeError: The request to API call datastore_v3.Put() was
too large.
[INFO    ] Backing off due to errors: 1.0 seconds
[INFO    ] Unexpected thread death: Thread-2
[INFO    ] An error occurred. Shutting down...
[ERROR   ] Error in Thread-2: The request to API call datastore_v3.Put()
was too large.

[INFO    ] 1020 entities total, 0 previously transferred
[INFO    ] 0 entities (0 bytes) transferred in 30.0 seconds
[INFO    ] Some entities not successfully transferred
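
If I'm reading the limits correctly, the arithmetic explains the failure: the restore posts up to 10 entities per request (per the log above), and each Picture entity carries a ~100-300kb image, so one batch can exceed the cap on a single datastore_v3 API call. A back-of-envelope check (the ~1mb request limit is my assumption from the docs):

```python
# Back-of-envelope check of the failed Put() batch.
entities_per_post = 10          # from the log: "maximum 10 entities per post"
max_image_bytes = 300 * 1024    # upper bound on one stored image (~300kb)
api_limit_bytes = 1024 * 1024   # assumed ~1mb cap on a datastore_v3 request

worst_case_post = entities_per_post * max_image_bytes
print(worst_case_post, api_limit_bytes, worst_case_post > api_limit_bytes)
# → 3072000 1048576 True
```

If that's right, passing --batch_size=1 to bulkloader.py should keep each post under the limit (assuming the flag also applies to --restore).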


Each row stores an image whose size is ~100-300kb.
Should I try using Loader classes?
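
In case it helps, here is my untested reading of what that would look like: a Loader subclass declared in a config module passed via --config_file. The 'Picture' kind and 'image' property are from my model; the CSV column order and base64 encoding are my guesses:

```python
# Untested sketch of a bulkloader config module (e.g. picture_loader.py),
# used with: bulkloader.py --config_file=picture_loader.py ...
# Assumes each CSV row is (name, image_base64); names/encoding are guesses.
import base64

from google.appengine.ext import db
from google.appengine.tools import bulkloader


class PictureLoader(bulkloader.Loader):
    def __init__(self):
        # Loader takes the kind name and a list of
        # (property_name, converter) tuples, one per CSV column.
        bulkloader.Loader.__init__(
            self, 'Picture',
            [('name', str),
             ('image', lambda v: db.Blob(base64.b64decode(v)))])

# bulkloader.py looks for this module-level list of loaders.
loaders = [PictureLoader()]
```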

Thanks for your help

On Sep 4, 10:19 am, FolkenDA <folke...@gmail.com> wrote:
> I will try that, thank you !
>
> On Sep 4, 4:22 am, Matthew Blain <matthew.bl...@google.com> wrote:
>
> > One of the new features of today's release is a --dump and --restore
> > flag in the bulkloader, allowing configurationless download and
> > restore of a dataset. It should work for exactly this scenario: you
> > have data in one instance (say on dev_appserver) and want to upload it
> > to another (say on appspot.com).
>
> > You can find more information at
> > http://code.google.com/appengine/docs/python/tools/uploadingdata.html
>
> > I'm not sure about the datastore_v3 error, but it may also be
> > something we fixed.
>
> > --Matthew
>
> > On Sep 3, 11:35 am, FolkenDA <folke...@gmail.com> wrote:
>
> > > Hello everyone,
>
> > > I'm new to GAE and I am currently trying to move a website of mine to
> > > this platform.
>
> > > Re-writing the PHP code in Python was really easy. However, I'm now
> > > trying to move data from a MySQL database into the GAE datastore.
>
> > > I was previously storing images in a directory on the server file
> > > system, so I successfully wrote a script that fetches them and stores
> > > them in the local datastore.
> > > The script works on the dev server because it has no request timeout,
> > > but the GAE servers do.
>
> > > I tried to download the local datastore into CSV files that could be
> > > uploaded using the bulkloader method but got the following error:
>
> > > AssertionError: No api proxy found for service "datastore_v3"
>
> > > I guess the dev server does not emulate this functionality.
> > > Is there some way I could upload the local datastore to the
> > > production one?
>
> > > Thank you,
>
> > > Folken
>
>
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.