[google-appengine] Re: Working with a proxy for data storage and URL fetch
On Wed, Dec 3, 2008 at 2:30 PM, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:

> It shouldn't be a problem; replication will just take longer. The
> replication is, of course, broken down into many small requests in order
> to avoid App Engine request timeouts and running over the high-CPU quota.
> As far as I know, there currently isn't any faster way to load data into
> the App Engine datastore.

Yes, I did a little analysis and gave AppRocket a try. You are correct. Anyway, it won't support storing huge amounts of data in the datastore, right?

You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appengine@googlegroups.com. To unsubscribe from this group, send email to [EMAIL PROTECTED]. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en
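The "many small requests" approach described above can be sketched as a simple batching helper. This is a minimal illustration, not AppRocket's actual code; the batch size of 10 matches the figure mentioned later in this thread, and the upload step is left abstract:

```python
def batches(rows, batch_size=10):
    """Yield successive fixed-size batches of rows so that each
    upload request stays small enough to finish within App Engine's
    request deadline."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        # Final partial batch, if the row count is not a multiple
        # of batch_size.
        yield batch


# Hypothetical usage: each batch would be sent as one request,
# e.g. one datastore put of at most batch_size entities.
for batch in batches(range(25), batch_size=10):
    pass  # upload(batch) -- placeholder for the actual request
```

Keeping each request small trades total wall-clock time for reliability, which matches the observation above that replication "will just take longer."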
[google-appengine] Re: Working with a proxy for data storage and URL fetch
On Tue, Dec 2, 2008 at 2:52 PM, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:

> Have you checked out http://code.google.com/p/approcket/ ? It allows you
> to set up replication from MySQL to App Engine, which I guess is what you
> are trying to achieve.

Anyway, it won't support storing huge amounts of data in the datastore, right?
[google-appengine] Re: Local datastore import is too slow
I am also having the same problem. Could you please let me know if you found an answer?

On Nov 20, 5:35 am, Jyoti Shete-Javadekar [EMAIL PROTECTED] wrote:

> Hi,
>
> I am trying to load my development datastore using the bulk loader
> script, but the import is very slow. I had to kill the import process
> since it had not completed even after 12 hours.
>
> I have about 13K rows in the CSV file. One data model entity is about
> 300 bytes, and 10 entities are imported at a time. The model has two
> unicode attributes, two unicode list attributes, one URL attribute, and
> one long attribute. I use unicode.split to populate the list attributes.
>
> I am running the bulk loader in a virtual machine with 512MB of memory;
> during the import, about 91% of memory is utilized. I have not specified
> any custom index in index.yaml.
>
> Could someone please tell me why the import is so slow? What
> optimizations should I make to improve performance? I have not yet
> loaded the data at appspot; the observations above are for my local
> development server. Are there any performance numbers available for the
> bulk loader script?
>
> Thanks in advance,
> Jyoti
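For reference, the row-conversion step the post describes (CSV rows with two list attributes populated via unicode.split) might look roughly like the sketch below. The column names and order here are hypothetical, since the original post does not give the model's field names:

```python
import csv
import io


def parse_rows(csv_text):
    """Convert CSV rows into dicts ready to become datastore entities.

    List-valued columns are split on whitespace, mirroring the
    unicode.split approach described in the post. All column names
    ("title", "tags", "keywords", "url", "count") are made up for
    this example.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    for row in reader:
        yield {
            "title": row["title"],            # unicode attribute
            "tags": row["tags"].split(),      # unicode list attribute
            "keywords": row["keywords"].split(),
            "url": row["url"],                # link attribute
            "count": int(row["count"]),       # long attribute
        }
```

The conversion itself is cheap; with ~13K rows of ~300 bytes each, a multi-hour import points at per-batch overhead in the development datastore rather than at the parsing step.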