Well, the Bulk Data Uploader definitely seems very useful. What if I
would like to update the records every now and then?
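
Concretely, if I give each record a stable key_name taken from my
external data, could a periodic update be as simple as re-putting the
entities in batches? A rough sketch of what I have in mind (the Record
model and its fields are made up):

    from google.appengine.ext import db

    class Record(db.Model):            # placeholder kind and properties
        title = db.StringProperty()
        updated = db.DateTimeProperty(auto_now=True)

    def upsert_rows(rows, batch_size=50):
        # rows: list of dicts parsed from the uploaded CSV/JSON.
        # Reusing the external id as key_name makes put() overwrite the
        # existing entity instead of creating a duplicate, so the same
        # code handles both inserts and updates.
        batch = []
        for row in rows:
            batch.append(Record(key_name=row['id'], title=row['title']))
            if len(batch) >= batch_size:
                db.put(batch)          # one datastore call per batch
                batch = []
        if batch:
            db.put(batch)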

Thanks, chr

On Jan 11, 3:31 pm, Greg Temchenko <soid....@gmail.com> wrote:
> I mean the GAE data uploader. You can read about it here:
> http://code.google.com/appengine/articles/bulkload.html
>
> It splits a CSV file into requests of 10 lines each and inserts them
> step by step.
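
For anyone else reading along, the Loader from that article looks
roughly like this, if I remember it correctly ('Album' and its columns
are just an example):

    from google.appengine.ext import bulkload

    class AlbumLoader(bulkload.Loader):
        def __init__(self):
            # Kind name plus one (property name, converter) pair per
            # CSV column, in column order.
            bulkload.Loader.__init__(self, 'Album',
                                     [('title', str),
                                      ('artist', str),
                                      ('year', int)])

    # The handler script just hands the loader to bulkload.main();
    # the bulkloader client then posts the CSV to it in small chunks.
    if __name__ == '__main__':
        bulkload.main(AlbumLoader())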
>
> On Jan 11, 3:45 pm, gabon <nuthink...@googlemail.com> wrote:
>
> > I started optimizing everywhere. Now I have a textarea input where I
> > paste a JSON string (I presume it is faster to parse than XML). I
> > still have problems.
>
> > Is splitting the operation into more manual steps really the
> > solution? Is it not possible to have an automated, longer-running
> > process that, for instance, gives the CPU a break with time.sleep()?
>
> > How could I automatically split the update of thousands of entities?
> > I am thinking of redirecting the page to a new URL, passing along the
> > data still to be updated and processing some of it each time. It
> > sounds pretty crazy, but if the limit is the time available to
> > generate a page, I don't see any other solution.
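
That redirect trick might actually be workable. Roughly, with the
webapp framework and key-based paging (the handler path, the Item
model and the batch size are all placeholders):

    from google.appengine.ext import db, webapp

    class Item(db.Model):              # placeholder kind
        processed = db.BooleanProperty(default=False)

    BATCH_SIZE = 50                    # keep each request short

    class UpdateBatchHandler(webapp.RequestHandler):
        # Processes BATCH_SIZE entities per request, then redirects to
        # itself with the last key as a bookmark until nothing is left.
        def get(self):
            query = Item.all().order('__key__')
            bookmark = self.request.get('bookmark')
            if bookmark:
                query.filter('__key__ >', db.Key(bookmark))

            items = query.fetch(BATCH_SIZE)
            if not items:
                self.response.out.write('done')
                return

            for item in items:
                item.processed = True  # whatever the real update is
            db.put(items)

            # Hand the rest of the work to a fresh request.
            self.redirect('/update_batch?bookmark=%s' % items[-1].key())

A browser will give up after a couple of dozen redirects, so for
thousands of entities you would probably drive it from a small script
or keep reopening the URL by hand.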
>
> > By bulkloader, do you mean the ActionScript 3 library?
>
> > Thanks, chr
>
> > On Jan 11, 12:18 pm, Greg Temchenko <soid....@gmail.com> wrote:
>
> > > I guess you have to split your XML file and work through it in
> > > steps. Have you seen how the bulkloader works?
>
> > > On Jan 11, 2:04 pm, gabon <nuthink...@googlemail.com> wrote:
>
> > > > I would like to use GAE with data created and maintained in a
> > > > different application on a different server. My solution was to
> > > > generate an XML file with all the data and parse it to
> > > > create/update the related GAE entities.
> > > > Clearly this is not a CPU-friendly solution (especially
> > > > considering that fetch operations count as CPU operations!) and I
> > > > get a nice "Dude, this is whack!" message along with errors and a
> > > > CPU quota warning in the logs. I will try to copy the XML MANUALLY
> > > > into an input field to get rid of the errors (that procedure looks
> > > > whack to me!), but since there are thousands of entities to
> > > > update, I have a feeling that won't be enough.
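
If the XML route stays, the main saving is probably batching the puts
so thousands of entities don't turn into thousands of datastore calls.
A sketch, assuming a flat <item id="..."><name>...</name></item>
structure (the kind and property are again placeholders):

    import xml.etree.cElementTree as ET
    from google.appengine.ext import db

    class Entry(db.Model):             # placeholder kind
        name = db.StringProperty()

    def load_xml(xml_text, batch_size=50):
        # Walk the exported XML and write entities in batches, so each
        # db.put() covers many items instead of one call per item.
        batch = []
        for elem in ET.fromstring(xml_text).findall('item'):
            batch.append(Entry(key_name=elem.get('id'),
                               name=elem.findtext('name')))
            if len(batch) >= batch_size:
                db.put(batch)
                batch = []
        if batch:
            db.put(batch)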
>
> > > > Is there any other recommended way to work with large amounts of
> > > > data without creating CPU issues? Maybe using time.sleep() to give
> > > > the CPU a break? CPU errors seem to be pretty common; Google
> > > > should probably give more information about how to avoid them.
>
> > > > Thanks, chr