Hmm, I have been thinking more about this. Whenever I run a process, I
see some logs being created. What information is contained in them?
Does the bulkloader do anything to optimize its next round of
downloading the data? I see from the docs above, and something similar
in my logs:
[INFO] Have 6
Hello Barry
Well, I have nothing fancy. I just use
http://code.google.com/appengine/docs/python/tools/uploadingdata.html#Downloading_and_Uploading_All_Data
and an OS-level cron job to fetch the data every day.
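For reference, that daily full download could be sketched as a crontab entry invoking the bulkloader command from the linked docs. The app id, remote_api URL, schedule, and backup path below are all placeholders, not values from this thread:

```shell
# Hypothetical crontab entry: full datastore download every day at 03:00.
# --application, --url, and --filename are placeholders to adapt;
# % is escaped as \% because % is special in crontab lines.
0 3 * * * appcfg.py download_data \
    --application=myapp \
    --url=http://myapp.appspot.com/remote_api \
    --filename=/backups/myapp-$(date +\%F).dump
```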
So I can obviously start from something like: where ID > something, or
entity.when > something, and kee
It should certainly be possible, but whether you can is something we can't answer.
To do incremental backups, you need to keep some sort of marker on data
you have already downloaded, so you know not to back it up again in the
next incremental backup. **How** exactly you do that depends heavily on
your data, and what met
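The marker idea above can be sketched in plain Python. The `when` timestamp field is a hypothetical property name taken from the earlier message; on App Engine the filter would be a datastore query such as `Entity.all().filter('when >', last_marker)` rather than this in-memory scan:

```python
from datetime import datetime

def incremental_slice(entities, last_marker):
    """Return entities newer than last_marker, plus the new marker.

    `entities` is any iterable of dicts with a 'when' datetime key
    (a stand-in for a datastore timestamp property). Persist the
    returned marker between runs so each backup only picks up new data.
    """
    # Keep only records created after the last backup, oldest first.
    fresh = sorted((e for e in entities if e['when'] > last_marker),
                   key=lambda e: e['when'])
    # Advance the marker to the newest record we just backed up.
    new_marker = fresh[-1]['when'] if fresh else last_marker
    return fresh, new_marker
```

Running it twice with the same data shows the point: the second call returns nothing, because the marker already covers everything.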
So currently, I run a cron job every day and take a complete backup of the data.
Is there any way we can do incremental backups?
--
Ritesh
--
You received this message because you are subscribed to the Google Groups
"Google App Engine" group.
To post to this group, send email to google-appengine@