I would reply to one of my other posts, except that they're still in
moderator limbo. So I apologize for creating a new thread.

To briefly summarize: my project has lots of very small objects. In
particular, I've uploaded about 1.45 million objects, each with a Long id
and a String name. On average, the strings are about 12 characters long. I
plan(ned) on adding about 84 million more objects consisting of 4 Longs
each (including the id). (And this is just for a subset of my data, which
is probably on the order of 2x to 3x larger in total.) I also estimate
adding about 200k new records every day.
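
For concreteness, the two kinds look roughly like this (a minimal sketch
against the low-level datastore API; the kind and property names are
placeholders, not my real schema):

import com.google.appengine.api.datastore.Entity;

public class SchemaSketch {
    // Kind 1: ~1.45 million entities, each a Long id plus a short String name.
    static Entity nameRecord(long id, String name) {
        Entity e = new Entity("NameRecord", id); // Long id used as the key
        e.setProperty("name", name);             // ~12-character String on average
        return e;
    }

    // Kind 2: ~84 million entities, each 4 Longs (one of them being the id).
    static Entity factRecord(long id, long a, long b, long c) {
        Entity e = new Entity("FactRecord", id);
        e.setProperty("a", a);
        e.setProperty("b", b);
        e.setProperty("c", c);
        return e;
    }
}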

So I'm having several issues. One is that inserting 500 items takes
about 30 seconds, which turns out to be a lot of API CPU time. The
second is that after loading the 1.45 million objects, the datastore
usage has ballooned to 500 megabytes. The raw data is only on the order
of 30 megabytes; that's about the size I can store it in myself, and the
size the datastore statistics say it takes up. The datastore statistics
also say that about 108 MB exists as metadata. I imagine this is the
built-in index on the key, and that it's a reasonable value.
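
For reference, the upload loop is roughly the following (a sketch of what
I'm doing, assuming batched puts of ~500 entities per request through the
low-level API; the real code differs in the details):

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import java.util.List;

public class UploadSketch {
    // Write one batch of ~500 entities in a single datastore call.
    // This is the step that takes ~30 seconds per batch and racks up API CPU time.
    static void putBatch(List<Entity> batch) {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        ds.put(batch); // batch put; each entity still pays its own index writes
    }
}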

However, the main page of my app says I'm using about 500 megabytes,
while the datastore statistics only account for about 140 MB. Is this
normal?

Should I expect the rest of my data to balloon by about the same factor?
(i.e., 2.5 gigabytes of raw data requiring 65 gigabytes, pre-indexes)

If so... Well that really sucks.
