Hi Michael,
You have to run some tests with your data to figure it out.
I have small entities and use transactions (no entity groups); to save,
I make heavy use of task queues and tune how often they run per minute
to stay inside the quota.
With http://code.google.com/intl/nl/appengine/docs/python/tools/appstats.htm
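The task-queue approach described above can be sketched in plain Python. The helper below is hypothetical (not part of the App Engine SDK): it spaces batches out over time so the aggregate write rate stays under a chosen budget, and the actual `taskqueue.add` call is only indicated in a comment.

```python
# Hypothetical helper: spread batches over time so the total datastore
# write throughput stays under `writes_per_sec`. The numbers here are
# illustrative, not App Engine limits.

def batch_countdowns(num_items, batch_size, writes_per_sec):
    """Return a countdown (whole seconds) for each batch so that tasks
    enqueued with these countdowns never exceed the write budget."""
    num_batches = -(-num_items // batch_size)          # ceiling division
    seconds_per_batch = batch_size / float(writes_per_sec)
    return [int(i * seconds_per_batch) for i in range(num_batches)]

# In a real App Engine app, each countdown would then be used roughly as:
#   taskqueue.add(url='/worker', countdown=cd, params=...)
offsets = batch_countdowns(1000, 100, 50)   # 10 batches, one every 2 s
```

Tuning `writes_per_sec` down is the same knob as "how often they run per minute" above.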
Hi Michael,
You'll be able to insert at a much higher rate than 4 / second.
Where it sounds like you could have some difficulties is at
query time. How often do you need to perform fetches of that size?
Will it be a fetch by key, or will you actually have to query
something? There might be
These were very small entities. Just a key only.
On Wed, Jan 26, 2011 at 3:50 PM, Stephen Johnson wrote:
> I posted this in a different thread for a different reason but here are
> some numbers on batch putting small entities. Times will vary depending on
> size of entity, number of properties and
I posted this in a different thread for a different reason but here are some
numbers on batch putting small entities. Times will vary depending on size
of entity, number of properties and indexes on the properties and composite
indexes. The first number is the latency or the real world time it took
No, the 4-writes-per-second rule of thumb applies per entity group. If
the items are all in separate entity groups, then it doesn't apply.
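A back-of-envelope calculation shows why the entity-group distinction matters for 1000 items. The batch size of 500 below is an assumption about the datastore batch-put limit; the 4 writes/sec figure is the rule of thumb from this thread, not a hard limit.

```python
# Back-of-envelope: writing 1000 small entities, one entity group vs.
# independent root entities. All figures are illustrative.

ITEMS = 1000
ONE_GROUP_WRITES_PER_SEC = 4      # rule-of-thumb serial rate per group

# All items in a single entity group: writes serialize behind the group.
serial_seconds = ITEMS / ONE_GROUP_WRITES_PER_SEC     # ~250 seconds

# Separate entity groups: batch puts (assumed max 500 entities per RPC)
# are limited by RPC latency, not by the per-group rate.
BATCH = 500
batch_rpcs = -(-ITEMS // BATCH)   # 2 batch-put RPCs, a few seconds total
```

The per-group limit only bites when the 1000 items share one ancestor.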
On Wed, Jan 26, 2011 at 3:25 PM, Michael McClain <
michael.c.mccl...@gmail.com> wrote:
> So I can only insert 4 items per second in the database?
So I can only insert 4 items per second in the database?
I need to insert 1000 items simultaneously for each new document I'm
inserting in the system...
Is there a way to do this?
Thank you,
Michael
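A minimal sketch of how the 1000 items per document could be split into batch puts, as the replies above suggest. The `chunks` helper is hypothetical, the 500-entity ceiling is an assumption about the batch-put limit, and the `db.put` call is only indicated in a comment.

```python
# Minimal sketch: split a list of entities into slices small enough for
# a single batch put (500 per RPC is an assumption, not a quoted limit).

def chunks(entities, size=500):
    """Yield successive slices of at most `size` entities."""
    for start in range(0, len(entities), size):
        yield entities[start:start + size]

# In an App Engine handler this might become:
#   for batch in chunks(new_entities):
#       db.put(batch)        # one batch-put RPC per slice
parts = list(chunks(list(range(1000))))   # 2 slices of 500
```

If the entities have no shared parent, each slice writes to independent entity groups.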
On Jan 26, 5:36 pm, Wim den Ouden wrote:
> Hi Michael,
> You need to spread the load as much as