I don't know your intended use of these IDs, so my thoughts here are limited
to assumed use; feel free to ignore anything that's off base for your use
case.

 

If, when you query for the IDs, you are looking for *all* of them, then just
serialize them into one field, retrieve them as one record, and
deserialize them in a way that doesn't require them all to fit into memory
at the same time (a tokenized CSV list is the most straightforward example,
but more compact serializations are possible).
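To make the CSV idea concrete, here's a minimal sketch in Python (the helper names are mine, not an App Engine API): serialize to one string field, then walk the string token by token on the way back so the parsed values never have to exist as one big list.

```python
def serialize_ids(ids):
    """Join integer IDs into one CSV string for storage in a single field."""
    return ",".join(str(i) for i in ids)


def iter_ids(csv_blob):
    """Yield IDs one at a time by scanning the stored string, so the
    parsed values never all sit in memory as a list at once."""
    start = 0
    while start < len(csv_blob):
        end = csv_blob.find(",", start)
        if end == -1:
            end = len(csv_blob)
        yield int(csv_blob[start:end])
        start = end + 1
```

The storage blob itself still loads as one string, but the per-ID parsed objects are produced lazily, which is usually the part that hurts.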

 

If you need to query for some subset of these IDs, then storing them
individually in the datastore is, I suspect, the way to go. You can batch
many inserts/updates. You'll have a large table, but that isn't likely to
be a problem with this datastore; do test it, though. If lookup times
degrade with size, you could consider partitioning your users into groups
(a simple example: one group for user IDs ending in even digits, another
for those ending in odd digits). This can reduce the size of indexes and
improve performance on some systems (I don't have personal experience to
say whether that's necessary in this system, but it's a thought to consider).
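As a sketch of the batching and even/odd partitioning ideas (the function names and the 500-record chunk size are my assumptions for illustration, not anything App Engine mandates; check the current batch-put limit yourself):

```python
def kind_for(item_id):
    """Route an ID to one of two entity kinds based on its parity,
    roughly halving the size of each kind's indexes."""
    return "UserIdsEven" if item_id % 2 == 0 else "UserIdsOdd"


def batches(items, size=500):
    """Split a big list of records into chunks for batch puts; 500 is a
    guess at a safe per-call batch size, not a documented limit here."""
    for start in range(0, len(items), size):
        yield items[start:start + size]
```

You'd then issue one batch put per chunk, targeting the kind that `kind_for` picks for each record.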

 

Again, I offer this only as food for thought. If you describe your intended
access patterns, it will probably help guide the discussion. Good luck.

 

 

From: google-appengine@googlegroups.com
[mailto:google-appengine@googlegroups.com] On Behalf Of nischalshetty
Sent: Tuesday, April 19, 2011 1:15 PM
To: google-appengine@googlegroups.com
Subject: [google-appengine] Appropriate way to save hundreds of thousands of
ids per user

 

Every user in my app would have thousands of IDs corresponding to them. I
would need to look up these IDs often.

Two things I could think of:

1. Put them into Lists - (drawback is that lists have a maximum capacity of
5000 (I hope I'm right here) and I have users who would need to save more
than 150,000 IDs)
2. Insert each ID as a unique record in the datastore (too much data? as it
would be users * IDs of all users). Can I batch put 5000 records at a time?
Can I batch get at least 100-500 records at a time?

Is there any other way to do this? I hope my question is clear. Your
suggestions are greatly appreciated.

-- 
You received this message because you are subscribed to the Google Groups
"Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com.
To unsubscribe from this group, send email to
google-appengine+unsubscr...@googlegroups.com.
For more options, visit this group at
http://groups.google.com/group/google-appengine?hl=en.

