Tzakie,
As Dan pointed out, GqlQuery isn't actually fetching the data when
constructed. Try this instead:
CategoryRows = db.GqlQuery(QueryString)
results = CategoryRows.fetch(limit)
where limit is the max number of rows to fetch. I believe this will
make a single trip to the datastore rather than once every 20 objects
(which is done to make using intrinsic iteration of a GqlQuery
performant). I'd be curious to see what, if any, gain this yields
with your dataset.
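To make the claim above concrete, here is a pure-Python simulation (not the App Engine SDK) of the difference: iterating a query pulls results in batches of 20, while fetch(limit) retrieves everything in one round trip. The FakeQuery class and trip counts are illustrative stand-ins, not real datastore behavior measurements.

```python
class FakeQuery:
    """Stand-in for a GqlQuery that counts simulated datastore round trips."""
    BATCH = 20

    def __init__(self, rows):
        self.rows = rows
        self.trips = 0  # simulated RPCs to the datastore

    def fetch(self, limit):
        self.trips += 1           # one RPC, regardless of limit
        return self.rows[:limit]

    def __iter__(self):
        # Intrinsic iteration pulls one batch of 20 per RPC.
        for start in range(0, len(self.rows), self.BATCH):
            self.trips += 1
            for row in self.rows[start:start + self.BATCH]:
                yield row

q = FakeQuery(list(range(100)))
list(q)                # intrinsic iteration: 5 simulated trips
iter_trips = q.trips

q2 = FakeQuery(list(range(100)))
results = q2.fetch(100)  # explicit fetch: 1 simulated trip
```

With 100 rows, iteration costs five simulated round trips against one for fetch(100), which is why the explicit fetch was suggested.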
Takes exactly the same amount of time. I read through
For the record, I regularly pull 100 entities, that are much smaller
than yours, for my application in order to page through them.
Basically to meet a paging requirement I pull 100, cache that result,
then page within it. I do think the size of your entities is part of
the problem
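The fetch-100-then-page approach described above can be sketched as follows. This is a minimal illustration: on App Engine the cache would typically be memcache, and a plain dict stands in for it here; the page size and helper names are assumptions, not code from the thread.

```python
PAGE_SIZE = 20
_cache = {}  # stand-in for memcache

def get_page(user_id, page_num, fetch_batch):
    """Return one page, fetching the 100-row batch only on a cache miss."""
    if user_id not in _cache:
        _cache[user_id] = fetch_batch(user_id)  # the single datastore trip
    rows = _cache[user_id]
    start = page_num * PAGE_SIZE
    return rows[start:start + PAGE_SIZE]

# Stand-in for a fetch of up to 100 entities:
batch = lambda uid: ["row-%d" % i for i in range(100)]
first_page = get_page(432, 0, batch)
second_page = get_page(432, 1, batch)
```

Every page after the first is served from the cached batch, so the datastore is hit once per user rather than once per page.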
Though, I
Tzakie,
A few things:
1. What is the shape of your data? i.e. how big are your entities and
what do they contain?
2. How are you using entity groups? Is all your data in a single
entity group (shared ancestor)?
3. Also try to experiment with permutations of your index and query to
try to
1. What is the shape of your data?
i.e. how big are your entities and what do they contain?
class USER_INVENTORY(db.Model):
    TUSER_ID = db.IntegerProperty()
    ITEM_TYPE = db.StringProperty()
    ITEM_ID = db.IntegerProperty()
    SUFFIX_ID = db.IntegerProperty()
    HAVE_INVENTORY =
Looking at it more deeply, every 20th one takes a long time. I assume
that's the data fetch.
Still need a way to optimize...
You received this message because you are subscribed to the Google Groups
"Google App Engine" group.
If you only need some of the properties for the query that needs 100+
results, you'll need to create a separate set of entities with just those
properties, and query those. Similarly, if you want the query to return
just the keys, you'll need entities containing the properties that are the
subjects of query filters and the keys for the full entities.
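The "separate set of entities" idea above can be sketched like this, simulated with plain dicts (on App Engine these would be two entity kinds: a slim index kind holding only the filtered properties plus the key, and the full kind batch-fetched afterwards with db.get(keys)). The keys and property values are illustrative; only TUSER_ID and ITEM_TYPE come from the thread's model.

```python
# Full entities, keyed by their datastore key (big_blob stands in for
# the bulk of each entity).
full_entities = {
    "k1": {"TUSER_ID": 432, "ITEM_TYPE": "sword", "big_blob": "x" * 1000},
    "k2": {"TUSER_ID": 432, "ITEM_TYPE": "shield", "big_blob": "y" * 1000},
    "k3": {"TUSER_ID": 999, "ITEM_TYPE": "sword", "big_blob": "z" * 1000},
}

# Slim index entities: only the query-filter properties and the key.
slim_index = [
    {"key": "k1", "TUSER_ID": 432},
    {"key": "k2", "TUSER_ID": 432},
    {"key": "k3", "TUSER_ID": 999},
]

# 1. Query the slim entities (cheap: small rows, no blobs transferred).
keys = [e["key"] for e in slim_index if e["TUSER_ID"] == 432]

# 2. Batch-get only the full entities actually needed.
results = [full_entities[k] for k in keys]
```

The query itself only ever moves the small index rows; the large entities are pulled by key, and only for the rows you actually display.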
so there is no way to
QueryString = "SELECT * FROM USER_INVENTORY WHERE TUSER_ID = 432 AND CATEGORY_ID = 23423"
BigTable is an object store, so you can't ask it for specific fields
like you can in a traditional database. In the future we may get new
tools from Google that operate on data in the cloud and return a result
(MapReduce, etc.), but not yet.
You might try (like Dan said) breaking your entities up into
There is currently no way to retrieve only parts of entities, nor just keys,
from the datastore in response to queries. There's no way to dig out the
keys either: the datastore returns the full entities in response to queries;
there is no intermediate app-side step that fetches entities for keys.
Looking at it more deeply every 20th one takes a long time. I assume
that's the data fetch.
I guess you are iterating over a Query or GqlQuery object to get the
entities? This explains why every 20th iteration is slow. From
http://code.google.com/appengine/docs/datastore/queryclass.html :
Hey Dan and Boson,
Thanks very much for your time guys.
When you call:
CategoryRows = db.GqlQuery(QueryString)
It's fast.
It's the loop where you try and access the returned entities that
is slow. Doesn't it follow that CategoryRows has the keys in it
and the next() in the Python loop is
On Thu, Jan 8, 2009 at 3:35 PM, Tzakie gwood...@comexton.com wrote:
Thanks very much for your time guys.
No problem, happy to help.
When you call:
CategoryRows = db.GqlQuery(QueryString)
It's fast.
It's the loop where you try and access the returned entities that
is slow. Doesn't it
Quite frankly, I can't think of a Google web app that displays 100 of
anything all at once...
I'm getting the impression that people think what I'm asking for is
ridiculous and off the radar. I sent you an e-mail with the urls of the
current app and what I am working on. When you see it in
On Jan 9, 4:15 am, Tzakie gwood...@comexton.com wrote:
[...]
Can't you guys make something that just returns the keys from a query?
That seems consistent with how I think BigTable works.
I'm pretty sure there should be a solution for this as BigTable is
basically a distributed hashmap, so
On Jan 9, 1:22 am, Alexander Kojevnikov alexan...@kojevnikov.com
wrote:
Looking at it more deeply every 20th one takes a long time. I assume
that's the data fetch.
I guess you are iterating over a Query or GqlQuery object to get the
entities? This explains why every 20th
Repeating a couple of things I mentioned to Greg in email, for the list
discussion:
A single-server SQL database with low load and tables with rows numbering in
the thousands can do lots of cool things with that data in a reasonable
amount of time. But many of the features of a SQL query engine