On Saturday 12 July 2008 23:02:15 Rick Morrison wrote:
> I've been thinking about a feature like this for some time as well,
> but haven't found the right way to describe it. Seems like it's
> time to take a stab at it. This is likely to be half-baked at best:
>
> It's something in between a full eager load and a lazy load. I call
> it "vector load", but it's essentially an in-memory join. Here's
> how it could work:
>
> Suppose you have a list result of a simple query:
>
>         vector = S.query(MappedObj).filter(foo).all()
>
> and assume "MappedObj" has a simple relation "children", each of
> those in turn having a simple relation "grandchildren". It would be
> nice to say something like
>
>         S.vectorload(vector, 'children.grandchildren')
>
> and have *all* the children + grandchildren loaded for each item in
> 'vector'.
>
> SA already has most of the join aliasing machinery to issue such a
> query and do the in-memory join. But... the problem is
> restricting the set of 'children' to match those of 'vector' -- the
> seemingly easiest thing would be to use an IN clause listing the
> children's foreign keys, like so:
>
>       WHERE children.foreign_key IN ([obj.primary_key for obj in vector])
>
> The issue is that the list of keys for the IN clause could be huge,
> and many database engines perform poorly with huge lists of literals
> in an IN clause -- notably PostgreSQL, which at least until recently
> had quadratic behavior with large IN lists.
>
> Still, for small sets, it might be an interesting feature.
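one workaround for the huge-IN-list problem could be to batch the keys 
and issue several smaller queries instead of one giant one. a rough 
sketch of just the batching part (plain python; the commented-out query 
is hypothetical):

```python
def chunked(seq, size):
    """Yield successive slices of seq, each with at most `size` items."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# instead of one IN(...) with thousands of literals, something like:
#     for batch in chunked(keys, 500):
#         children += S.query(ChildObj).filter(
#             ChildObj.foreign_key.in_(batch)).all()
keys = list(range(1200))
batch_sizes = [len(b) for b in chunked(keys, 500)]
print(batch_sizes)  # [500, 500, 200]
```

the right batch size is database-specific; 500 here is just a placeholder.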
i think i have something like this expanded-tree loading, using 
generated nested OR/AND "joins". well, it is ugly... (see the "how 
query.count() works?" thread) but it sort of works, at least for levels 
up to 4. the level of nesting is killing it..

i will need a way to somehow load another simpler (sub)tree in one 
shot.
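one alternative to the nested joins might be to walk the tree 
breadth-first: one query per level, feeding each level's keys into the 
next, so depth costs extra round trips instead of extra nesting. a toy 
sketch, with a fake fetch standing in for the per-level SELECT (all 
names hypothetical):

```python
def load_levels(roots, fetch_children, depth):
    """Attach children level by level: one fetch (one query) per level."""
    level = roots
    for _ in range(depth):
        ids = [node['id'] for node in level]
        # stands in for one SELECT ... WHERE foreign_key IN (ids)
        kids = fetch_children(ids)
        by_parent = {}
        for kid in kids:
            by_parent.setdefault(kid['parent_id'], []).append(kid)
        for node in level:
            node['children'] = by_parent.get(node['id'], [])
        level = kids
    return roots

# fake child table: two levels under root id 1
rows = [{'id': 2, 'parent_id': 1}, {'id': 3, 'parent_id': 2}]
fetch = lambda ids: [dict(r) for r in rows if r['parent_id'] in ids]
tree = load_levels([{'id': 1}], fetch, depth=2)
print(tree[0]['children'][0]['children'][0]['id'])  # 3
```

a depth-4 tree then costs 4 queries total, instead of one query with 
4 levels of nested joins.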

> BTW it's kind of harder than it would seem to "roll your own" for
> this one: if you take the naive approach and just issue the query
> for children with an eager load of grandchildren, then do the
> in-memory join in a loop:
>
>     keys = [item.id for item in vector]
>     all_children = S.query(ChildObj) \
>         .options(eagerload('grandchildren')) \
>         .filter(ChildObj.foreign_key.in_(keys))
>     dvec = dict((v.id, v) for v in vector)
>     for child in all_children:
>         dvec[child.foreign_key].children.append(child)
>
> you'll find you trigger an unwanted lazy-load by issuing the
> children.append operation.
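right -- the .append is what wakes up the lazy loader. the dict-building 
half is safe on its own; a plain-python sketch below. to hand the grouped 
lists to the parents without triggering a load, i believe SA has 
something like orm.attributes.set_committed_value(parent, 'children', 
group), which marks the attribute as already loaded -- name from memory, 
so treat that call as an assumption:

```python
from collections import defaultdict

def group_children(children, fk_attr):
    """One pass over the child rows, bucketed by their foreign-key value."""
    groups = defaultdict(list)
    for child in children:
        groups[getattr(child, fk_attr)].append(child)
    return groups

# toy stand-ins for mapped instances
class Child(object):
    def __init__(self, parent_id):
        self.parent_id = parent_id

kids = [Child(1), Child(2), Child(1)]
groups = group_children(kids, 'parent_id')
print(sorted((pid, len(g)) for pid, g in groups.items()))  # [(1, 2), (2, 1)]

# then, per parent (hypothetical SA call, not verified here):
#     set_committed_value(parent, 'children', groups.get(parent.id, []))
```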

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"sqlalchemy" group.
To post to this group, send email to sqlalchemy@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/sqlalchemy?hl=en
-~----------~----~----~----~------~----~------~--~---
