On Mon, 2008-09-22 at 18:22 -0700, cfobel wrote:
> Hello,
> 
> I'm encountering an error when performing a lookup that spans
> relationships.  The query is as follows:
> 
> myitems = MyItem.objects_all.exclude(
>     user__somemodel__created__gte=(datetime.now() - timedelta(days=3)))
> 
> With the following (stripped) models:
> 
> from datetime import datetime
> 
> from django.contrib.auth.models import User
> from django.db import models
> 
> class MyItem(models.Model):
>     user = models.ForeignKey(User, unique=True)
> 
> class SomeModel(models.Model):
>     created = models.DateTimeField('Date created', default=datetime.now)
>     user = models.ForeignKey(User)

This is a pretty good test case, by the looks of it. The only thing
missing is the custom manager (since you're calling MyItem.objects_all,
not MyItem.objects) and that possibly affects something, too.
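
For completeness, here's a minimal sketch of what that manager might look
like -- the real definition wasn't posted, so the class name and behaviour
here are purely illustrative:

    from django.contrib.auth.models import User
    from django.db import models

    class AllItemsManager(models.Manager):
        # Hypothetical stand-in for the real manager. If the actual
        # manager filters its queryset here, that could change which
        # joins the exclude() call above ends up generating.
        def get_query_set(self):
            return super(AllItemsManager, self).get_query_set()

    class MyItem(models.Model):
        user = models.ForeignKey(User, unique=True)
        objects = models.Manager()        # default manager
        objects_all = AllItemsManager()   # the manager used in the query

Including the real manager definition in the ticket would help rule that
in or out.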

In any case, any time we're generating invalid SQL instead of raising
some other error (or doing the right thing for valid code -- and the
above looks valid at first glance), it's a bug.

Could you please open a ticket for this so that I remember to look at it?
(Assign it to "mtredinnick" straight away, if you like.)

> 
> The error I get is:
> 
> Traceback (most recent call last):
>   ....
>   File "/usr/lib/python2.5/site-packages/django/db/models/query.py", line 179, in _result_iter
>     self._fill_cache()
>   File "/usr/lib/python2.5/site-packages/django/db/models/query.py", line 612, in _fill_cache
>     self._result_cache.append(self._iter.next())
>   File "/usr/lib/python2.5/site-packages/django/db/models/query.py", line 269, in iterator
>     for row in self.query.results_iter():
>   File "/usr/lib/python2.5/site-packages/django/db/models/sql/query.py", line 206, in results_iter
>     for rows in self.execute_sql(MULTI):
>   File "/usr/lib/python2.5/site-packages/django/db/models/sql/query.py", line 1723, in execute_sql
>     cursor.execute(sql, params)
>   File "/usr/lib/python2.5/site-packages/django/db/backends/util.py", line 19, in execute
>     return self.cursor.execute(sql, params)
> psycopg2.ProgrammingError: missing FROM-clause entry in subquery for table "u1"
> LINE 1: ..._myitem" U0 INNER JOIN "notes_note" U2 ON (U1."id" = ...
>  
> ^
> 
> To debug the issue, I looked at the SQL generated by the 'myitems'
> queryset above.  The generated SQL is:
> 
> SELECT "users_myitem"."id", "users_myitem"."user_id"
> FROM "users_myitem"
> WHERE NOT (
>   "users_myitem"."user_id" IN (
>     SELECT U2."user_id"
>     FROM "users_myitem" U0
>     INNER JOIN "myapp_somemodel" U2 ON (U1."id" = U2."user_id")
>     WHERE U2."created" >= 2008-09-19 19:57:43.111687
>   )
> )
> 
> It looks like the table "users_myitem" is being improperly labeled as
> 'U0',

Well, not really. The users_myitem table in the main query and
users_myitem in the subquery are different instances of the table, so it
has to have an alias. The main table from the outer query is always
going to end up being "U0" in an inner query like this (since it's the
first in a list of names and the number is the index in the list).
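
For the user__somemodel__created lookup, the alias list should presumably
work out as something like this (assuming the standard auth_user table for
django.contrib.auth's User model):

    U0 -> "users_myitem"     (the outer query's main table, index 0)
    U1 -> "auth_user"        (the MyItem.user join)
    U2 -> "myapp_somemodel"  (the SomeModel join the filter applies to)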

>  and then is referred to as 'U1' on the next line. 

The interesting bit is what happened to U1. I have a suspicion this
might be related to another optimisation that isn't working reliably,
but which I thought wasn't harming anything (just generating less than
optimal code): we shouldn't really need U0 in the above query, since we
can just test against the thing it's joining to.
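
In other words, hand-simplified, a subquery along these lines ought to be
enough (just a sketch of the shape, not the exact SQL Django would
produce):

    SELECT U2."user_id"
    FROM "auth_user" U1
    INNER JOIN "myapp_somemodel" U2 ON (U1."id" = U2."user_id")
    WHERE U2."created" >= ...

with U0 dropping out entirely and the outer "users_myitem"."user_id"
tested directly against the user ids that subquery returns.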

There's definitely something going wrong here. Again, we shouldn't ever
be sending malformed SQL to the database. That's always a Django bug. I
can't drop everything right this minute to look at it, but if you open a
ticket I'll certainly take a look, since SQL bugs are something I take
very personally.

Regards,
Malcolm


