Scratch that, I found one; try r3512.

On Sep 24, 2007, at 9:28 AM, Michael Bayer wrote:

>
> I've grepped all the code and cannot find a create_row_processor
> method that is returning a 2-tuple (which is what it used to be);
> they're all returning 3-tuples, and of course all of those methods
> have test coverage too.   Have you tried deleting all the .pyc files
> to ensure all Python modules have been recompiled?
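
(To spell out the contract change described above: a MapperProperty whose create_row_processor still returns the old 2-tuple will fail at the 3-name unpack in populate_instance. A minimal sketch, not SQLAlchemy's actual code, with illustrative names:

    # sketch only, not SQLAlchemy code; names are illustrative
    def legacy_create_row_processor(selectcontext, mapper, row):
        # pre-r3507 contract: return (new_populator, existing_populator)
        return (lambda *a, **kw: None, lambda *a, **kw: None)

    # the post-r3507 caller unpacks three values, so a stale 2-tuple fails with
    # "ValueError: need more than 2 values to unpack" (Python 2 wording)
    newpop, existingpop, post_proc = legacy_create_row_processor(None, None, None)
)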
>
>
> On Sep 24, 2007, at 1:39 AM, [EMAIL PROTECTED] wrote:
>
>>
>> hi.
>> r3506 is still ok, while r3507 gives this:
>> result: [] expected: [35]
>> SAMPLE: 2006-09-11 00:00:00 2006-09-12 00:00:00 2006-09-14 00:00:00
>> [35]
>> 'trans exact, valids between _2' FROM test_range TimedRangeTestCase
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   ...
>>   File "test_timed_dbcook.py", line 142, in _get_range_val
>>     q = me.val.get_obj_history_in_range( me.val.OBJ_ID, timeFrom, timeTo).all()
>>   File "/home/az/src/hor-trunk/sqlalchemy/orm/query.py", line 571, in all
>>     return list(self)
>>   File "/home/az/src/hor-trunk/sqlalchemy/orm/query.py", line 619, in __iter__
>>     return self._execute_and_instances(statement)
>>   File "/home/az/src/hor-trunk/sqlalchemy/orm/query.py", line 624, in _execute_and_instances
>>     return iter(self.instances(result))
>>   File "/home/az/src/hor-trunk/sqlalchemy/orm/query.py", line 680, in instances
>>     self.select_mapper._instance(context, row, result)
>>   File "/home/az/src/dbcook/sqlalchemy/orm/mapper.py", line 1436, in _instance
>>     self.populate_instance(context, instance, row, **flags)
>>   File "/home/az/src/dbcook/sqlalchemy/orm/mapper.py", line 1496, in populate_instance
>>     (newpop, existingpop, post_proc) = prop.create_row_processor(selectcontext, self, row)
>> ValueError: need more than 2 values to unpack
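
(Two different source trees show up in those frames: query.py is loaded from /home/az/src/hor-trunk/sqlalchemy while mapper.py comes from /home/az/src/dbcook/sqlalchemy, so it is worth confirming which copies actually get imported. A small diagnostic sketch, not part of the test suite, assuming sqlalchemy is importable in the same environment the test runs in:

    # diagnostic sketch: print where the ORM modules are really loaded from,
    # to rule out a stale .pyc or a mixed checkout
    import sqlalchemy
    import sqlalchemy.orm.query
    import sqlalchemy.orm.mapper

    for mod in (sqlalchemy, sqlalchemy.orm.query, sqlalchemy.orm.mapper):
        print("%s -> %s" % (mod.__name__, mod.__file__))
)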
>>
>> the resulting query is somewhat awful, but it works before that revision:
>>
>> ---------- TEST trans exact, valids between _2* SA: INFO
>> SELECT "PolymBase".disabled AS "PolymBase_disabled",
>>        "PolymBase".val AS "PolymBase_val",
>>        "PolymBase".time_valid AS "PolymBase_time_valid",
>>        "PolymBase".time_trans AS "PolymBase_time_trans",
>>        "PolymBase".obj_id AS "PolymBase_obj_id",
>>        "PolymBase".atype AS "PolymBase_atype",
>>        "PolymBase".db_id AS "PolymBase_db_id",
>>        "PolymLeaf".db_id AS "PolymLeaf_db_id"
>> FROM (
>>     SELECT max("PolymBase".db_id) AS db_id,
>>            "PolymBase".time_trans AS time_trans,
>>            "PolymBase".time_valid AS time_valid
>>     FROM "PolymBase" JOIN (
>>         SELECT max("PolymBase".time_trans) AS time_trans,
>>                "PolymBase".time_valid AS time_valid
>>         FROM "PolymBase"
>>         WHERE "PolymBase".time_trans <= ?
>>           AND "PolymBase".time_valid >= ?
>>           AND "PolymBase".time_valid <= ?
>>           AND "PolymBase".obj_id = ?
>>           AND "PolymBase".atype = ?
>>         GROUP BY "PolymBase".time_valid
>>     ) AS t1 ON "PolymBase".time_valid = t1.time_valid
>>            AND "PolymBase".time_trans = t1.time_trans
>>     WHERE "PolymBase".obj_id = ? AND "PolymBase".atype = ?
>>     GROUP BY "PolymBase".time_valid, "PolymBase".time_trans
>> ) AS timedr, "PolymBase" JOIN "PolymLeaf" ON "PolymLeaf".db_id = "PolymBase".db_id
>> WHERE "PolymBase".db_id = timedr.db_id AND NOT "PolymBase".disabled
>> ORDER BY "PolymBase".time_valid
>> * SA: INFO [11, 12, 14, 1, 'PolymLeaf', 1, 'PolymLeaf']
>>
>>
>> any idea? i may try to prepare a stripped-down testcase but it'll
>> take some time...
>> if you want to try it, svn co and then run make in
>> https://dbcook.svn.sourceforge.net/svnroot/dbcook/trunk/dbcook/misc/timed2/test/
>> or do
>> PYTHONPATH=..:../../../..:../../..:$(PYTHONPATH) python test_timed_dbcook.py
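
(If stale .pyc files turn out to be the culprit, as suggested above, clearing them before re-running is cheap. A small helper sketch, not part of dbcook or SQLAlchemy:

    # helper sketch: delete compiled .pyc files under the given directories
    # so every module gets recompiled on the next run
    import os
    import sys

    def remove_pycs(root):
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if name.endswith(".pyc"):
                    os.remove(os.path.join(dirpath, name))

    if __name__ == "__main__":
        for root in sys.argv[1:] or ["."]:
            remove_pycs(root)
)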
>>
>> svil
>>


