Re: failure to load fixtures during unit tests

2016-05-09 Thread Rich Rauenzahn
FYI, I was finally able to resolve this.  I had assumed that each TestCase 
class began with a freshly created database.  I asserted this in 
_fixture_setup() and found that assumption to be false.  Upon further 
research I found a setUpClass() in another TestCase that created objects 
which conflicted with the fixtures loaded in other TestCase classes.  
Somehow Django-nose's FastFixtureTestCase hid this problem.

What I'm not sure about (and need to investigate) is whether Django-nose's 
TestRunner is optimizing away fresh databases between TestCase classes, or 
whether that is standard behavior.
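
For reference, the assertion I used looked roughly like this (a minimal 
sketch, not my exact code -- _fixture_setup() is a private Django API, and 
auth.User is just the model I happened to check):

    from django.contrib.auth.models import User
    from django.test import TestCase

    class MyTestCase(TestCase):
        fixtures = ['test-fixtures.json']

        def _fixture_setup(self):
            # I expected a freshly created (empty) database here, before
            # this class's fixtures load -- this is the assertion that failed.
            assert User.objects.count() == 0, "database was not fresh"
            super(MyTestCase, self)._fixture_setup()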

Rich



Re: failure to load fixtures during unit tests

2016-05-05 Thread Rich Rauenzahn

Thanks, Tim.

Unfortunately I can't move past Django 1.7 yet -- dependencies.  I've been 
marching my way up one revision at a time, hopefully up to 1.9, as a way to 
keep the scope of what breaks under control as I move through each major 
revision and stabilize my project.  Then I'll attack replacing the 
dependencies.

I really think I've found a bug here ... which I hope to suggest a patch 
for and submit, hence the post to the developers channel, but I can go back 
to the users group for now.  My recent experience with that list doesn't 
bode well, however, and I don't have high hopes that anyone there will be 
able to respond at the internals level I may need in order to track down 
the issue.  I've almost rewritten my tests to just load raw SQL, but if 
there is a bug here I'd like to help find it rather than work around/ignore 
it.

As I step through the code, it really looks like the _save_table() method 
in Model is trying to insert a row even though the object has already been 
restored/inserted.  At the moment, I'm reproducing it with the auth.User 
Model.

I'm getting closer to seeing what is happening 

I have a user, rich, which is expected to be pk=1 per the fixture.

> /opt/perfcat/virtualenv-2.7.11/lib/python2.7/site-packages/django/db/models/base.py(686)_save_table()
    685             import ipdb; ipdb.set_trace()
--> 686             if not updated:
    687                 if meta.order_with_respect_to:

ipdb> self.id
1
ipdb> self.__class__.objects.all()
[]
ipdb> self.__class__.objects.all()[0].id
5
ipdb> self.username
u'rich'
ipdb> 


But in this particular run I'm currently tracing, rich is already in the db 
(as the only entry) as pk=5 (via the fixture loading process).  For one, 
this tells me the sequence generators aren't always resetting between 
fixture loads/tests.
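
(As an aside: TransactionTestCase has a reset_sequences flag that forces 
sequences back to their initial values for each test, and as far as I can 
tell plain TestCase never resets them.  A minimal sketch, not my actual 
test:)

    from django.test import TransactionTestCase

    class MyTransactionTests(TransactionTestCase):
        # Ask Django to reset sequences before each test in this class.
        reset_sequences = True
        fixtures = ['test-fixtures.json']

        def test_something(self):
            pass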

So I think the code is trying to reassign it to pk=1.  

We did drop into the update code,

ipdb> pk_set and not force_insert
True

But updated is False

ipdb> updated
False

So now it tries to drop into an insert, but it is going to get an 
IntegrityError because username has to be unique.

Not sure what this means yet, but my current step-through looks like this:

ipdb> 
IntegrityError: Integrit...sts.\n',)
> /opt/perfcat/virtualenv-2.7.11/lib/python2.7/site-packages/django/db/models/base.py(700)_save_table()
    699             update_pk = bool(meta.has_auto_field and not pk_set)
--> 700             result = self._do_insert(cls._base_manager, using, fields, update_pk, raw)
    701             if update_pk:

ipdb> update_pk
False
ipdb> meta.has_auto_field
True
ipdb> pk_set
True
ipdb> 

...if we don't need to update the pk, and it is set ... why are we 
inserting it?

Walking through a second time with this knowledge, and stepping into 
_do_update(), I end up with filtered = base_qs.filter(pk=pk_val) being 
equal to [] because the entry in the db has pk=5 while the filter is for 
pk=1.

So return filtered._update(values) > 0 returns False, because nothing was 
updated since the pks didn't match.
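
In other words, the failure mode reduces to something like this (a sketch 
of the mechanism using the auth.User from my fixture, not my actual test 
code):

    from django.contrib.auth.models import User

    u = User.objects.get(username='rich')   # the fixture row, sitting at pk=5
    u.pk = 1                                 # the pk the code thinks it has
    u.save()   # UPDATE ... WHERE id = 1 matches nothing, so Django falls
               # back to an INSERT, and the unique constraint on username
               # raises IntegrityError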

Where I'm stuck now is that I don't understand how fixture loading manages 
the pks...
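
As far as I can tell, loaddata boils down to roughly this for each object 
(a simplified sketch, not the actual loaddata source):

    from django.core import serializers

    with open('test-fixtures.json') as f:
        for deserialized in serializers.deserialize('json', f):
            # deserialized.object keeps whatever pk was stored in the fixture
            # file, and save() here is effectively save_base(raw=True), so the
            # pk is written straight into the table without the sequence ever
            # being advanced or reset.
            deserialized.save()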

Rich

On Wednesday, May 4, 2016 at 4:22:33 PM UTC-7, Tim Graham wrote:
>
> Hi Rich, django-users is the appropriate place to ask "is it a bug?" type 
> questions. We try not to use this mailing list as a second level support 
> channel, otherwise it'd get really noisy. Thanks for understanding.
>
> By the way, Django 1.7 is no longer supported. Please make sure you can 
> reproduce the issue on Django master so we don't spend time debugging 
> issues that have since been fixed.
>
> On Wednesday, May 4, 2016 at 7:13:42 PM UTC-4, Rich Rauenzahn wrote:
>>
>>
>> I'm in the middle of trying to track down a problem with loading fixtures 
>> during unit tests -- I'm hesitant to call it a bug in Django 1.7, but the 
>> inconsistent behavior is really stumping me.
>>
>> Essentially I've made a fixture via
>>
>>manage dumpdata --indent=3 -e sessions -e admin -e contenttypes -e 
>> auth.Permission > test-fixtures.json
>>
>> If I add that fixtures to my TestCase, it sometimes works if I run each 
>> test individually (using Django Nose)  -- 
>>
>>manage test --failfast test_it:TestClass.test_detail
>>manage test --failfast test_it:TestClass.test_list
>>
>> But if I run them together, 
>>
>>manage test --failfast test_it:TestClass
>>
>> I get errors about duplicate/unique problems.  Essentially a row is 
>> attempted to be added twice. 
>>
>> IntegrityError: Problem installing fixture 'test-fixtures.json': 
>> Could not load app.Branch(pk=1): duplicate key value violates unique 
>> constraint "app_branch_name_49810fc21046d2e2_uniq"

failure to load fixtures during unit tests

2016-05-04 Thread Rich Rauenzahn

I'm in the middle of trying to track down a problem with loading fixtures 
during unit tests -- I'm hesitant to call it a bug in Django 1.7, but the 
inconsistent behavior is really stumping me.

Essentially I've made a fixture via

   manage dumpdata --indent=3 -e sessions -e admin -e contenttypes -e 
auth.Permission > test-fixtures.json

If I add that fixture to my TestCase, it sometimes works if I run each 
test individually (using Django Nose) --

   manage test --failfast test_it:TestClass.test_detail
   manage test --failfast test_it:TestClass.test_list
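
(For reference, the fixture is wired into the tests roughly like this -- a 
minimal sketch of the test class only:)

    from django.test import TestCase

    class TestClass(TestCase):
        # Loaded before each test from the dumpdata output above.
        fixtures = ['test-fixtures.json']

        def test_detail(self):
            pass

        def test_list(self):
            pass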

But if I run them together, 

   manage test --failfast test_it:TestClass

I get errors about duplicate/unique key problems.  Essentially, the same 
row is inserted twice.

IntegrityError: Problem installing fixture 'test-fixtures.json': Could 
not load app.Branch(pk=1): duplicate key value violates unique constraint 
"app_branch_name_49810fc21046d2e2_uniq"
DETAIL:  Key (name)=(mock) already exists.

(I also posted this earlier today on django-users, where I included some 
Postgres output.)  The tests within the TestCase (or TransactionTestCase) 
can be empty ("pass") and still reproduce the problem.

As best I can tell it doesn't only happen when the tests are run together 
-- sometimes I can get it to happen in a TestCase class with a single test. 
 And it isn't always the same model that hits the conflict.

Has anyone seen anything like this behavior before?  It's as if the 
fixtures are installed in a different order each time, which makes me think 
of a dict.keys() somewhere that doesn't return the same order every run.

Rich





Re: unique_together does not work as expected with nullable fields

2016-05-02 Thread Rich Rauenzahn


On Friday, April 29, 2016 at 11:00:37 AM UTC-7, Aymeric Augustin wrote:
>
> Hi Rich,
>
> On 29 Apr 2016, at 19:52, Rich Rauenzahn <rrau...@gmail.com > 
> wrote:
>
>
> I see now that I need to provide a sentinel value for VAL when BOO=True, 
> or manually create additional unique indexes.
>
>
> Indeed, you should write a migration with a RunSQL operation that creates 
> a unique index on boo where boo = true. Then you can have only one row with 
> boo = True.
>
 

It sounds like my request should probably be appended to this ticket, then: 
 https://code.djangoproject.com/ticket/11964 (Add the ability to use 
database-level CHECK CONSTRAINTS)
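
(For the record, the RunSQL migration Aymeric describes would look roughly 
like this -- a sketch only, with the app, table, column, and index names 
made up for illustration:)

    from django.db import migrations

    class Migration(migrations.Migration):

        # 'myapp' / '0001_initial' are placeholders for the real app and
        # migration this would depend on.
        dependencies = [
            ('myapp', '0001_initial'),
        ]

        operations = [
            migrations.RunSQL(
                sql='CREATE UNIQUE INDEX myapp_mymodel_boo_uniq '
                    'ON myapp_mymodel (parent_id) WHERE boo = TRUE;',
                reverse_sql='DROP INDEX myapp_mymodel_boo_uniq;',
            ),
        ]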



Re: unique_together does not work as expected with nullable fields

2016-04-29 Thread Rich Rauenzahn


On Friday, April 29, 2016 at 2:16:45 AM UTC-7, Anssi Kääriäinen wrote:
>
> If you really, really want an unique index that allows just a single 
> value, you might want to try unique index on (a, b, c) where c is not 
> null, and another unique index on (a, b) where c is null. That might 
> give the results you are looking for, though I haven't tested this. 
>

What I'm suggesting is a way to express that index within Django, similar 
to unique_together (and perhaps a warning in the docs, given how frequently 
it comes up on Stack Overflow).

I see now that since multi-column indexes are an extension of single-column 
indexes, the behavior makes sense -- you'd never want a single-column 
unique index to allow only one null value.



Re: unique_together does not work as expected with nullable fields

2016-04-29 Thread Rich Rauenzahn


On Friday, April 29, 2016 at 12:51:31 AM UTC-7, Florian Apolloner wrote:
>
>
> I am not against a note in the docs, but I find the fact that nulls are 
> not "unique" and can exist in an index more than once very useful (fwiw 
> ordering after a column with null can also be interesting across 
> databases). I'd be interested to hear about your use case -- the "general" 
> use case is usually that you have an optional column but want to ensure it 
> is unique as soon as it is filled…
>
>
Let's see if I can explain my use case without having to explain my whole 
domain ... 

I have a Model with a boolean field, "BOO".  When "BOO" is False, another 
field "VAL" should have a meaningful value; otherwise it is NULL.  VAL is 
the only nullable field.  (And yes, the boolean is actually superfluous -- 
"VAL" alone is sufficient for the logic.)

I want the PK of the Model to always be unique combined with BOO=False and 
VAL (and a couple of other non-nullables).  But I also don't want duplicate 
values of PK,BOO=True,VAL=NULL, which I am currently getting.  Put another 
way, only one row should have BOO=True for each PK, but I can have many 
rows with BOO=False.
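
To make the shape concrete, a minimal sketch of the model (field names 
simplified, and the real unique constraint involves a couple of other 
non-nullable columns as well):

    from django.db import models

    class MyModel(models.Model):
        parent_key = models.IntegerField()    # stand-in for the key I group by
        boo = models.BooleanField(default=False)
        val = models.IntegerField(null=True)  # NULL whenever boo is True

        class Meta:
            # This does NOT prevent duplicate (parent_key, boo=True, val=NULL)
            # rows, because NULL is never equal to NULL.
            unique_together = (('parent_key', 'boo', 'val'),)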

I see now that I need to provide a sentinel value for VAL when BOO=True, or 
manually create additional unique indexes.

Since it is conceivable for Django to create the right indexes to handle 
the null case, it would be nice to somehow be able to explicitly ask for 
what I want expressed in Django.  (unique_together obviously can't change 
its current default behavior.)

Is that helpful?



unique_together does not work as expected with nullable fields

2016-04-28 Thread Rich Rauenzahn

I just got bitten by this today, finding a duplicate row where I didn't 
expect one.  I haven't been able to find an existing Django bug.

It's a popular topic on stack overflow:

http://stackoverflow.com/questions/17510261/django-unique-together-constraint-failure
http://dba.stackexchange.com/questions/9759/postgresql-multi-column-unique-constraint-and-null-values


This is apparently expected (and standardized) behavior in SQL: ('A', 'B', 
NULL) does not conflict with another ('A', 'B', NULL) row, because NULL is 
never equal to another NULL.

There is a workaround at the SQL level of ... 

CREATE UNIQUE INDEX ab_c_null_idx ON my_table (id_A, id_B) WHERE id_C IS NULL;


I'm wondering if this ought to at least be addressed with a runtime 
warning, or at least documented for unique_together -- and I'm hoping that 
perhaps a Django-level workaround could be devised to explicitly ask for 
unique indexes that accommodate null values.

For myself, I'm writing a unit test that fails if any of my unique_together 
groups include a nullable field, and I'm using a specific sentinel value as 
my "null" value for now.
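
That check looks roughly like this (a sketch, not my exact test):

    from django.apps import apps
    from django.test import TestCase

    class UniqueTogetherNullableCheck(TestCase):
        def test_no_nullable_fields_in_unique_together(self):
            # Walk every installed model and fail if any unique_together
            # group includes a nullable field.
            for model in apps.get_models():
                for group in model._meta.unique_together:
                    for field_name in group:
                        field = model._meta.get_field(field_name)
                        self.assertFalse(
                            field.null,
                            "%s.%s is nullable but used in unique_together"
                            % (model.__name__, field_name))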

Thoughts?  Has this come up before?

