You are doing something wrong somewhere: a different model, some strange value 
type, etc.
Tested here with automatic dict unpacking:

>>> movie = dict(name='alien4', genres=[1,2])
>>> db.movies.validate_and_insert(**movie)
<Row {'errors': <Row {}>, 'id': 4}>
>>> movie = dict(name='alien4', genres=[1,7])
>>> db.movies.validate_and_insert(**movie)
<Row {'errors': <Row {'genres': 'value not in database'}>, 'id': None}>


Could you at least print one movie that fails to insert in your script? 
Standard print, simplejson.dumps()'d and pprint.pprint()'d, directly before 
the "validate_and_insert(**movie)" call. Just to pinpoint the issue.
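
For example, something like this directly before the insert (a minimal 
sketch; it assumes simplejson is importable and that movie is a plain dict):

import pprint
import simplejson

print movie                                    # plain repr
print simplejson.dumps(movie)                  # fails fast on odd value types
pprint.pprint(movie)                           # readable nested view
print db.movies.validate_and_insert(**movie)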

On Wednesday, August 22, 2012 10:10:40 PM UTC+2, Mike Girard wrote:
>
> Yeah, I get the references directly from the genre/person tables before 
> inserting them in movie. I also test to make sure the item being inserted 
>  is a list with data. Printing contents after the error shows nothing odd. 
>
> I have tested this in both Postgres and SQLite. Same thing happens in 
> both.
>
>
>
> On Wednesday, August 22, 2012 3:45:31 PM UTC-4, villas wrote:
>>
>> Did you try checking that all the references exist?
>> So, maybe it is a referential integrity problem.
>> You are probably not using SQLite but a proper DB.
>> Just ideas,  D
>>
>>
>> On Wednesday, August 22, 2012 8:23:23 PM UTC+1, Mike Girard wrote:
>>>
>>> Yeah. A result of switching it off and then back on. 
>>>
>>> Good catch. 
>>>
>>> On Wed, Aug 22, 2012 at 3:19 PM, Anthony <abas...@gmail.com> wrote:
>>>
>>>> I assume this is a typo too, but just to be sure:
>>>>
>>>> Field('genres','string','list:reference genre', ...
>>>>
>>>> You have both 'string' and 'list:reference' there.
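>>>>
>>>> Presumably the intended declaration is just the following (same validator, 
>>>> only the extra 'string' dropped):
>>>>
>>>> Field('genres', 'list:reference genre',
>>>>       requires=IS_IN_DB(db, 'genre.id', '%(name)s [%(id)s]', multiple=True))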
>>>>
>>>> Anthony
>>>>
>>>>
>>>> On Wednesday, August 22, 2012 2:49:29 PM UTC-4, Mike Girard wrote:
>>>>
>>>>> I have written a script to parse a large XML file and insert the 
>>>>> contents into my app's db. I am using lxml.
>>>>>
>>>>> After about 10 records get inserted, the script fails with:
>>>>> <type 'exceptions.TypeError'> argument of type 'NoneType' is not iterable
>>>>>
>>>>> Troubleshooting determined the following:
>>>>>
>>>>> 1. This error is associated with 3 list:reference fields. When I 
>>>>> remove them from the script, the script executes uneventfully. If any one 
>>>>> of them is included, it fails.
>>>>> 2. This only happens after 10 records have been successfully inserted.
>>>>> 3. There is no discernible difference between the records that get 
>>>>> successfully added and those that don't. The error happens even when I 
>>>>> hard-code the lists for the list:reference field. It seems to be 
>>>>> associated with the number of records rather than with which records. 
>>>>> 4. The script executes successfully when I change the field types from 
>>>>> 'list:reference' to 'string' and insert strings instead of lists. You 
>>>>> should not assume from this that there is a data issue. As I said, 
>>>>> hardcoded lists get rejected also. I am 99% certain data validity is not 
>>>>> the issue.
>>>>> 5. This happens in both SQLite and Postgres.
>>>>>
>>>>> Here is the model declaration for one of the three fields. They are 
>>>>> all analogous:
>>>>>
>>>>> Field('genres','string','list:reference genre', 
>>>>> requires=IS_IN_DB(db, 'genre.id', '%(name)s [%(id)s]', multiple=True))
>>>>>
>>>>> Here is how I update the database with each new row:
>>>>>
>>>>> db.movies.validate_and_insert(**movie)  (movie is a dict)
>>>>>
>>>>> Here is how I hardcoded values into the fields: movie['genres'] = [456, 
>>>>> 368, 239]
>>>>>
>>>>> Now, if someone doesn't have a solution, can they tell me whether I can:
>>>>>
>>>>> 1. Programmatically remove the list:reference from the model prior to the 
>>>>> data updates and restore it afterwards (see the sketch below)?
>>>>> 2. Retain all the functionality of these fields by toggling this way? 
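>>>>>
>>>>> For (1), the kind of toggle I have in mind is something like this (an 
>>>>> untested sketch: it only swaps out the requires validator at runtime, the 
>>>>> list:reference type itself stays in place, and parsed_movies just stands 
>>>>> for whatever my lxml loop yields):
>>>>>
>>>>> saved_requires = db.movies.genres.requires
>>>>> db.movies.genres.requires = []    # skip validation during the bulk load
>>>>> for movie in parsed_movies:       # parsed_movies is a placeholder name
>>>>>     db.movies.validate_and_insert(**movie)
>>>>> db.movies.genres.requires = saved_requires    # restore the validator
>>>>>
>>>>> Whether that also avoids the TypeError, I don't know.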
>>>>>
>>>>> I am seriously considering going the join-table route and skipping the 
>>>>> list:reference fields altogether. Are there any gotchas there? 
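>>>>>
>>>>> The join-table version I have in mind is roughly this (a sketch; the 
>>>>> movie_genre table and the genre_ids list are made-up names for 
>>>>> illustration):
>>>>>
>>>>> db.define_table('movie_genre',
>>>>>     Field('movie', 'reference movies'),
>>>>>     Field('genre', 'reference genre'))
>>>>>
>>>>> row = db.movies.validate_and_insert(**movie)  # movie no longer carries 'genres'
>>>>> if not row.errors:
>>>>>     for genre_id in genre_ids:                # the ids I used to put in 'genres'
>>>>>         db.movie_genre.insert(movie=row.id, genre=genre_id)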
>>>>>
