It appears I've sorted this out in my last round of testing; the problem 
was with _filter_fields. This was my code before:

accion = self.dbNueva.estatus_contratacion.validate_and_insert(
    **self.dbNueva.estatus_contratacion._filter_fields(x)
).as_dict()

Here x is a dict with all the fields I want to insert (including the id of 
the record), but when I insert this way the id is ignored and the record is 
assigned the next autoincrement id. I changed the code to this:

accion = self.dbNueva.estatus_contratacion.validate_and_insert(
    id=x["id"],
    **self.dbNueva.estatus_contratacion._filter_fields(x)
).as_dict()

And it works perfectly. I don't know if this is a bug or if this is how it 
is supposed to work, but ._filter_fields appears to also filter out the id 
from the dict.
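
For anyone hitting the same issue, here is a minimal sketch of the whole 
copy loop. It assumes two pyDAL connections (dbVieja for the old schema, 
dbNueva for the new one) and a hypothetical transform() helper that reshapes 
an old row into the new structure; the explicit id= argument is the actual 
fix:

# Sketch only: dbVieja, dbNueva and transform() are placeholders for the
# old/new pyDAL connections and for the reshaping step.
for row in dbVieja(dbVieja.estatus_contratacion.id > 0).select():
    x = transform(row.as_dict())   # OLD structure -> NEW structure
    tabla = dbNueva.estatus_contratacion
    accion = tabla.validate_and_insert(
        id=x["id"],                # keep the original id; _filter_fields drops it
        **tabla._filter_fields(x)
    ).as_dict()
    if accion.get("errors"):
        print(accion["errors"])    # report validation failures per row
dbNueva.commit()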

Thanks for all your help, Anthony; I hope this can help other people.
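
For readers who don't need to keep the original ids, the CSV route Anthony 
links to below would look roughly like this (just a sketch based on the book 
chapter; the file name is made up):

# Export everything from the old database, then import into the new one;
# web2py re-maps reference fields to the new ids on import.
dbVieja.export_to_csv_file(open('full_backup.csv', 'wb'))
dbNueva.import_from_csv_file(open('full_backup.csv', 'rb'))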

On Monday, October 10, 2016 at 10:42:19 (UTC-4), Anthony wrote:
>
> OK, so what does your code look like? Why can't you preserve the id's?
>
> Anthony
>
> On Monday, October 10, 2016 at 9:43:50 AM UTC-4, 
> luis.va...@metamaxzone.com wrote:
>>
>> I think that method doesn't work for me; let me explain better what I 
>> want to do:
>>
>> Six months ago I released a production app. Now we will launch the second 
>> version of this app, and there are several (big) differences between the 
>> database structure in production and the one I will deploy now, but I 
>> don't want to lose the data collected in production. What I want to do can 
>> be summed up in 3 steps:
>>
>> 1. Get all the information from the production database with the OLD 
>> structure.
>> 2. Reformat this information to the NEW structure (some fields will be 
>> deleted, others will be added... there are several changes).
>> 3. Insert this data into the database with the NEW structure.
>>
>> Also, this app is built with a microservices architecture, which means I 
>> don't have only one database; instead I have 14 databases with cross 
>> references between them (i.e. the id stored for a user in the users 
>> database is also used in several other databases as a reference to that 
>> user).
>>
>> So because of this, if I change the ids I will need to run a big 
>> consistency check across my 14 databases, each with (approximately) 10 
>> tables, which is a big deal as you can see.
>>
>> Thanks a lot for the help, and I hope you can keep helping me.
>>
>> On Thursday, October 6, 2016 at 14:19:49 (UTC-4), Anthony wrote:
>>>
>>> Do you really need the ids to remain the same? Note, if you use this 
>>> method 
>>> <http://web2py.com/books/default/chapter/29/06/the-database-abstraction-layer#CSV--all-tables-at-once-> 
>>> to migrate, web2py will automatically update any reference fields so they 
>>> contain the new ids of referenced records.
>>>
>>> Anthony
>>>
>>> On Wednesday, October 5, 2016 at 5:47:36 PM UTC-4, 
>>> argenio...@metamaxzone.com wrote:
>>>>
>>>> I'm doing a data migration between 2 databases with different tables.
>>>>
>>>> My only problem is with the ids! 
>>>>
>>>> table 1
>>>> id: 5, name: Diana
>>>> id: 8, name: Carlos
>>>>
>>>>
>>>> when I migrate this data to database 2
>>>> id: 1, name: Diana
>>>> id: 2, name: Carlos
>>>>
>>>> Look, the ids are not the same.
>>>>
>>>> What is the method to keep the same ids?
>>>> I read about the primarykey parameter but it doesn't work.
>>>>
>>>> The migration process is a script with 2 pyDAL connections, one for the 
>>>> old database and another for the new database.
>>>> I execute one select() of everything (not much data), then convert it to 
>>>> the new structure, and everything works perfectly. The only problem is 
>>>> with the ids.
>>>>
>>>> Please help me, thanks.
>>>>
>>>
