Re: Duplicate object

2006-01-12 Thread Robert Wittams
Adrian Holovaty wrote:
> On 1/12/06, Jacob Kaplan-Moss <[EMAIL PROTECTED]> wrote:
>> On Jan 12, 2006, at 8:09 AM, Simon Willison wrote:
>>> Maybe it would be useful for all Django model objects to gain
>>> themselves a duplicate() method which does exactly this - returns
>>> an identical object but with id set to None.

Re: Duplicate object

2006-01-12 Thread Adrian Holovaty
On 1/12/06, Jacob Kaplan-Moss <[EMAIL PROTECTED]> wrote:
> On Jan 12, 2006, at 8:09 AM, Simon Willison wrote:
> > Maybe it would be useful for all Django model objects to gain
> > themselves a duplicate() method which does exactly this - returns
> > an identical object but with id set to None.

Re: Duplicate object

2006-01-12 Thread Jacob Kaplan-Moss
On Jan 12, 2006, at 8:09 AM, Simon Willison wrote:
> Maybe it would be useful for all Django model objects to gain
> themselves a duplicate() method which does exactly this - returns
> an identical object but with id set to None.

+1 -- obj.copy() might be a better syntax to mimic dict.copy().

Jacob
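
As a rough sketch of what Simon's duplicate() / Jacob's copy() idea might look
like, assuming the primary-key attribute is simply named id; the mixin name is
invented here, and this is just the thread's copy-and-clear-the-id idiom, not
an API Django actually provided:

import copy

class CopyableModelMixin:
    def copy(self):
        # Shallow-copy the instance, then clear the pk so the next save()
        # performs an INSERT rather than an UPDATE of the original row.
        clone = copy.copy(self)
        clone.id = None
        return clone

# Hypothetical usage: b = a.copy(); b.save()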

Re: Duplicate object

2006-01-12 Thread Dody Suria Wijaya
Simon Willison wrote:
> Maybe it would be useful for all Django model objects to gain
> themselves a duplicate() method which does exactly this - returns
> an identical object but with id set to None.

There is still a complication for a non-AutoField primary key. And it's just
3 lines or under. --
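
A short sketch of the complication Dody points at, assuming a model whose
primary key is a plain character field rather than an AutoField; the function
and field names here are invented for illustration. Clearing the key gives the
database nothing to generate, so the caller has to supply a fresh one:

import copy

def duplicate_with_new_pk(obj, pk_field, new_pk):
    # With a non-auto primary key there is no sequence to fall back on,
    # so the copy only works if we hand it a new, unique key ourselves.
    clone = copy.copy(obj)
    setattr(clone, pk_field, new_pk)
    return clone

# Hypothetical usage: copy_of_product = duplicate_with_new_pk(product, 'sku', 'ABC-124')
# copy_of_product.save()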

Re: Duplicate object

2006-01-12 Thread Simon Willison
On 12 Jan 2006, at 05:43, Eric Walstad wrote:
> The following approach ('shallow' copy) has worked well for me:
>
> import copy
> b = copy.copy(a)
> b.id = None
> b.save()

Maybe it would be useful for all Django model objects to gain themselves a
duplicate() method which does exactly this - returns an identical object but
with id set to None.
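
Spelled out with comments, and assuming a is an already-saved instance whose
primary key is an AutoField named id, the idiom Eric quotes works like this:

import copy

b = copy.copy(a)   # shallow copy: every field value, including the old id
b.id = None        # clearing the id tells save() to INSERT a new row
b.save()           # b now holds a freshly assigned primary key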

Re: Duplicate object

2006-01-11 Thread Eric Walstad
On Wednesday 11 January 2006 19:00, Mike wrote:
> Hi,
>
> How do I duplicate objects? Save_as style.
>
> Something like...
> a=choices.get_list()[0]
> a.poll_id = 2
> a.save()
>
> Except not to update the previous record
>
> Thanks,
> Mike

The following approach ('shallow' copy) has worked well for me:

import copy
b = copy.copy(a)
b.id = None
b.save()

Re: Duplicate object

2006-01-11 Thread Adrian Holovaty
On 1/11/06, Dody Suria Wijaya <[EMAIL PROTECTED]> wrote:
> I think you meant __dict__, instead of __dict__(). Secondly, that would
> still copy the primary key, which turns into an update query on save.
> An easy way to duplicate in the choice case would be to set id to a false
> value (0, None, or False),

Re: Duplicate object

2006-01-11 Thread Dody Suria Wijaya
I think you meant __dict__, instead of __dict__(). Secondly, that would still
copy the primary key, which turns into an update query on save. An easy way to
duplicate in the choice case would be to set id to a false value (0, None, or
False), and thus force the saving routine to generate a new one and
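
Putting Dody's two corrections together with the snippet he is replying to
gives something like the following; choices, Choice, and get_list() are the
pre-magic-removal API quoted elsewhere in the thread, and the assumption that
__dict__ holds only field values is the thread's, not a guarantee:

a = choices.get_list()[0]       # any existing Choice instance
fields = dict(a.__dict__)       # __dict__ is an attribute, not a method
fields['id'] = None             # a false pk forces an INSERT on save()
fields['poll_id'] = 2           # attach the copy to a different poll
new_choice = choices.Choice(**fields)
new_choice.save()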

Re: Duplicate object

2006-01-11 Thread Jeremy Dunck
On 1/11/06, Adrian Holovaty <[EMAIL PROTECTED]> wrote:
> a = choices.get_list()[0]
> a.poll_id = 2
> new_choice = choices.Choice(**a.__dict__())
> new_choice.save()

Oh, uh, wouldn't you not want to assign poll_id at all on the new obj?
Assigning =2 could overlay an existing id r

Re: Duplicate object

2006-01-11 Thread Mike
Thanks Adrian. Very useful.

Mike

Re: Duplicate object

2006-01-11 Thread Jeremy Dunck
On 1/11/06, Adrian Holovaty <[EMAIL PROTECTED]> wrote:
> For example the following two statements are
> functionally identical:
>
> write_love_letter(to='python', from='adrian')
> write_love_letter(**{'to': 'python', 'from': 'adrian'})

But we'd need unicode to do this: ;-)

write_love_let
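
A runnable illustration of the keyword-expansion equivalence Adrian describes,
with one wrinkle worth noting: from is a reserved word in Python, so only the
**dict spelling actually parses. write_love_letter is a toy function here, not
anything from Django:

def write_love_letter(**kwargs):
    # Accept arbitrary keywords so the function can receive 'from',
    # which could never be declared as a named parameter.
    return "Dear %(to)s, with love from %(from)s" % kwargs

# write_love_letter(to='python', from='adrian')   # SyntaxError: 'from' is reserved
print(write_love_letter(**{'to': 'python', 'from': 'adrian'}))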

Re: Duplicate object

2006-01-11 Thread Adrian Holovaty
On 1/11/06, Mike <[EMAIL PROTECTED]> wrote:
> How do I duplicate objects? Save_as style.
>
> Something like...
> a=choices.get_list()[0]
> a.poll_id = 2
> a.save()
>
> Except not to update the previous record

You could do this:

a = choices.get_list()[0]
a.poll_id = 2
new_choice = choices.Choice(**a.__dict__())
new_choice.save()

Duplicate object

2006-01-11 Thread Mike
Hi,

How do I duplicate objects? Save_as style.

Something like...

a=choices.get_list()[0]
a.poll_id = 2
a.save()

Except not to update the previous record.

Thanks,
Mike