Hello.

I managed to solve my problem: I use session.execute(...) instead of
session.delete(), because the latter deletes more than it should.
The working code is:

    # Move some client products from a duplicate to the original.
    # Remove duplicate clients afterwards (in cascade).
    #
    # Note that client_map is a dict from a duplicate to its original.
    for each_duplicate, each_client in client_map.iteritems():
        for each_cp in each_duplicate.client_products:
            if some_condition(each_cp):
                each_cp.client = each_client
    session.flush()
    table = Client.__table__
    duplicate_ids = [each.id for each in client_map.iterkeys()]  # keys are the duplicates
    q = table.delete().where(table.c.id.in_(duplicate_ids))
    session.execute(q)
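
For reference, session.execute(q) emits a single table-level DELETE, so the
ORM's delete cascade never runs; the ON DELETE CASCADE on the FK then removes
the client_product rows at the database level. Roughly (assuming the mapped
table is named client and there are two duplicate ids):

    print(q)
    # DELETE FROM client WHERE client.id IN (:id_1, :id_2)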

I still want to know whether this is expected behaviour and if so, why:
 1. ClientProduct has a FK to Client. Client has a relationship client_products
with ON DELETE CASCADE. Thus if a client is deleted, all its client_products
are deleted too.
 2. Suppose I have a client with two client_products.
 3. I move one of them to a different client:
client.client_products[0].client = other_client.
 4. I delete the client: session.delete(client)
 5. session.deleted now contains TWO ClientProduct instances instead of ONE,
even though I moved one of them to a completely different client (steps 2-5
are spelled out in the standalone sketch below).
 6. Why?! Can I do anything to prevent this, e.g. insert a call to
session.flush(), session.expunge(), session.refresh() or some such somewhere?
All my attempts with session.flush() failed (they had no effect).
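
In case it helps, here is a minimal standalone sketch of steps 2-5 (the
mappings are a simplified, assumed sketch of my real ones; the two
relationships are declared independently, without a backref):

    from sqlalchemy import Column, ForeignKey, Integer, create_engine
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import relationship, sessionmaker

    Base = declarative_base()

    class Client(Base):
        __tablename__ = 'client'
        id = Column(Integer, primary_key=True)
        # Deleting a Client cascades to its ClientProducts (step 1).
        client_products = relationship('ClientProduct',
                                       cascade='all, delete-orphan')

    class ClientProduct(Base):
        __tablename__ = 'client_product'
        id = Column(Integer, primary_key=True)
        client_id = Column(Integer, ForeignKey('client.id'), nullable=False)
        client = relationship('Client')

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()

    # Step 2: a client with two client_products.
    client = Client(client_products=[ClientProduct(), ClientProduct()])
    other_client = Client()
    session.add_all([client, other_client])
    session.flush()

    # Step 3: move one of them to a different client.
    client.client_products[0].client = other_client

    # Step 4: delete the client.
    session.delete(client)

    # Step 5: session.deleted holds the Client and BOTH ClientProducts,
    # even though one of them now belongs to other_client.
    print(session.deleted)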


Thank you,

Ladislav Lenart


On 8.11.2012 17:30, Ladislav Lenart wrote:
> Hello.
> 
> I have a client which has a collection of ClientProduct-s (ClientProduct has a
> FK to Client). The following code:
> 
>     # Move some client products from a duplicate to the original.
>     # Remove duplicate clients afterwards (in cascade).
>     #
>     # Note that client_map is a dict from a duplicate to its original.
>     for each_duplicate, each_client in client_map.iteritems():
>         for each_cp in each_duplicate.client_products:
>             if some_condition(each_cp):
>                 each_cp.client = each_client
>         session.delete(each_duplicate)
>     session.flush()
> 
> deletes a client product that was moved from each_duplicate to each_client in
> the inner loop. Why? What can I do to prevent it?
> 
> 
> Thank you in advance,
> 
> Ladislav Lenart
