Amazing! I think this might actually work... Thanks Mike, I'll keep the 
thread updated with a working solution when I have one.
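
For the archive, here's the rough shape of what I'm going to try for 
objective 1 — an untested sketch, where the `_OrderLine` helper class and 
other names are my own. The idea is that a small line class still exists, 
but only as a mapping-layer detail: the domain-facing attribute becomes a 
plain `{sku: qty}` dict via `association_proxy` over a dict-keyed 
relationship.

```python
from sqlalchemy import (
    Column, ForeignKey, Integer, MetaData, String, Table,
)
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.orm import relationship

try:  # SQLAlchemy 1.x classical mapping
    from sqlalchemy.orm import mapper
except ImportError:  # SQLAlchemy 2.x spells it map_imperatively
    from sqlalchemy.orm import registry
    mapper = registry().map_imperatively

try:  # SQLAlchemy 2.x name
    from sqlalchemy.orm import attribute_keyed_dict
except ImportError:  # SQLAlchemy 1.x name
    from sqlalchemy.orm.collections import (
        attribute_mapped_collection as attribute_keyed_dict,
    )

metadata = MetaData()

order = Table(
    'order', metadata,
    Column('id', Integer, primary_key=True, autoincrement=True),
)
order_lines = Table(
    'order_lines', metadata,
    Column('order_id', ForeignKey('order.id'), primary_key=True),
    Column('sku', String(255), primary_key=True),
    Column('qty', Integer),
)


class Order:  # pure domain class, no SQLAlchemy dependency
    pass


class _OrderLine:  # mapping-layer helper, invisible to domain code
    def __init__(self, sku, qty):
        self.sku = sku
        self.qty = qty


mapper(_OrderLine, order_lines)
mapper(Order, order, properties={
    # collection keyed by sku: {'CHAIR': _OrderLine(...), ...}
    '_lines': relationship(
        _OrderLine,
        collection_class=attribute_keyed_dict('sku'),
        cascade='all, delete-orphan',
    ),
})
# Expose a plain {sku: qty} dict; for dict collections the creator
# receives the (key, value) pair and builds the helper object.
Order.lines = association_proxy(
    '_lines', 'qty',
    creator=lambda sku, qty: _OrderLine(sku, qty),
)
```

The key trick (if I've understood it right) is `attribute_keyed_dict('sku')` 
to key the relationship collection by sku, plus the two-argument creator on 
the proxy, so `order.lines['CHAIR'] = 3` round-trips through the helper 
class without the domain model ever seeing it.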

On Thursday, 14 February 2019 18:16:27 UTC, Harry wrote:
>
> Hello SQLAlchemy friends, I'm struggling to get SQLAlchemy to do what I 
> want.
>
> I'd like to use "classical mappings" so that my business objects are free 
> from any direct dependency on sqlalchemy.
>
> I'm representing two different objects that are very similar:  customer 
> orders and warehouse stock.  Both are essentially a collection of lines 
> mapping product codes ("sku") to quantities.
>
> Here's some not-quite-Python pseudocode:
>
>
> Objective 1:  Order and Warehouse class have a dict attribute representing 
> skus and quantities
>
> class Order:
>     lines: dict # {sku: qty}
>
>
> class Warehouse:
>     lines: dict # {sku: qty}
>
>
> My basic problem here is that I can't figure out how to get SQLAlchemy to 
> populate a dict of raw values (not mapped objects) from a joined/related 
> table.
>
>
>
> Objective 2 (backup plan): the Order and Warehouse classes have an 
> attribute .lines which is a list of "Line" objects
>
> class Line:
>     sku: str
>     qty: int
>
>
> class Order:
>     lines: List[Line]
>
>
> class Warehouse:
>     lines: List[Line]
>
>
>
>
> My problem here is that SQLAlchemy really doesn't like a single class 
> (Line) being effectively mapped to two tables.
>
>
>
> The tables look something like this (actual code):
>
>
> order = Table(
>     'order', metadata,
>     Column('id', Integer, primary_key=True, autoincrement=True),
> )
> order_lines = Table(
>     'order_lines', metadata,
>     Column('order_id', ForeignKey('order.id'), primary_key=True),
>     Column('sku', String(255), primary_key=True),
>     Column('qty', Integer),
> )
> warehouse = Table(
>     'warehouse', metadata,
>     Column('id', Integer, primary_key=True, autoincrement=True),
> )
> warehouse_lines = Table(
>     'warehouse_lines', metadata,
>     Column('warehouse_id', ForeignKey('warehouse.id'), primary_key=True),
>     Column('sku', String(255), primary_key=True),
>     Column('qty', Integer),
> )
>
>
>
>
> But configuring the mappers is beyond me. I never got anything remotely 
> close to working for option 1.
>
> And for option 2, having the .lines attribute mapped as a relationship 
> but reusing the same Line class doesn't seem to work at all:
>
>
> mapper(domain_model.Line, order_lines)
> mapper(domain_model.Order, order, properties={
>     'lines': relationship(domain_model.Line, cascade="all, delete-orphan")
> })
> warehouse_line_mapper = mapper(domain_model.Line, warehouse_lines, 
>                                non_primary=True)
> mapper(domain_model.Warehouse, warehouse, properties={
>     'lines': relationship(warehouse_line_mapper, cascade="all, delete-orphan")
> })
>
>
>
> If anyone can help, I'd love to see if I can get option 1 working, where 
> SQLAlchemy builds dicts for the Order and Warehouse classes from the 
> order_lines and warehouse_lines tables, without needing a class to 
> represent those lines. Alternatively, the backup plan using a single Line 
> class for both tables would be a consolation prize. But I'd really like 
> to avoid being forced to create two different classes to represent the 
> lines for order and warehouse just to make my ORM happy.
>
> gist with stripped-down example code here: 
> https://gist.github.com/hjwp/09fd282062e934eeb2a46a40945e48c8
>
>
> thanks in advance!
>

-- 
SQLAlchemy - 
The Python SQL Toolkit and Object Relational Mapper

http://www.sqlalchemy.org/

To post example code, please provide an MCVE: Minimal, Complete, and Verifiable 
Example.  See  http://stackoverflow.com/help/mcve for a full description.