Yes, I am very glad to be free of multiversion and bitemporal
concerns, although I will eventually be setting this up for partial
multimaster asynchronous replication (but with a lot of intentional
compromises in order to avoid the major, currently unsolved, problems
with that entire field).

As for different NodeTypes and polymorphic associations to values, I
am indeed doing something similar, but I do not allow the polymorphism
to exist on the Python side of the code/database divide, nor do I let
it enter the ORM layer in the mappers.  All of my polymorphism is
strictly contained within the database, using a 'vertical tables'
approach.  All of that dbcook stuff scares me, though I think I can
see why you want it.  I will eventually need to wade into the waters
of stricter type checking and conversion on my tables, and that will
get me into a lot of similar concerns, though hopefully not as deeply!
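
To make the 'vertical tables' idea concrete, here is a rough sketch of
the kind of layout I mean -- the 'items' and 'item_values' names are
made up purely for illustration (they are not my real schema), and it
assumes the same metadata, itemtypes and features tables as in the
test case quoted further down:

from sqlalchemy import Table, Column, Integer, Text, ForeignKey

# One row per concrete item, whatever its type.
items = Table('items', metadata,
    Column('id', Integer, primary_key=True),
    Column('itemtype_name', Text, ForeignKey('itemtypes.name')))

# The 'vertical' part: one row per (item, feature) pair.  A Boot and a
# Ski differ only in which rows they have here, not in their Python
# class, so the polymorphism never reaches the mappers.
item_values = Table('item_values', metadata,
    Column('item_id', Integer, ForeignKey('items.id'), primary_key=True),
    Column('feature_id', Integer, ForeignKey('features.id'),
           primary_key=True),
    Column('value', Text))

With that shape, adding a new item type or feature means inserting
rows, never adding columns, classes or mappers.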

Also, yes, there are definitely a lot of 80/20 concerns here, and a
willingness to go for creative and seemingly partial solutions when
appropriate.

I found this Wikipedia article very interesting (though I had already
reinvented most of what it describes before I ever knew about it):

http://en.wikipedia.org/wiki/Entity-attribute-value_model

It was encouraging to reinvent a wheel and later find that the design
decisions I had made aligned nicely with an entire existing field of
wheels.  In this case I don't consider the work of reinvention to be
wasted, because this project has been my vehicle for learning Python,
sqlalchemy and much more.

Wish I could offer to work with you, but this is my spare-time
project; I don't even have a coding job yet.  I'm supporting myself
with manual labor while rolling this out.  I think my project will at
least get me a decent job once it's far enough into beta to show to
anyone; or possibly it could become my own business next year.

Eric


On Nov 25, 1:04 pm, [EMAIL PROTECTED] wrote:
> > Thanks for your reply.  I have been following your posts with
> > interest over the past half year (or perhaps even longer).  At
> > first I thought you were crazy.  But now I've found myself creating
> > a model of similar complexity, as necessary to express the domain
> > I'm working on.
>
> I think you're better off than me, as you don't have the burden of
> the bitemporal/multiversion stuff, or of completely different
> NodeTypes (hence the polymorphic associations to values).  It seems
> your domain is more mathematically tractable - or you haven't hit
> the edge of the door yet.
>
> Having fewer types is also a relief: you don't need a dbcook layer
> to automate/hide the mappings and the value-type-related stuff
> (right now, in a prototype scenario, I have about 100 tables, 20 m2m
> associations and 220 foreign keys - the final figures will probably
> be 200/100/600 or similar... it's unthinkable to handle them one by
> one, especially when most of the team has never seen SQL).
>
> First, make sure your model is correct.  Then note that this
> full_heritage() function (mine has a similar name,
> order_of_inheritance()) has the problem of firing too many queries
> itself.  Once you start counting the queries fired around your
> current model, you'll realize that such step-by-step recursive code
> - which is nice and perfectly fine in C/Python/Lisp/whatever - isn't
> going to be very fast... though this also depends on the size of the
> searchable data.  Yours is simple, and maybe 1000 queries won't be
> too bad; in my case, one awfully complex bitemporal query is 100x
> faster than 1000 smaller bitemporal ones.  I've found that in
> certain cases it can be better to fetch more data than needed (say
> 2x) and filter it further in Python than to go crazy expressing the
> filter in flat set arithmetic (which is what SQL is, more or less).
> (Small) compromises are not a bad thing if placed properly.
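>
> For your itemtype case, for example, that "fetch once, then walk in
> Python" idea could look roughly like the sketch below - untested,
> and not necessarily the shape of your actual full_heritage(); it
> only assumes the itemtype_inheritance table from your test case:
>
> # One query for the whole parent graph, then pure-Python recursion.
> parent_map = {}   # child name -> list of parent names
> for child, parent in itemtype_inheritance.select().execute():
>     parent_map.setdefault(child, []).append(parent)
>
> def full_heritage(name, found=None):
>     # Depth-first walk over the in-memory map; no further SQL fired.
>     found = found if found is not None else []
>     for parent in parent_map.get(name, []):
>         if parent not in found:
>             found.append(parent)
>             full_heritage(parent, found)
>     return found
>
> # full_heritage('Boot') -> ['Footwear', 'Product']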
>
> Beware: my initial code for this graph-data-inheritance was about
> the size of yours, while currently it's about 5x bigger - but it can
> work either way (oh well, 95% there).
>
> The moral I took from all this: if the whole wardrobe won't fit
> through the door, there are a lot of seemingly partial solutions
> that in the end may serve better than the initial requirement - and
> should NOT be ignored - but they are notoriously hard to spot,
> because the obvious initial "solution" obscures them.  Be it turning
> the wardrobe upside down, disassembling or cutting it (or the
> door... or another door), replacing it with three small
> cupboards... swapping rooms... burning it and using plain hangers on
> the walls... moving elsewhere...
>
> Have fun, and eeer... I'm sort of looking for a job; too bad this
> _very_ "interesting times" project doesn't pay the bills well.
> svil
> www.svilendobrev.com
>
> > The purpose of my model is to "ingest" all of the easily
> > expressible facts about the characteristics ('features') of
> > categories and (recursive) subcategories of items within specific
> > sub-regions of a domain of physical items, and to encode and store
> > those facts in a minimally redundant and maximally searchable /
> > indexable form.  This supports an "instant search" / "search while
> > you type" widget that is fully aware of the conceptual structure
> > within the domain being searched, unlike a mere full-text search
> > over a flatfile or an ordinary (non-category-structure-aware)
> > indexing of a flatfile.  This awareness should bring significant
> > benefits in reducing the search to its minimal consistent
> > combinations of targets, and a sense of "just bringing up exactly
> > what the user was looking for".
>
> > In the weeks ahead I will revisit some of the threads you listed.
> > Thank you for the conclusions and suggestions you mentioned; they
> > seem reasonable.
>
> > Eric
>
> > On Nov 25, 2:42 am, [EMAIL PROTECTED] wrote:
> > > Eric
> > > I'm not sure I follow all of your mapping setup, as it's too
> > > detailed.  But:
>
> > > I've been battling similar data/feature inheritance + shading
> > > stuff along a branchy, recursive directed graph (not a pure tree,
> > > as it has alternative short-cut paths in it), all over bitemporal
> > > objects and values (i.e. m2m relations all over, plus grouping by
> > > time/objid), for almost half a year.  See these posts of mine:
> > >  - Many to many self referential relationship / 15.03
> > >  - multiple-aspects revisited / 23.06
> > >  - tree/graph persistency, concurency-safe? / 13.07
> > >  - unions in query?
> > >  - and probably most others
> > >  - as well as this thread in [EMAIL PROTECTED]:
> > >    "optimizing a query over tree-like structure", 2008-09-30
> > >    my setup (law, company, department(s)/recursive, position,
> > >    workplace, employment) is all explained there, minus the
> > >    bitemporalism.
>
> > > also see the thread "Is there a simple way to let records have
> > > the same groups as it parents", or just look up "data
> > > inheritance" in the group.
>
> > > And so far I've reached these decisions, based on all the
> > > experiments (I don't come from an SQL background, and
> > > OO/practicalism doesn't give much insight into what SQL can
> > > handle):
> > >  - the root-most branches are separate queries, and a pseudo
> > >    multi-query mimics a plain one over all of them (it could also
> > >    be a union of them all, or one single query - but the single
> > >    one has 20+ tables in the FROM, and the union fails here and
> > >    there).  It also turned out that different root-most branches
> > >    have slightly different meanings, hence it's good if they are
> > >    loaded separately.
> > >  - recursion is handled by expanding it, say, 3 levels deep,
> > >    hoping that no one will go further (i.e. a.x or a.a and (a.a.x
> > >    or a.a.a and (a.a.a.x or ...))) etc. - see the sketch after
> > >    this list.
> > >  - everything is generated by node-type-walking at the class
> > >    level, and the strategy of alternating at each level can be
> > >    different (i.e. it can start as a multi-query, continue as a
> > >    union on branch A and as a single query on branch B).  I can
> > >    give you this code if anyone dares read it...
> > >  - the query returns all values wherever they are reachable from
> > >    / associated with some end node.
> > >  - actual inheritance/shading etc. is done after that in Python.
> > >    It can (probably) be done at the SQL level with a very specific
> > >    ORDER BY, but it's a nightmarish bitemporal query already, so
> > >    there's no need to go any further into hell.
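>
> > > A very rough sketch of that depth-limited expansion, with a
> > > made-up self-referential nodes table (not my real schema - the
> > > real thing is generated by the node-type walker), assuming the
> > > usual "from sqlalchemy import *":
>
> > > meta = MetaData()
> > > nodes = Table('nodes', meta,
> > >     Column('id', Integer, primary_key=True),
> > >     Column('parent_id', Integer, ForeignKey('nodes.id')),
> > >     Column('x', Text))
>
> > > # Alias the table once per level and outer-join upward, so one
> > > # SELECT reaches 3 levels of parents instead of firing one query
> > > # per level.
> > > a0, a1, a2 = nodes.alias('a0'), nodes.alias('a1'), nodes.alias('a2')
> > > joined = a0.outerjoin(a1, a0.c.parent_id == a1.c.id) \
> > >            .outerjoin(a2, a1.c.parent_id == a2.c.id)
>
> > > # coalesce() takes the nearest non-NULL x up the chain, i.e. the
> > > # "a.x or a.a.x or a.a.a.x" fallback, flattened into one query.
> > > q = select([a0.c.id, func.coalesce(a0.c.x, a1.c.x, a2.c.x)],
> > >            from_obj=[joined])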
>
> > > The times I'm getting are of this kind: 10 nodes, with 10 values
> > > each, times 100 changes each, in about 20 sec, on a relatively
> > > slow machine / postgres.
>
> > > maybe we can work together to get something out of it.
>
> > > On Tuesday 25 November 2008 09:51:37 Eric Ongerth wrote:
> > > > Below, I have attached a working test case.  It works, yes --
> > > > but my question is about building an improved version of a
> > > > particular method on one of my classes.  The following model
> > > > will probably explain itself for the most part.  I'll let you
> > > > read it first, then offer a few explanatory notes afterward
> > > > just in case.  Finally, at the end, I will describe the
> > > > difference between what the method in question does now and
> > > > what I would like it to do.
>
> > > > The response I am seeking is a description of what I need to
> > > > do to build a better version of the method in question,
> > > > including any further insight into the practice of joining at
> > > > multiple levels of a recursive / self-referential (but
> > > > loop-free) graph.
>
> > > > ---snip---
>
> > > > from sqlalchemy import *
> > > > from sqlalchemy.sql import *
> > > > from sqlalchemy.orm import *
>
> > > > engine = create_engine('sqlite://')
>
> > > > metadata = MetaData(bind=engine)
>
> > > > itemtypes = Table('itemtypes', metadata,
> > > >     Column('name', Text, primary_key=True))
>
> > > > itemtype_inheritance = Table('itemtype_inheritance', metadata,
> > > >     Column('itemtype_name', Text, ForeignKey('itemtypes.name'),
> > > >            primary_key=True),
> > > >     Column('parent_name', Text, ForeignKey('itemtypes.name'),
> > > >            primary_key=True))
>
> > > > features = Table('features', metadata,
> > > >     Column('id', Integer, primary_key=True),
> > > >     Column('name', Text),
> > > >     Column('root_itemtype_name', Text, ForeignKey('itemtypes.name')))
>
> > > > feature_dependencies = Table('feature_dependencies', metadata,
> > > >     Column('dependent_id', Integer, ForeignKey('features.id'),
> > > >            primary_key=True),
> > > >     Column('determinant_id', Integer, ForeignKey('features.id'),
> > > >            primary_key=True))
>
> > > > metadata.drop_all()
> > > > metadata.create_all()
>
> > > > itemtypes.insert().execute([
> > > >     {'name': 'Product'},
> > > >     {'name': 'Footwear'},
> > > >     {'name': 'Boot'},
> > > >     {'name': 'Ski'}
> > > > ])
>
> > > > itemtype_inheritance.insert().execute([
> > > >     {'itemtype_name': 'Footwear', 'parent_name': 'Product'},
> > > >     {'itemtype_name': 'Boot', 'parent_name': 'Footwear'},
> > > >     {'itemtype_name': 'Ski', 'parent_name': 'Product'}
> > > > ])
>
> > > > features.insert().execute([
> > > >     {'id': 1, 'name': 'Manufacturer', 'root_itemtype_name': 'Product'},
> > > >     {'id': 2, 'name': 'Model', 'root_itemtype_name': 'Product'},
> > > >     {'id': 3, 'name': 'Year', 'root_itemtype_name': 'Product'},
> > > >     {'id': 4, 'name': 'Gender', 'root_itemtype_name': 'Footwear'},
> > > >     {'id': 5, 'name': 'US Shoe Size', 'root_itemtype_name': 'Footwear'},
> > > >     {'id': 6, 'name': 'Length', 'root_itemtype_name': 'Ski'},
> > > >     {'id': 7, 'name': 'Weight', 'root_itemtype_name': 'Product'}
> > > > ])
>
> > > > feature_dependencies.insert().execute([
> > > >     {'dependent_id': 7, 'determinant_id': 1},
> > > >     {'dependent_id': 7, 'determinant_id': 2},
> > > >     {'dependent_id': 7, 'determinant_id': 3},
> > > >     {'dependent_id': 7, 'determinant_id': 4},
> > > >     {'dependent_id': 7, 'determinant_id': 5},
> > > >     {'dependent_id': 7, 'determinant_id': 6}
> > > > ])
>
> > > > class Itemtype(object):
>
> ...