[sqlalchemy] DB Redundancy

2009-05-05 Thread Vic
I'm looking for a way to have my DB replicated in REAL TIME so it can be used in case I lose my primary copy. I saw that two-phase commit exists, but I'm not sure if that is the correct option. I have the feeling that it would be abusing a mechanism intended for coordinating two separate DBs and not cre
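For reference, a minimal sketch of what two-phase commit in SQLAlchemy is aimed at: coordinating one transaction across two independent databases, which is a different job than keeping a standby replica. The engine URLs, table names and the twophase/binds settings below are illustrative only:

    from sqlalchemy import create_engine, MetaData, Table, Column, Integer
    from sqlalchemy.orm import sessionmaker

    # placeholder engines; real use would point at two separate databases
    primary = create_engine('sqlite://')
    secondary = create_engine('sqlite://')

    meta = MetaData()
    orders = Table('orders', meta, Column('id', Integer, primary_key=True))
    invoices = Table('invoices', meta, Column('id', Integer, primary_key=True))

    # one session spanning both engines; on commit() each backend is asked to
    # PREPARE before the final COMMIT, so both databases apply the change or neither does
    Session = sessionmaker(twophase=True,
                           binds={orders: primary, invoices: secondary})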

[sqlalchemy] Re: last_inserted_ids and ORM

2009-05-05 Thread Michael Bayer
Mike Conley wrote:
> Does the idea of last_inserted_ids exist for ORM?
>
> I do
>     session.add(someobj)
>     session.commit()
> and then want the id of the newly inserted object.
>
> I can reference someobj.id, but this generates a select call to the database to get the id before yo
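For context, the SELECT after commit() happens because the session expires all instance attributes on commit by default; a rough sketch of two ways around it (class and attribute names are placeholders, not from the thread):

    from sqlalchemy.orm import sessionmaker

    Session = sessionmaker(bind=engine)          # 'engine' assumed configured elsewhere
    session = Session()

    someobj = Someobj(name='x')                  # Someobj: any mapped class
    session.add(someobj)
    session.flush()                              # INSERT is emitted here
    new_id = someobj.id                          # already populated, no extra SELECT
    session.commit()

    # or configure the session so commit() does not expire loaded attributes:
    Session = sessionmaker(bind=engine, expire_on_commit=False)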

[sqlalchemy] Re: One-to-many relation fails with "unsaved, pending instance and is an orphan"

2009-05-05 Thread Michael Bayer
Christoph Haas wrote:
> So my "Session.commit()" should do the database action and create one row for the user and one row for the item. So why is there a problem with the autoflushing? SQLAlchemy could save a new logbookentry to the database referring via foreign keys to the user and item
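A rough sketch of the kind of mapping that produces this error, assuming a delete-orphan cascade (names loosely follow the thread, but this is not the original poster's code):

    from sqlalchemy import Column, Integer, ForeignKey, create_engine
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import relation, sessionmaker

    Base = declarative_base()

    class User(Base):
        __tablename__ = 'users'
        id = Column(Integer, primary_key=True)
        # delete-orphan means an entry may not exist without a parent User
        entries = relation("LogbookEntry", cascade="all, delete-orphan", backref="user")

    class LogbookEntry(Base):
        __tablename__ = 'entries'
        id = Column(Integer, primary_key=True)
        user_id = Column(Integer, ForeignKey('users.id'))

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()

    entry = LogbookEntry()
    session.add(entry)
    # a flush (autoflush or commit) now fails with the "unsaved, pending instance
    # and is an orphan" error, because the entry was never attached to a User;
    # appending it to user.entries before the flush avoids it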

[sqlalchemy] last_inserted_ids and ORM

2009-05-05 Thread Mike Conley
Does the idea of last_inserted_ids exist for ORM? I do
    session.add(someobj)
    session.commit()
and then want the id of the newly inserted object. I can reference someobj.id, but this generates a select call to the database. If the insert was done with SQL syntax, something like:
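For comparison, a rough sketch of the SQL-expression side (0.5-era API; table and column names are made up), where the result proxy already carries the new primary key:

    from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

    engine = create_engine('sqlite://')
    meta = MetaData()
    users = Table('users', meta,
                  Column('id', Integer, primary_key=True),
                  Column('name', String(50)))
    meta.create_all(engine)

    result = engine.execute(users.insert().values(name='ed'))
    new_id = result.last_inserted_ids()[0]   # renamed inserted_primary_key in later releases

The ORM equivalent is to flush the session and then read someobj.id, as discussed in the reply above.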

[sqlalchemy] Re: One-to-many relation fails with "unsaved, pending instance and is an orphan"

2009-05-05 Thread Christoph Haas
Michael, thanks a lot for your reply. I haven't yet understood your explanation completely, so please allow me to ask further. On Monday, 4 May 2009 23:01:01, Michael Bayer wrote:
> the key to the problem is in the traceback:
>
> Traceback (most recent call last):
>   File "test.py", line 80, in

[sqlalchemy] Re: limit

2009-05-05 Thread Tiago Becker
Sorry for the mess, but now I have another problem ;-)
    s = select(columns=' * ', from_obj=' pessoa ', limit=1)
or
    s = select(columns=' * ', from_obj=' pessoa ').limit(1)
results in
    SELECT , * FROM pessoa LIMIT 1
but LIMIT 1 is not valid in Oracle... is it a sqlalchemy error? Or it shouldn

[sqlalchemy] Re: limit

2009-05-05 Thread Tiago Becker
Damn, my mistake, sorry, didn't see the [ ] :-) On Tue, May 5, 2009 at 3:15 PM, Tiago Becker wrote:
> Thnx for the quick reply! :-)
>
> I don't think I got it...
>
> The output of:
>     s = select('select * from table ').offset(1).limit(1)
> is...
>     SELECT s, e, l, c, t, , *, f, r, o, m, a, b
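For readers following along, a rough sketch of why the string got split up and what the [ ] changes (0.5-era string coercion; 'pessoa' is the table from the earlier message):

    from sqlalchemy import select

    # select() iterates its first argument as a collection of columns, so a bare
    # string is consumed character by character -- hence SELECT s, e, l, c, t, ...
    bad = select('select * from table ')

    # wrapping the column expression in a list gives the intended statement
    good = select(['*'], from_obj='pessoa').offset(1).limit(1)
    # roughly: SELECT * FROM pessoa LIMIT 1 OFFSET 1 (each dialect renders its own limiting)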

[sqlalchemy] Re: Logging

2009-05-05 Thread Marcin Krol
Michael Bayer wrote:
> assuming it works for you as it does for me, figure out what's different about this program versus yours.
Got it working, thanks -- you may want to add to the documentation that logging has to be configured BEFORE the create_engine() call or else it won't work... Thanks, m
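For anyone hitting the same thing, a rough sketch of the ordering that worked here (the log file name is a placeholder); alternatively, create_engine(..., echo=True) logs to stdout without any prior logging setup:

    import logging
    logging.basicConfig(filename='sql.log')
    logging.getLogger('sqlalchemy.engine').setLevel(logging.INFO)

    from sqlalchemy import create_engine
    engine = create_engine('sqlite://')      # created only after logging is configured
    engine.execute("select 1").fetchall()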

[sqlalchemy] Re: limit

2009-05-05 Thread Tiago Becker
Thnx for the quick reply! :-) I don't think I got it... The output of:
    s = select('select * from table ').offset(1).limit(1)
is...
    SELECT s, e, l, c, t, , *, f, r, o, m, a, b LIMIT 1 OFFSET 1
... Can you please explain what I am doing wrong? Thnx a lot! On Tue, May 5, 2009 at 2:55 PM, Mich

[sqlalchemy] Re: limit

2009-05-05 Thread Michael Bayer
see http://www.sqlalchemy.org/docs/05/sqlexpression.html#ordering-grouping-limiting-offset-ing . Tiago Becker wrote:
> Hello.
>
> I'm trying to write some kind of web framework, and I would like to use sqlalchemy, but I need to make a paged result, and every DB has a way to limit the query.

Re: Logging (was: Re: [sqlalchemy] Re: Bad updates -- caching problem?)

2009-05-05 Thread Michael Bayer
try running this program:
    from sqlalchemy import create_engine
    import logging
    logging.basicConfig()
    logging.getLogger("sqlalchemy.engine").setLevel(logging.DEBUG)
    engine = create_engine('sqlite://')
    engine.execute("select 1").fetchall()
assuming it works for you as it does for me, figure ou

[sqlalchemy] limit

2009-05-05 Thread Tiago Becker
Hello. I'm trying to write some kind of web framework, and I would like to use sqlalchemy, but I need to make a paged result, and every DB has a way to limit the query... Is there a way to do this in alchemy? Note: it's a query defined in XML, so I use pure SQL (this part will just use the alchem
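In case it helps, a rough sketch of the DB-agnostic way to page (the table definition is illustrative): build the statement with the SQL expression layer and let each dialect render its own limiting construct (LIMIT/OFFSET on Postgres, MySQL and SQLite, a ROWNUM-style wrapper on Oracle):

    from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String, select

    engine = create_engine('sqlite://')         # stand-in engine
    meta = MetaData()
    pessoa = Table('pessoa', meta,
                   Column('id', Integer, primary_key=True),
                   Column('nome', String(100)))
    meta.create_all(engine)

    page, page_size = 2, 25
    stmt = select([pessoa]).limit(page_size).offset((page - 1) * page_size)
    rows = engine.execute(stmt).fetchall()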

Logging (was: Re: [sqlalchemy] Re: Bad updates -- caching problem?)

2009-05-05 Thread Marcin Krol
Michael Bayer wrote:
>> logging.basicConfig(filename=globalpath + os.sep + 'sql.log')
>> logging.getLogger('sqlalchemy.engine').setLevel(logging.DEBUG)
>> logging.getLogger('sqlalchemy.orm.unitofwork').setLevel(logging.DEBUG)
>
> the logging configuration is not working. You should see DEBUG lin

[sqlalchemy] Re: Bad updates -- caching problem?

2009-05-05 Thread Michael Bayer
Marcin Krol wrote:
> I have indeed turned that on and looked there for hints, but I haven't found anything - e.g. I can see queries and parameters, but not values returned by SQL:
>
> INFO:sqlalchemy.engine.base.Engine.0x...a98c:SELECT newhosts.id AS newhosts_id, newhosts.ip AS newhosts
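A possible sketch of getting the returned rows into the log as well: statement and parameter logging happens at INFO, while result rows are only emitted at DEBUG, which is what echo='debug' turns on (the engine URL is a placeholder):

    from sqlalchemy import create_engine
    engine = create_engine('sqlite://', echo='debug')
    engine.execute("select 1").fetchall()   # result rows are echoed along with the SQL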

[sqlalchemy] Re: Bad updates -- caching problem?

2009-05-05 Thread Michael Bayer
Marcin Krol wrote:
>> the Session itself should be closed out after each request (i.e. session.close())
>
> Really? Nowhere in the docs have I read that I should do that, really...
here is a diagram illustrating the whole thing, with four paragraphs of discussion below: http://www.sqlalchemy.org

[sqlalchemy] Re: Bad updates -- caching problem?

2009-05-05 Thread Marcin Krol
Michael Bayer wrote:
> is there any caching in use? global variables? the Session itself should be closed out after each request (i.e. session.close()) so it's not involved in the equation
I'm doing session.close() after each update now and yet the problem persists. Regards, mk

[sqlalchemy] *Proper* way of handling sessions

2009-05-05 Thread Marcin Krol
Hello everyone, I have discovered I can get around the problem I described in 'Bad updates -- caching problem?' by creating a session at the beginning of processing each http request, like this:
    def handler(req):
        global httpses, session
        Ses = sessionmaker(bind=eng)
        session = Ses()
The thi
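For contrast, a rough sketch of the shape Michael describes in the other thread: configure the sessionmaker once at module level, open a short-lived session inside each request, and always close it (handler/req/eng mirror the names in the post; the rest is illustrative):

    from sqlalchemy.orm import sessionmaker

    Session = sessionmaker(bind=eng)       # eng: the application's engine, configured once

    def handler(req):
        session = Session()
        try:
            # ... load and modify objects, then commit ...
            session.commit()
            return 'OK'
        except:
            session.rollback()
            raise
        finally:
            session.close()                # nothing lingers between requests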

[sqlalchemy] Re: Bad updates -- caching problem?

2009-05-05 Thread Marcin Krol
Hello Michael, Thanks for the quick answer! Michael Bayer wrote:
> Marcin Krol wrote:
>> rsv = session.query(Reservation).filter(Reservation.id == int(rid)).first()
>>
>> rhost = session.query(Host).filter(Host.id.in_(rhostsel)).order_by(Host.ip).first()
>>
>> host = session.query(Host).f

[sqlalchemy] Re: MYSQL cascade constraint on column

2009-05-05 Thread Michael Bayer
See http://www.sqlalchemy.org/docs/05/metadata.html#on-update-and-on-delete as well as http://www.sqlalchemy.org/docs/05/reference/sqlalchemy/schema.html?highlight=foreignkey#sqlalchemy.schema.ForeignKey . mhearne808[insert-at-sign-here]gmail[insert-dot-here]com wrote: > > Hello - I am developi
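A minimal sketch of what those links cover, with made-up table names: the ON UPDATE / ON DELETE rules are given to ForeignKey (or ForeignKeyConstraint) and end up in the emitted DDL:

    from sqlalchemy import MetaData, Table, Column, Integer, ForeignKey

    meta = MetaData()
    parent = Table('parent', meta, Column('id', Integer, primary_key=True))
    child = Table('child', meta,
                  Column('id', Integer, primary_key=True),
                  Column('parent_id', Integer,
                         ForeignKey('parent.id', onupdate='CASCADE', ondelete='CASCADE')))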

[sqlalchemy] Bad updates -- caching problem?

2009-05-05 Thread Marcin Krol
Hello everyone, P.S. I create the session in another module like this:
    Session = sessionmaker(bind=eng)
    session = Session()
Then I import 'session' from that module in the main web app. P.P.S. Backend is Postgres 8.1. I have added a part for updating objects in my (web) application and get very weird ef

[sqlalchemy] Re: Bad updates -- caching problem?

2009-05-05 Thread Michael Bayer
Marcin Krol wrote:
> rsv = session.query(Reservation).filter(Reservation.id == int(rid)).first()
>
> rhost = session.query(Host).filter(Host.id.in_(rhostsel)).order_by(Host.ip).first()
>
> host = session.query(Host).filter(Host.id.in_(hostsel)).order_by(Host.ip).first()
these three queries

[sqlalchemy] Bad updates -- caching problem?

2009-05-05 Thread Marcin Krol
Hello everyone, I have added a part for updating objects in my (web) application and get a very weird effect: if I update the same object several times, e.g. add and delete some Hosts from Reservation.hosts (many-to-many relation), on subsequent reads I get either a new value or one of the o

[sqlalchemy] Re: insert and joined mappers

2009-05-05 Thread Alessandro Dentella
On Tue, May 05, 2009 at 06:01:27AM -0700, GHZ wrote:
> try:
>     m = mapper(MyJoin, a_table.join(b_table), properties={
>         'a_id' : [Table_a.__table__.c.id, Table_b.__table__.c.a_id]
>     })
> from: http://www.sqlalchemy.org/docs/05/mappers.html#mapping-a-class-against-multiple-tables

[sqlalchemy] Re: insert and joined mappers

2009-05-05 Thread GHZ
try:
    m = mapper(MyJoin, a_table.join(b_table), properties={
        'a_id' : [Table_a.__table__.c.id, Table_b.__table__.c.a_id]
    })
from: http://www.sqlalchemy.org/docs/05/mappers.html#mapping-a-class-against-multiple-tables On May 5, 11:46 am, Alessandro Dentella wrote:
> Hi,
>
> how should
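Filled out a bit, a rough sketch of that pattern (table and column names are invented, loosely following the thread): the class is mapped to the join, and the two columns linked by the foreign key are listed under one property so an INSERT writes both tables consistently:

    from sqlalchemy import MetaData, Table, Column, Integer, String, ForeignKey, create_engine
    from sqlalchemy.orm import mapper, sessionmaker

    meta = MetaData()
    a_table = Table('a', meta,
                    Column('id', Integer, primary_key=True),
                    Column('descr', String(50)))
    b_table = Table('b', meta,
                    Column('a_id', Integer, ForeignKey('a.id'), primary_key=True),
                    Column('extra', String(50)))

    class MyJoin(object):
        pass

    mapper(MyJoin, a_table.join(b_table), properties={
        # a.id and b.a_id hold the same value, exposed as one attribute
        'a_id': [a_table.c.id, b_table.c.a_id],
    })

    engine = create_engine('sqlite://')
    meta.create_all(engine)
    session = sessionmaker(bind=engine)()
    obj = MyJoin()
    obj.descr, obj.extra = 'first', 'second'
    session.add(obj)
    session.commit()      # one INSERT per table; b.a_id receives a's new id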

[sqlalchemy] insert and joined mappers

2009-05-05 Thread Alessandro Dentella
Hi, how should I configure a mapper that represents a join between two tables so that inserting a new object writes the foreign key between the two in the proper way?
    class Table_a(Base):
        __tablename__ = 'a'
        id = Column(Integer, primary_key=True)
        descri