I have been seeing something odd when using pyodbc with MSSQL.
The CREATE TABLE statements work, but none of the INSERT statements
do. For example, when I run the code with echo enabled, I get this:
(TCSData) F:\Inetpub\TCSData\src\tcsdata-dist\tcsdatapysmvt broadcast
initapp
calling:
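The echoed output above is cut off, but one very common cause of this exact symptom is that DB-API connections open an implicit transaction: DDL appears to take effect while uncommitted INSERTs are silently rolled back when the connection closes. A minimal illustration with stdlib sqlite3 (the same commit rule applies to pyodbc; the file and table names here are invented):

```python
import os
import sqlite3
import tempfile

# DDL such as CREATE TABLE runs outside the implicit transaction and is
# visible immediately -- but the INSERT below is not durable until
# commit() is called on the writing connection.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
writer.execute("INSERT INTO t (name) VALUES ('a')")

# A second connection sees the table, but not the uncommitted row.
reader = sqlite3.connect(path)
rows_before = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]
reader.close()

writer.commit()  # without this, the INSERT is rolled back on close

reader = sqlite3.connect(path)
rows_after = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]
reader.close()
writer.close()
```

If that is the cause here, either call commit() explicitly or enable autocommit on the pyodbc connection.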
Hello,
I am running SQLite/SQLAlchemy/Elixir on an embedded system. I have several
classes with OneToMany relationships and I have configured cascade='all,
delete, delete-orphan' correctly on them. However, for this to work, I
have to do something like:
parentlist = session.query(Parent).all()
for
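The loop above is truncated, but presumably it deletes each parent in turn and lets the cascade remove the children. For reference, a runnable sketch of that cascade configuration, with hypothetical Parent/Child classes (names and column layout are ours, not the poster's):

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class Parent(Base):
    __tablename__ = "parent"
    id = Column(Integer, primary_key=True)
    # the cascade string quoted in the post
    children = relationship("Child", cascade="all, delete, delete-orphan")

class Child(Base):
    __tablename__ = "child"
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey("parent.id"))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add(Parent(children=[Child(), Child()]))
session.commit()

# Deleting every parent cascades to the children at the ORM level.
for parent in session.query(Parent).all():
    session.delete(parent)
session.commit()

remaining_children = session.query(Child).count()
```

Note that the ORM loads each parent's children in order to cascade the delete, which is exactly the memory cost discussed below.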
On 28 May, 17:54, Michael Bayer mike...@zzzcomputing.com wrote:
sbard wrote:
Hello,
I've got a database with one table per country. For example:
part_es (for Spain)
part_uk
part_it
Each table has the same schema:
(id, libel, description, part_numer)
Thanks, Michael!
That was exactly what I was looking for :)
Andi
On Thu, May 28, 2009 at 5:45 PM, Michael Bayer mike...@zzzcomputing.com wrote:
Andi Albrecht wrote:
class Foo(object):
pass
database.mapper(Foo, database.metadata.tables['footable'])
Now I have a rather complex SQL that I
One solution is to change the commit strategy; issue commits periodically
during the loop.
parentlist = session.query(Parent).all()
count = 0
for parent in parentlist:
    session.delete(parent)
    count += 1
    if count % 100 == 0:  # use whatever frequency is needed
        session.commit()
Another solution is to use triggers in SQLite to enforce FK
relationships.
http://www.sqlite.org/cvstrac/wiki?p=ForeignKeyTriggers
http://code.google.com/p/sqlitefktg4sa/
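For reference, a minimal sketch of the trigger approach those links describe, using stdlib sqlite3 directly (the table and trigger names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE parent (id INTEGER PRIMARY KEY);
CREATE TABLE child (
    id INTEGER PRIMARY KEY,
    parent_id INTEGER REFERENCES parent(id)
);
-- SQLite of that era parsed REFERENCES but did not enforce it;
-- a trigger does the cascading delete row by row instead.
CREATE TRIGGER fk_parent_delete
AFTER DELETE ON parent
FOR EACH ROW
BEGIN
    DELETE FROM child WHERE parent_id = OLD.id;
END;
""")

conn.execute("INSERT INTO parent (id) VALUES (1)")
conn.executemany("INSERT INTO child (id, parent_id) VALUES (?, 1)",
                 [(1,), (2,)])
conn.execute("DELETE FROM parent WHERE id = 1")
orphans = conn.execute("SELECT COUNT(*) FROM child").fetchone()[0]
conn.close()
```

Since the cascade happens inside the database, the ORM never loads the child rows into memory.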
On May 29, 6:59 am, Mike Conley mconl...@gmail.com wrote:
One solution is to change the commit strategy; issue commits
Randy,
Interesting approach to foreign key management. Harish indicates he is
having a problem with restricted memory. Won't that still be true with
triggers? After all, if the problem is that the transaction is too big, it
will still be too big with all the pending deletes executed in a trigger.
Mike,
Well...I am not sure. I thought SQLite held transaction details in
a .journal file and not in memory. I thought that the memory use
might actually be a Python problem and not a result of SQLite. If my
thoughts are correct, using the FK approach should keep deleting the
children in
Harish said it was an embedded system; probably all resources are pretty
severely restricted.
--
Mike Conley
On Fri, May 29, 2009 at 9:44 AM, Randy Syring ra...@rcs-comp.com wrote:
Mike,
Well...I am not sure. I thought SQLite held transaction details in
a .journal file and not in
There is nothing special about SQLAlchemy's handling of many parameters
versus a few, and it's likely a limitation of the IN clause as implemented
on those backends. IN does not support arbitrarily large numbers of
parameters. On Oracle for example the limit is 1000. If you need to do
a large
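A common workaround when the list exceeds the backend's IN limit is to issue the query in chunks. A sketch with stdlib sqlite3 (the `items` table and the `select_in_chunks` helper are invented; the 500-parameter chunk size stays under Oracle's 1000-expression cap):

```python
import sqlite3

def select_in_chunks(conn, ids, chunk_size=500):
    """Run SELECT ... WHERE id IN (...) in chunks so each statement
    stays under the backend's parameter limit."""
    rows = []
    for start in range(0, len(ids), chunk_size):
        chunk = ids[start:start + chunk_size]
        placeholders = ", ".join("?" for _ in chunk)
        query = "SELECT id FROM items WHERE id IN (%s)" % placeholders
        rows.extend(conn.execute(query, chunk).fetchall())
    return rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO items (id) VALUES (?)",
                 [(i,) for i in range(2000)])
matched = select_in_chunks(conn, list(range(2000)))
```

For very large sets, loading the ids into a temporary table and joining against it is usually better still.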
Ah...missed that part. Well, the triggers are fired per-row I
believe. So, if you don't use a transaction explicitly, and just let
SQLite run without one, it will be slower, but the resource usage
should be much less.
On May 29, 10:31 am, Mike Conley mconl...@gmail.com wrote:
Harish said it
Jonathan Marshall wrote:
How is SQLAlchemy's output of parameters generated? It has the
wrong value in it.
there is a loop on line 227 of sqlalchemy/engine/default.py.
I find it interesting that 99,997 values are correct on a 100,000
parameter query and 50,000 parameters works
You also need to specify the desired columns:

return DeclarativeMeta('%sMyBase' % tablename, (BaseTest,), {
    '__tablename__': tablename,
    'id': Column('id', Integer, primary_key=True),
    'stuff': Column('stuff', String(20))
})
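A present-day equivalent of that snippet generates one mapped class per country table with a plain `type()` call, which defers to the declarative metaclass (the `make_part_class` factory and `Part_*` names are ours; the column names follow the schema quoted earlier):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

def make_part_class(suffix):
    # type() defers to Base's declarative metaclass, so each call
    # produces a fully mapped class for one per-country table.
    return type(
        "Part_%s" % suffix,
        (Base,),
        {
            "__tablename__": "part_%s" % suffix,
            "id": Column(Integer, primary_key=True),
            "libel": Column(String(100)),
            "description": Column(String(200)),
            "part_numer": Column(String(50)),
        },
    )

PartES = make_part_class("es")
PartUK = make_part_class("uk")
PartIT = make_part_class("it")

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
```

Each generated class can then be queried like any hand-written mapped class.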
sbard wrote:
Gregg Lind wrote:
I use declarative base for defining classes.
I have a constraint that only works in Postgres. How do I declare
the constraint lowername_check only when the engine is talking to
Postgres (and not to SQLite, for example)?
pg_only_constraint = CheckConstraint("lowername !~
As always, thank you for the complete, exhaustive answer. This
particular thing is definitely an edge case, and rather non-obvious,
so thank you for walking me through it.
Either of those are clean enough for me!
Is there a more proper / general way to describe the problem, so
google and
Jonathan Marshall wrote:
How is SQLAlchemy's output of parameters generated? It has the
wrong value in it.
hold that thought, I've reproduced the bug.
You received this message because you are subscribed to the Google Groups
I need to compare database schemas (tables, column names, types,
constraints, etc.). Is there anything out there that uses SQLAlchemy
metadata to compare database schemas? or should I start working on one?
--
Mike Conley
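For what it's worth, a reflection-based sketch of such a comparison, covering table names and column names/types only (`diff_schemas` is an invented helper; constraints and indexes are left out):

```python
from sqlalchemy import MetaData, create_engine

def diff_schemas(engine_a, engine_b):
    """Compare two reflected schemas: table names, then column
    names and types for tables present on both sides."""
    md_a, md_b = MetaData(), MetaData()
    md_a.reflect(bind=engine_a)
    md_b.reflect(bind=engine_b)
    tables_a, tables_b = set(md_a.tables), set(md_b.tables)
    report = []
    for name in sorted(tables_a ^ tables_b):
        side = "A" if name in tables_a else "B"
        report.append("table %s only in %s" % (name, side))
    for name in sorted(tables_a & tables_b):
        cols_a = {c.name: str(c.type) for c in md_a.tables[name].columns}
        cols_b = {c.name: str(c.type) for c in md_b.tables[name].columns}
        for col in sorted(set(cols_a) ^ set(cols_b)):
            report.append("column %s.%s missing on one side" % (name, col))
        for col in sorted(set(cols_a) & set(cols_b)):
            if cols_a[col] != cols_b[col]:
                report.append("column %s.%s differs in type" % (name, col))
    return report

# two toy in-memory databases with a deliberate difference
a = create_engine("sqlite://")
b = create_engine("sqlite://")
with a.begin() as c:
    c.exec_driver_sql("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
with b.begin() as c:
    c.exec_driver_sql("CREATE TABLE t (id INTEGER PRIMARY KEY)")
    c.exec_driver_sql("CREATE TABLE extra (id INTEGER PRIMARY KEY)")

report = diff_schemas(a, b)
```

Reflected types are backend-specific, so comparing their string forms across different database engines needs some normalization in practice.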
On Friday 29 May 2009 20:53:04 Mike Conley wrote:
I need to compare database schemas (tables, column names, types,
constraints, etc.). Is there anything out there that uses
SQLAlchemy metadata to compare database schemas? or should I start
working on one?
here's mine, just for a diff:
Michael Bayer wrote:
Jonathan Marshall wrote:
How is SQLAlchemy's output of parameters generated? It has the
wrong value in it.
hold that thought, I've reproduced the bug.
it's a hash collision resulting from a hopefully undocumented usage of the
params dict you send to execute,
Michael Bayer wrote:
Michael Bayer wrote:
Jonathan Marshall wrote:
How is SQLAlchemy's output of parameters generated? It has the
wrong value in it.
hold that thought, I've reproduced the bug.
it's a hash collision resulting from a hopefully undocumented usage of the
params dict
Gregg Lind wrote:
I used the DDL style
DDL('''ALTER TABLE data ADD CONSTRAINT lowername_check CHECK
(lowername !~ '[[\:upper\:]]')''',
on='postgres').execute_at('after-create', Data.__table__)
and now my print_schema method (based on
Alas, that doesn't seem to matter or help.
Even this statement causes the same issue. Odd. Must not be related
to the colons, alas.
DDL(r''' ''', on='postgres').execute_at('after-create', Data.__table__)
On Fri, May 29, 2009 at 3:08 PM, Michael Bayer mike...@zzzcomputing.com wrote:
Gregg
Gregg Lind wrote:
Alas, that doesn't seem to matter or help.
Even this statement causes the same issue. Odd. Must not be related
to the colons, alas.
DDL(r''' ''', on='postgres').execute_at('after-create', Data.__table__)
didn't realize you're printing with mock. it's:
buf.write(str(s) +
mock is on the way out as a general use tool.
Gregg Lind wrote:
You got me there! Updating the FAQ on it would fix the issue for others.
For reference:
##
from sqlalchemy import *
from sqlalchemy.schema import DDL
from sqlalchemy.ext.declarative import