hi.
Can I hook somewhere around implicit association collections
(secondary_table=...), so I can somehow influence the row being
inserted?
E.g. I have a column `disabled` in the secondary table, and it has to be
populated at runtime...
something like someX.items.append(y, disabled=True), meaning
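The usual way to get extra columns like this onto the association table is to map it explicitly (the "association object" pattern). A minimal sketch using the modern declarative API, with all table and class names hypothetical:

```python
from sqlalchemy import Boolean, Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

class Item(Base):
    __tablename__ = "items"
    id = Column(Integer, primary_key=True)

class ItemLink(Base):
    # Explicitly mapped association row: extra columns such as
    # "disabled" can now be set per link at runtime.
    __tablename__ = "parent_items"
    parent_id = Column(Integer, ForeignKey("parents.id"), primary_key=True)
    item_id = Column(Integer, ForeignKey("items.id"), primary_key=True)
    disabled = Column(Boolean, default=False)
    item = relationship("Item")

class Parent(Base):
    __tablename__ = "parents"
    id = Column(Integer, primary_key=True)
    item_links = relationship("ItemLink")

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
p = Parent()
# Instead of someX.items.append(y, disabled=True):
p.item_links.append(ItemLink(item=Item(), disabled=True))
session.add(p)
session.commit()
link = session.query(ItemLink).one()
```

The cost is that the collection holds `ItemLink` objects rather than `Item` objects directly; the `associationproxy` extension can paper over that if needed.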
Hi everyone!
Is there a way to pickle a Query restriction?
If I try this:
import cPickle
restr = (MyDataClass.intproperty==1)
cPickle.dumps(restr, 1)
I get a
cPickle.PicklingError: Can't pickle class
'sqlalchemy.orm.properties.ColumnComparator': attribute lookup
Hi all,
I am building a multi-threaded web server which does a lot of
saving and updating to the database. The SQLAlchemy version I am using is
0.4.4, with Python 2.5 and MySQL server 4.1.
The problem is that when I save or update and do a flush, it is
not reflected in the database immediately.
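Assuming the usual cause here: flush() only emits the SQL inside the session's still-open transaction, so other connections cannot see the rows until commit(). The same visibility rule can be demonstrated with nothing but the stdlib sqlite3 module:

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE t (x INTEGER)")
writer.commit()

# The "flush" step: the INSERT has been sent to the database,
# but the writer's transaction is still open.
writer.execute("INSERT INTO t (x) VALUES (1)")

reader = sqlite3.connect(path)
before = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]

writer.commit()
after = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```

With MySQL/InnoDB the default REPEATABLE READ isolation adds a further wrinkle: a long-lived reading transaction keeps seeing its original snapshot even after the writer commits, until it commits or rolls back itself.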
OK you might have to use the DDL() construct instead, if Index doesn't
support func() yet.
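A sketch of what that DDL() approach could look like; note the event hookup shown here is the modern spelling (in the 0.5 era this was `DDL(...).execute_at('after-create', table)`), and the table is a stand-in for the real `valuta`:

```python
from sqlalchemy import (Column, DDL, MetaData, String, Table,
                        create_engine, event, text)

metadata = MetaData()
valuta = Table("valuta", metadata, Column("descrizione", String(100)))

# Raw DDL string, emitted right after the table itself is created;
# this sidesteps Index()'s lack of func() support.
event.listen(
    valuta,
    "after_create",
    DDL("CREATE UNIQUE INDEX valuta_desc_uniq "
        "ON valuta (lower(descrizione))"),
)

engine = create_engine("sqlite://")
metadata.create_all(engine)

with engine.connect() as conn:
    names = [r[0] for r in conn.execute(
        text("SELECT name FROM sqlite_master WHERE type = 'index'"))]
```

Since the string is passed through verbatim, any vendor-specific index syntax works, at the price of portability.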
On Dec 10, 2008, at 2:19 AM, jose wrote:
I tried it as you suggested, Michael...
Index('valuta_desc_uniq', func.lower(valuta.c.descrizione),
unique=True)
File
On Tue, 2008-12-09 at 12:38 -0500, Michael Bayer wrote:
On Dec 9, 2008, at 5:08 AM, Julien Cigar wrote:
My question is: do I need to explicitly specify the join condition when
inheritance is involved? Why is SQLAlchemy able to detect the join
condition for my Content (line
On Dec 10, 2008, at 6:56 AM, chingi wrote:
Hi all,
I am building a multi-threaded web server which does a lot of
saving and updating to the database. The SQLAlchemy version I am using is
0.4.4, with Python 2.5 and MySQL server 4.1.
The problem is that when I save or update and do a flush
Michael Bayer ha scritto:
There's a new extension in 0.5 which allows pickling of any ORM
structure called sqlalchemy.ext.serializer.
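The serializer extension round-trips expressions by pickling everything except tables, columns and mappers, which are re-resolved against a MetaData (or Session) at load time. A sketch using a Core table as a stand-in for MyDataClass's table:

```python
from sqlalchemy import Column, Integer, MetaData, Table
from sqlalchemy.ext.serializer import dumps, loads

metadata = MetaData()
# Hypothetical table standing in for MyDataClass's mapped table
data = Table("data", metadata, Column("intproperty", Integer))

restr = data.c.intproperty == 1
blob = dumps(restr)                # plain bytes, storable anywhere
restored = loads(blob, metadata)   # re-attached to this metadata
```

This is why plain pickle fails on these objects: a criterion is a graph that bottoms out in Table/Column structures that are not meant to be duplicated, only re-referenced.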
I see, thanks :)
Although the error below shouldn't occur in any case, try out the latest
trunk.
Do you mean normal Pickle shouldn't raise the
Hi,
What does this message mean?
self.save_objects(trans, task)
  File "/usr/lib/python2.4/site-packages/sqlalchemy/orm/unitofwork.py", line 1023, in save_objects
    task.mapper.save_obj(task.polymorphic_tosave_objects, trans)
  File "/usr/lib/python2.4/site-packages/sqlalchemy/orm/mapper.py",
Hi,
I dusted off a project that had been dormant for a few months.
Upgrading to SQLAlchemy 0.5 broke some code where I was inserting
computed values directly into a RowProxy object before I passed the
rows to a template.
I'm getting 'RowProxy' object has no attribute 'excerpt'
Here is a very
This is because RowProxy is using __slots__ now, which ironically is
also for efficiency reasons :) . There are ways to build mutable
wrapper objects which are pretty efficient in a case like this, such
as a named tuple. My own minimal version of that looks like:
def
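The snippet above is cut off in the archive. A reconstruction of that kind of wrapper (my own sketch, not the original code): it keeps per-instance overhead low with __slots__, delegates reads to the wrapped row, and stores any new attributes in a side dict.

```python
from collections import namedtuple

class MutableRow:
    # Two slots only: the wrapped row and a dict for extras.
    __slots__ = ("_row", "_extra")

    def __init__(self, row):
        object.__setattr__(self, "_row", row)
        object.__setattr__(self, "_extra", {})

    def __getattr__(self, name):
        # Called only when normal lookup fails, i.e. for row fields
        # and for extras added after construction.
        try:
            return self._extra[name]
        except KeyError:
            return getattr(self._row, name)

    def __setattr__(self, name, value):
        self._extra[name] = value

# A namedtuple stands in for an immutable RowProxy here:
Row = namedtuple("Row", ["id", "title"])
r = MutableRow(Row(1, "hello"))
r.excerpt = "a computed value"   # would raise AttributeError on a __slots__ row
```

Wrapping each row this way before handing it to the template restores the old "tack an attribute on" workflow without giving up the memory savings for unwrapped rows.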
MSSQL through pyodbc (with unixODBC and FreeTDS on Debian) connections
are broken for me when I pass a URL to create_engine. I can see that
databases/mssql.py:make_connect_string has been changed to express
host and port as 'server=host,port' instead of
'server=host;port=port' when the driver attribute is
This has been bandied back and forth for months, and I think it's becoming
clear that having sqla map dburl's to ODBC connection strings is a losing
battle. Yet another connection argument is not sounding very attractive to
me.
Perhaps a simple reductionist policy for ODBC connections would be
Hi,
I'm getting an (OperationalError) Could not decode to UTF-8 column. OK,
I know how to fix this to prevent it, but I have this on a database
that I need intact, and I can't query on the row to make an object to
manipulate, delete, etc. Any advice? If I could just delete the record,
that would
Thanks for not only explaining the problem, but also suggesting a
solution. I'll give it a shot.
-
On Dec 10, 10:58 am, Michael Bayer [EMAIL PROTECTED] wrote:
this is because RowProxy is using __slots__ now, which ironically is
also for efficiency reasons :) .
In a transaction I am updating values, then querying, then updating
some more, ... then eventually committing the whole batch at once.
If a value being updated is a dictionary to be pickled then I may,
depending on the data in the dictionary, find that it is flushed to
the database before every
Hi there,
is there some more efficient way of dictifying a resultset other than

    lst = list()
    for row in session.query(...).all():
        d = row.__dict__.copy()
        for k in d.keys():
            if k.startswith('_sa'):
                del d[k]
        lst.append(d)

Especially the loop over the keys
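Assuming the goal is just to drop the `_sa`-prefixed bookkeeping keys, the inner loop can collapse into one dict comprehension per row. A self-contained sketch, with a stand-in class playing the part of a mapped instance:

```python
class FakeMapped:
    # Stand-in for an ORM-mapped instance: real ones carry an
    # "_sa_instance_state" entry in __dict__.
    def __init__(self, **kw):
        self.__dict__.update(kw)

rows = [FakeMapped(id=1, name="a", _sa_instance_state=object()),
        FakeMapped(id=2, name="b", _sa_instance_state=object())]

# One pass per row, no copy-then-delete dance:
lst = [
    {k: v for k, v in row.__dict__.items() if not k.startswith("_sa")}
    for row in rows
]
```

This builds each dict directly instead of copying the full `__dict__` and deleting from it afterwards.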
On Dec 10, 2008, at 2:24 PM, Jonathan Marshall wrote:
I think the problem may be that, for some dictionaries that are to be
pickled, PickleType.copy_value returns something that does not equal
the original value according to PickleType.compare_values.
E.g.
from sqlalchemy.types import
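If that is indeed the cause, it is easy to reproduce with plain pickle, no SQLAlchemy required: a dict holding any object that compares by identity will never equal its pickled round-trip copy, so an equality-based dirty check always sees a change. A standalone demonstration (class name hypothetical):

```python
import pickle

class Opaque:
    # No __eq__ defined, so equality falls back to identity; a
    # pickle round-trip copy can therefore never compare equal.
    def __init__(self, value):
        self.value = value

original = {"payload": Opaque(1)}
copy = pickle.loads(pickle.dumps(original, pickle.HIGHEST_PROTOCOL))
```

Dict equality compares values pairwise, so one identity-compared member is enough to make the whole dict look "changed" on every flush.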
On Dec 10, 2008, at 2:27 PM, Andreas Jung wrote:
Hi there,
is there some more efficient way of dictifying a resultset other than

    lst = list()
    for row in session.query(...).all():
        d = row.__dict__.copy()
        for k in d.keys():
            if k.startswith('_sa'):
                del d[k]
On Dec 10, 1:27 pm, Rick Morrison [EMAIL PROTECTED] wrote:
This has been bandied back and forth for months, and I think it's becoming
clear that having sqla map dburl's to ODBC connection strings is a losing
battle. Yet another connection argument is not sounding very attractive to
me.
On Dec 10, 2008, at 2:49 PM, desmaj wrote:
I dislike the idea of relying heavily on DSNs since I don't want SA to
tell people how to manage their systems. Giving full support to
DSN-less connections lets SA work with existing systems.
DSNs would eliminate these issues for SQLAlchemy.
On Wed, Dec 10, 2008 at 1:49 PM, desmaj [EMAIL PROTECTED] wrote:
On Dec 10, 1:27 pm, Rick Morrison [EMAIL PROTECTED] wrote:
This has been bandied back and forth for months, and I think it's becoming
clear that having sqla map dburl's to ODBC connection strings is a losing
battle. Yet another
On Wed, Dec 10, 2008 at 2:05 PM, Michael Bayer [EMAIL PROTECTED] wrote:
On Dec 10, 2008, at 2:49 PM, desmaj wrote:
I dislike the idea of relying heavily on DSNs since I don't want SA to
tell people how to manage their systems. Giving full support to
DSN-less connections lets SA work
It seems that when using .first() with from_statement() I get a
traceback with 0.5rc4 when no rows are found (it works when a row is found):
using 0.4.8 I get:
/usr
0.4.8
with 0.5rc4 I get:
/users/dgardner/dev
0.5.0rc4
Traceback (most recent call last):
File "assetdb_test.py", line 38, in <module>
On Dec 10, 3:05 pm, Michael Bayer [EMAIL PROTECTED] wrote:
this is just my 2c, I'm not here to say how it should be done or not.
I would think that the standard SQLA host/port connect pattern should
work as well if we just are aware of what kind of client library we're
talking to. If we
Thanks, though I'm not sure it's a good thing that I found new ways to
use SA that were unintended by the author :).
In this case the raw sql query should return only a single row.
Michael Bayer wrote:
I don't think this was ever known behavior in 0.4, and it's a little
strange that it worked
I'm using MSSQL + pyodbc + unixODBC + FreeTDS ...
I have been tracking the 0.5 line for a while now, but I only recently
noticed that I am unable to insert unicode into the database since
0.5rc2. Starting with 0.5rc3, when I do try to insert unicode into a
column defined as Unicode or
What's the status of 0.5? Is DSN the default in trunk now?
DSN is the first choice in MSSQLDialect_pyodbc.make_connect_string
right now.
That's not what I see. I just pulled the 0.5 trunk, which I haven't been
tracking lately. It still uses the 'dsn' keyword to build a connection string with
DSN is the one keyword for host that is universally recognized as
part of ODBC proper, so it makes sense that host/port would be the
exception case - especially considering it seems like we now have to
pick among many formats for propagating host/port and are going to
require some kind
On Wed, Dec 10, 2008 at 3:42 PM, Rick Morrison [EMAIL PROTECTED] wrote:
What's the status of 0.5? Is DSN the default in trunk now?
DSN is the first choice in MSSQLDialect_pyodbc.make_connect_string
right now.
That's not what I see. I just pulled the 0.5 trunk, which I haven't been
if you'd like to submit a patch which defines __visit_name__ for all
ClauseElements and removes the logic from VisitableType to guess the
name, it will be accepted. The second half of VisitableType still may
be needed since it improves performance.
OK, I did it. I can't find where I
Here is my take, keeping in mind I haven't used a Windows machine in
about a year:
On Dec 10, 2008, at 5:13 PM, Lukasz Szybalski wrote:
cnxn = pyodbc.connect('DSN=dsnname;UID=user;PWD=password')
mssql://user:[EMAIL PROTECTED]/
or
cnxn = pyodbc.connect('DRIVER={SQL
Something like this:
As of 0.5 for pyodbc connections:
a) If the keyword argument 'odbc_connect' is given, it is assumed to be a
full ODBC connection string, which is used for the connection (perhaps we
can include a facility for Python string interpolation into this string from
the dburi
mssql://user:[EMAIL PROTECTED]/database?connect_type=TDS7&other=args&that=are&needed=foo
using connect_type, or some better name, we can map the URL scheme
to an unlimited number of vendor specific connect strings on the back.
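For the record, later SQLAlchemy releases settled on something close to option (a): a URL-encoded `odbc_connect` query parameter carrying the raw ODBC string verbatim. A sketch of building such a URL with only the stdlib (connection details hypothetical):

```python
from urllib.parse import quote_plus

# Raw ODBC connect string, exactly as pyodbc would accept it:
raw = ("DRIVER={FreeTDS};SERVER=myhost,1433;"
       "DATABASE=mydb;UID=user;PWD=secret;TDS_Version=8.0")

# URL-encode it so it survives as a single query parameter; the
# dialect decodes it and hands it to pyodbc untouched.
url = "mssql+pyodbc:///?odbc_connect=" + quote_plus(raw)
```

Because the string passes through untouched, any vendor- or driver-specific keyword works without SQLAlchemy needing to know about it, which neatly sidesteps the mapping problem discussed here.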
Yeah, it's exactly that kind of mapping that has so far been a
hey send it as an email attachment, or create a ticket in trac as
guest/guest and attach it there: http://www.sqlalchemy.org/trac/newticket
On Dec 10, 2008, at 5:19 PM, Angri wrote:
if you'd like to submit a patch which defines __visit_name__ for all
ClauseElements and removes the
On Dec 10, 2008, at 5:21 PM, Rick Morrison wrote:
Something like this:
As of 0.5 for pyodbc connections:
a) If the keyword argument 'odbc_connect' is given, it is assumed to
be a full ODBC connection string, which is used for the connection
(perhaps we can include a facility for
Hi,
I am porting our Pylons app from 0.4.5 to 0.5.
It appears that implicit queries are using a new connection instead of
re-using the existing one that is also used by the ORM Session.
I have read the Migration Docs and looked at the changelog and didn't
find anything related to this matter.
On Dec 10, 2008, at 5:39 PM, Rick Morrison wrote:
It's really not a big deal to maintain if
we just make a modular URLHandler class that given a sqlalchemy.URL
and a DBAPI, returns a connection. Anytime someone has some new
goofy string they need, they can provide one of these, we add
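A minimal sketch of what such a modular handler could look like (purely illustrative, not SQLAlchemy API): a registry that maps a dialect name to a function rendering a DBAPI connect string from URL parts, so a new "goofy string" is just one more registered entry.

```python
class URLHandler:
    """Hypothetical registry as proposed above: each entry maps a
    dialect name to a callable that renders a DBAPI connect string
    from the parsed URL parts."""

    def __init__(self):
        self._makers = {}

    def register(self, dialect, maker):
        self._makers[dialect] = maker

    def connect_string(self, dialect, **parts):
        return self._makers[dialect](**parts)


handler = URLHandler()
handler.register(
    "mssql+pyodbc",
    lambda host, port, user, password, database: (
        f"SERVER={host},{port};UID={user};PWD={password};DATABASE={database}"
    ),
)

cs = handler.connect_string(
    "mssql+pyodbc",
    host="db1", port=1433, user="u", password="p", database="sales",
)
```

Users with an unusual driver would register their own maker instead of waiting for a new keyword argument upstream.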
I've been wondering about this for a while, but for some reason I
didn't get that the ColumnProperty would let me query:
query(units).filter_by(depth=2581)
I'm not sure whether that's just me, or if the documentation isn't
clear.
Thanks!
-Channing
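For readers finding this thread later: in modern declarative terms (the 0.5-era syntax used mapper() with a properties dict), a queryable ColumnProperty looks roughly like this, with all names hypothetical:

```python
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import Session, column_property, declarative_base

Base = declarative_base()

class Unit(Base):
    __tablename__ = "units"
    id = Column(Integer, primary_key=True)
    top = Column(Integer)
    bottom = Column(Integer)
    # SQL-level expression mapped as a read-only attribute;
    # filter_by(depth=...) then works like any plain column.
    depth = column_property(bottom - top)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
session.add(Unit(top=0, bottom=2581))
session.commit()

found = session.query(Unit).filter_by(depth=2581).one()
```

The expression is evaluated in SQL, so the filter runs in the database rather than in Python.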