[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-03-05 Thread phrrn...@googlemail.com

The problem with the connection being returned to the pool was due to
executing the SET IDENTITY_INSERT statement on the *cursor* rather
than the *connection*. The documentation states that the connection
will be returned to the pool when a statement is executed on it that
doesn't return any results (such as the SET statement). Using
self.connection.execute solved that problem.
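
That fix can be sketched in isolation. The class and method names below are hypothetical stand-ins (not the actual SQLAlchemy ExecutionContext); the point is only which object the SET statement is executed on:

```python
# Hypothetical sketch of the fix described above (not the real SQLAlchemy
# ExecutionContext): session-scoped SET statements must go through the
# checked-out *connection* wrapper, not the raw cursor, or the pool reclaims
# the connection after the no-result SET statement.
class SketchExecutionContext(object):
    def __init__(self, connection, cursor):
        self.connection = connection  # checked-out Connection wrapper
        self.cursor = cursor          # raw DB-API cursor

    def enable_identity_insert(self, table_name):
        # Executing this on self.cursor caused the early check-in;
        # executing on self.connection keeps the connection checked out.
        self.connection.execute("SET IDENTITY_INSERT %s ON" % table_name)
```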

There was a difficult-to-diagnose problem with python-sybase: sometimes
the keys of the parameter dict were unicode, which caused the
buf.name = name assignment to throw a TypeError. Coercing the
param keys via str() solved that problem.
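
The coercion can be sketched as follows (a guess at the shape of the workaround, not the exact patch; the helper name is made up):

```python
# Sketch of the workaround described: python-sybase's buffer objects reject
# unicode parameter names, so coerce every key to str() before handing the
# dict to the driver.
def coerce_param_keys(parameters):
    """Return a copy of the parameter dict with str()-coerced keys."""
    return dict((str(k), v) for k, v in parameters.items())
```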

In Sybase, a column declaration without NULL/NOT NULL defaults to NOT
NULL, so a bunch of the tests need to be updated.

The schema introspection seems to work OK, albeit with some low-level
querying of the system tables. I started off with a higher-level
implementation but abandoned it due to all kinds of (possibly
spurious) problems. There are still some problems with the test
teardowns, as tables are not being dropped in the correct order.

All in all, the driver is now in a state that can be called
'buggy' (as opposed to being completely dysfunctional).

pjjH


On Feb 27, 4:29 pm, phrrn...@googlemail.com
phrrn...@googlemail.com wrote:
 Yes, it is based off the mssql code but I made some modifications to
 it to take care of situations like, for example, where there is an
 identity column but it is not the primary key (and hence not a
 'sequence'). This means a read off the catalog to find the identity
 column (I believe that only one identity column is permitted per
 table). I was wondering if some 'bad thing' happens if you execute a
 select on the cursor and retrieve results when you are in the
 pre_exec.

 I don't know what you are referring to when you say 'throw a pdb' .. I
 hope it has something to do with the debugger!

 As for being in deep, I am afraid we are only starting: Sybase has
 enough 'special' stuff to keep us busy for a long time, e.g. cross-
 database referential integrity constraints, database-specific default
 schemas (e.g. login foo may have schema 'dbo' in database apple but
 schema 'guest' in database pear and schema 'prod' in database banana).
 Then what does one do about remote objects mapped in via CIS (e.g.
 REMOTE.production.dbo.very_important_table)? (Actually this is a
 problem with SQL Server also.)

 pjjH

 On Feb 27, 4:05 pm, Michael Bayer mike...@zzzcomputing.com wrote:

  phrrn...@googlemail.com wrote:

   I want to automatically set IDENTITY_INSERT for a table if the
   identity column is explicitly listed. Likewise, after execution of an
   insert on a table with an identity column we want to retrieve the
   identity value.

   Any idea why the following code would cause the connection to be
   checked in between the pre_exec() and the actual execution of the
   statement?  I have enabled high levels of debugging on the python-
   sybase driver and can see that a new connection is made *after* the
   'SET IDENTITY_INSERT foo ON'  and the actual command runs on that new
   connection (and, of course, fails as IDENTITY_INSERT is not enabled on
   that table for the new connection).

  Assuming you took that code from the MSSQL dialect, it should be fine.
  that looks like an older version of it, though...in 0.6 take a look in
  mssql/base.py for the latest version of the IDENTITY_INSERT dance.

  but no there's nothing in there in any case that would cause a second
  connection to be checked out. throw a pdb into the Connection constructor,
  or perhaps in pool.connect(), to track where that's coming from.

  sorry you're in deep  :)

   pjjH

class SybaseSQLExecutionContext(default.DefaultExecutionContext):
    def _table_identity_column(self, t):
        """Return the name of this table's identity column"""
        # negative caching
        if not hasattr(t, '_identity_column'):
            t._identity_column = None
            s = r"SELECT cols.name FROM syscolumns AS cols JOIN sysobjects AS o ON (cols.id = o.id) WHERE o.name = '%s' AND cols.status & 0x80 = 0x80" % (t.name)
            self.cursor.execute(s)
            r = self.cursor.fetchone()
            if r:
                t._identity_column = r[0]
        return t._identity_column

    def pre_exec(self):
        self.HAS_IDENTITY = False
        self.IDENTITY_INSERT = False
        # What about UPDATE statements? Is this even possible in Sybase?
        if self.compiled.isinsert:
            if self._table_identity_column(self.compiled.statement.table):
                self.HAS_IDENTITY = True
                identity_column = self._table_identity_column(self.compiled.statement.table)
                if identity_column in self.compiled_parameters[0].keys():
                    self.IDENTITY_INSERT = True
                    self.cursor.execute("SET IDENTITY_INSERT %s ON" %

   

[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-02-27 Thread phrrn...@googlemail.com

How does one deal with driver-specific unit tests? I am running into
difficulties in testing the pyodbc and python-sybase drivers for the
sybase dialect. For example, test_raw_qmark works with the pyodbc
driver (as it supports that style) but not with the python-sybase
driver. Is there some decorator available that can help with skipping
certain tests for a given DBAPI driver? Any suggestions on how to
handle this?

pjjH


On Feb 26, 5:31 pm, Michael Bayer mike...@zzzcomputing.com wrote:
 we have ticket 785 for this:

 http://www.sqlalchemy.org/trac/ticket/785

 On Feb 26, 2009, at 4:45 PM, phrrn...@googlemail.com wrote:



  Thanks Michael. I have a sybase.py passing *some* unit tests with both
  pyodbc and the Sybase driver, both running on Solaris 10 x86 against
  ASE 15. This is a hack that seems to work for the Sybase DBAPI module.
  I do have access to lots and lots of different Sybase stuff so I will
  start from your patched version and reintegrate my schema
  introspection and other stuff. Do you have a ticket open for the
  sybase driver yet? Where should I send the patches?

  pjjH

 def do_execute(self, cursor, statement, parameters, context=None, **kwargs):
     if self.paramstyle == 'named':
         # prepend the argument names with an '@'
         hacked_args = dict(('@' + n, v) for n, v in parameters.items())
         super(SybaseSQLDialect_Sybase, self).do_execute(cursor, statement, hacked_args, context=context, **kwargs)
     else:
         super(SybaseSQLDialect_Sybase, self).do_execute(cursor, statement, parameters, context=context, **kwargs)

 def create_connect_args(self, url):
     opts = url.translate_connect_args()
     opts.update(url.query)

     self.autocommit = False
     if 'autocommit' in opts:
         self.autocommit = bool(int(opts.pop('autocommit')))

     dictArgs = {
         'datetime': 'python',            # stop the annoying diagnostics from the module
         'auto_commit': self.autocommit,  # the named argument is 'auto_commit' rather than 'autocommit'
     }

     if 'database' in opts:
         dictArgs['database'] = opts['database']

     return ([opts['host'], opts['username'], opts['password']], dictArgs)
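
The '@'-prepending in do_execute above can be exercised in isolation; the helper name below is made up for illustration:

```python
# Standalone illustration of the '@'-prepending in do_execute above:
# python-sybase wants both the placeholder *and* the parameter dict key
# spelled @name, so the keys are rewritten just before execution.
def prepend_at(parameters):
    """Return a copy of the dict with '@' prepended to every key."""
    return dict(('@' + n, v) for n, v in parameters.items())
```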

  On Feb 26, 4:30 pm, Michael Bayer mike...@zzzcomputing.com wrote:
  On Feb 26, 2009, at 3:55 PM, phrrn...@googlemail.com wrote:

  I am doing some work on a SA engine for Sybase Adaptive Server
  Enterprise (ASE) on top of both pyodbc and the Sybase DB-API driver.
  The existing sybase engine for SA only works with Sybase Anywhere
  (ASA).

  that is correct; I've recently had to take a look at this driver and
  realized that it was not really written for Sybase at all, and the
  original author's whereabouts are unknown.   To that end I would like it
  to be replaced with an actual Sybase driver.

  There is a problem with named parameters with the Sybase driver in
  that the placeholders are prepended with an '@' *and* the execute
  method expects any dict parameters to have keys that also have an
  '@'. I was able  to get the placeholders generated correctly by
  subclassing the compiler. Any suggestions on how to get the execute
  method to work nicely, or do I have to muck around with
  copying
  parameters or monkeypatching the Sybase module with an
  implementation
  of execute that will work with 'ordinary' dictionaries?

  the attached patch, which represents my partial progress, addresses
  this.  Unfortunately I was not able to continue since I was
  developing
  from a Mac to a development server, and it turns out that connecting
  with the Sybase driver using FreeTDS renders bind parameters
  inoperable.  After several days of attempting to get the developer
  edition of sybase ASE running in a virtual linux environment
  (apparently only works on older versions of ubuntu/fedora, but even
  after installing those, I was unsuccessful), I gave up.

  If you have access to a working Sybase ASE environment, you can have
  full rein over the sybase.py dialect - anything specific to SQL
  Anywhere can be removed, since it's an obsolete product and if it were
  supported, it would be in its own dialect. The Sybase driver may
  be targeted towards the 0.6 release of SQLAlchemy.  Version 0.6 is
  oriented around a dialect refactor and schema expression refactor
  (there are no ORM changes) and would be a much better place to start
  building out new drivers - there are some significant differences in
  how dialects are constructed between 0.5 versus 0.6.

  sybase.patch

[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-02-27 Thread Michael Bayer

phrrn...@googlemail.com wrote:

 How does one deal with driver-specific unit tests? I am running into
 difficulties in testing the pyodbc and python-sybase drivers for the
 sybase dialect. For example, test_raw_qmark works with the pyodbc
 driver (as it supports that style) but not with the python-sybase
 driver. Is there some decorator available that can help with skipping
 certain tests for a given DBAPI driver? Any suggestions on how to
 handle this?


most tests make use of decorators like @testing.fails_on to mark various
databases as unsupported.  That test in particular is very specific to
certain DBAPIs, i.e. those that support qmark bind parameters.  For the
lesser databases like MSSQL and Firebird, i.e. those which have lots of
missing features, hundreds of decorators are configured to exclude them.  
You would have a similar task in the case of sybase.

But to be specific regarding pyodbc vs. python-sybase, that is exactly
what's addressed in SQLA 0.6.  If you look there you'll see the decorators
can differentiate among multiple DBAPIs for the same dialect, i.e.
sybase+pyodbc vs. sybase+python-sybase in this case.  There is also a
coherent non-guesswork system of using specific drivers.
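
The exclusion mechanism described can be sketched generically. This is a hypothetical helper, not SQLAlchemy's actual @testing.fails_on implementation, and the driver-detection is faked with a module variable:

```python
# Hypothetical sketch of a fails_on-style exclusion decorator (not the real
# SQLAlchemy testing API): the test becomes a no-op when the active DB-API
# driver is in the exclusion list.
import functools

ACTIVE_DRIVER = 'python-sybase'  # in practice, detected from the engine URL

def fails_on_driver(*drivers):
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if ACTIVE_DRIVER in drivers:
                return None  # skip: known failure on this driver
            return fn(*args, **kwargs)
        return wrapper
    return decorate

@fails_on_driver('python-sybase')
def test_raw_qmark():
    return 'ran'
```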

just so you know we'd really like SQLA 0.6 to be released soon after
pycon.   There's not that much work to be done on it for a release.  The
only reason it's a major number is because the API for dialects does
change considerably.





 pjjH


 On Feb 26, 5:31 pm, Michael Bayer mike...@zzzcomputing.com wrote:
 we have ticket 785 for this:

 http://www.sqlalchemy.org/trac/ticket/785

 On Feb 26, 2009, at 4:45 PM, phrrn...@googlemail.com wrote:



  Thanks Michael. I have a sybase.py passing *some* unit tests with both
  pyodbc and the Sybase driver, both running on Solaris 10 x86 against
  ASE 15. This is a hack that seems to work for the Sybase DBAPI module.
  I do have access to lots and lots of different Sybase stuff so I will
  start from your patched version and reintegrate my schema
  introspection and other stuff. Do you have a ticket open for the
  sybase driver yet? Where should I send the patches?

  pjjH

 def do_execute(self, cursor, statement, parameters, context=None, **kwargs):
     if self.paramstyle == 'named':
         # prepend the argument names with an '@'
         hacked_args = dict(('@' + n, v) for n, v in parameters.items())
         super(SybaseSQLDialect_Sybase, self).do_execute(cursor, statement, hacked_args, context=context, **kwargs)
     else:
         super(SybaseSQLDialect_Sybase, self).do_execute(cursor, statement, parameters, context=context, **kwargs)

 def create_connect_args(self, url):
     opts = url.translate_connect_args()
     opts.update(url.query)

     self.autocommit = False
     if 'autocommit' in opts:
         self.autocommit = bool(int(opts.pop('autocommit')))

     dictArgs = {
         'datetime': 'python',            # stop the annoying diagnostics from the module
         'auto_commit': self.autocommit,  # the named argument is 'auto_commit' rather than 'autocommit'
     }

     if 'database' in opts:
         dictArgs['database'] = opts['database']

     return ([opts['host'], opts['username'], opts['password']], dictArgs)

  On Feb 26, 4:30 pm, Michael Bayer mike...@zzzcomputing.com wrote:
  On Feb 26, 2009, at 3:55 PM, phrrn...@googlemail.com wrote:

  I am doing some work on a SA engine for Sybase Adaptive Server
  Enterprise (ASE) on top of both pyodbc and the Sybase DB-API driver.
  The existing sybase engine for SA only works with Sybase Anywhere
  (ASA).

  that is correct; I've recently had to take a look at this driver and
  realized that it was not really written for Sybase at all, and the
  original author's whereabouts are unknown.   To that end I would like it
  to be replaced with an actual Sybase driver.

  There is a problem with named parameters with the Sybase driver in
  that the placeholders are prepended with an '@' *and* the execute
  method expects any dict parameters to have keys that also have an
  '@'. I was able  to get the placeholders generated correctly by
  subclassing the compiler. Any suggestions on how to get the execute
  method to work nicely, or do I have to muck around with
  copying
  parameters or monkeypatching the Sybase module with an
  implementation
  of execute that will work with 'ordinary' dictionaries?

  the attached patch, which represents my partial progress, addresses
  this.  Unfortunately I was not able to continue since I was
  developing
  from a Mac to a development server, and it turns out that connecting
  with the Sybase driver using FreeTDS renders bind parameters
  inoperable.  After several days of attempting to get the developer
  edition of sybase ASE running in a virtual linux environment
  (apparently only works on older versions of ubuntu/fedora, but even
  after installing those, I was 

[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-02-27 Thread phrrn...@googlemail.com

I want to automatically set IDENTITY_INSERT for a table if the
identity column is explicitly listed. Likewise, after execution of an
insert on a table with an identity column we want to retrieve the
identity value.

Any idea why the following code would cause the connection to be
checked in between the pre_exec() and the actual execution of the
statement?  I have enabled high levels of debugging on the python-
sybase driver and can see that a new connection is made *after* the
'SET IDENTITY_INSERT foo ON'  and the actual command runs on that new
connection (and, of course, fails as IDENTITY_INSERT is not enabled on
that table for the new connection).

pjjH


class SybaseSQLExecutionContext(default.DefaultExecutionContext):
    def _table_identity_column(self, t):
        """Return the name of this table's identity column"""
        # negative caching
        if not hasattr(t, '_identity_column'):
            t._identity_column = None
            s = r"SELECT cols.name FROM syscolumns AS cols JOIN sysobjects AS o ON (cols.id = o.id) WHERE o.name = '%s' AND cols.status & 0x80 = 0x80" % (t.name)
            self.cursor.execute(s)
            r = self.cursor.fetchone()
            if r:
                t._identity_column = r[0]
        return t._identity_column

    def pre_exec(self):
        self.HAS_IDENTITY = False
        self.IDENTITY_INSERT = False
        # What about UPDATE statements? Is this even possible in Sybase?
        if self.compiled.isinsert:
            if self._table_identity_column(self.compiled.statement.table):
                self.HAS_IDENTITY = True
                identity_column = self._table_identity_column(self.compiled.statement.table)
                if identity_column in self.compiled_parameters[0].keys():
                    self.IDENTITY_INSERT = True
                    self.cursor.execute("SET IDENTITY_INSERT %s ON" %
                        self.dialect.identifier_preparer.format_table(self.compiled.statement.table))

    def post_exec(self):
        if self.HAS_IDENTITY:
            self.cursor.execute("SELECT @@identity AS lastrowid")
            lastrowid = self.cursor.fetchone()[0]
            if lastrowid > 0:
                if not hasattr(self, '_last_inserted_ids') or self._last_inserted_ids is None:
                    self._last_inserted_ids = [lastrowid]
                else:
                    self._last_inserted_ids = [lastrowid] + self._last_inserted_ids[1:]

        if self.IDENTITY_INSERT:
            self.cursor.execute("SET IDENTITY_INSERT %s OFF" %
                self.dialect.identifier_preparer.format_table(self.compiled.statement.table))


On Feb 27, 2:02 pm, phrrn...@googlemail.com
phrrn...@googlemail.com wrote:
 OK. I will do the development work against the 0.6 tree. I may end up
 backporting it to 0.5 as I want to get it into use at work as soon as
 is reasonable (which may be prior to the 0.6 release)

 pjjH

 On Feb 27, 11:29 am, Michael Bayer mike...@zzzcomputing.com wrote:

  phrrn...@googlemail.com wrote:

   How does one deal with driver-specific unit tests? I am running into
   difficulties in testing the pyodbc and python-sybase drivers for the
   sybase dialect. For example, test_raw_qmark works with the pyodbc
   driver (as it supports that style) but not with the python-sybase
   driver. Is there some decorator available that can help with skipping
   certain tests for a given DBAPI driver? Any suggestions on how to
   handle this?

  most tests make use of decorators like @testing.fails_on to mark various
  databases as unsupported.  That test in particular is very specific to
  certain DBAPIs, i.e. those that support qmark bind parameters.  For the
  lesser databases like MSSQL and Firebird, i.e. those which have lots of
  missing features, hundreds of decorators are configured to exclude them.
  You would have a similar task in the case of sybase.

  But to be specific regarding pyodbc vs. python-sybase, that is exactly
  what's addressed in SQLA 0.6.  If you look there you'll see the decorators
  can differentiate among multiple DBAPIs for the same dialect, i.e.
  sybase+pyodbc vs. sybase+python-sybase in this case.  There is also a
  coherent non-guesswork system of using specific drivers.

  just so you know we'd really like SQLA 0.6 to be released soon after
  pycon.   There's not that much work to be done on it for a release.  The
  only reason it's a major number is because the API for dialects does
  change considerably.

   pjjH

   On Feb 26, 5:31 pm, Michael Bayer mike...@zzzcomputing.com wrote:
   we have ticket 785 for this:

  http://www.sqlalchemy.org/trac/ticket/785

   On Feb 26, 2009, at 4:45 PM, phrrn...@googlemail.com wrote:

Thanks Michael. I have a sybase.py passing *some* unit tests with both
pyodbc and the Sybase driver, both running on Solaris 10 x86 against
ASE 15. This is a hack that seems to work for the Sybase DBAPI module.
I do have access to lots and lots of different Sybase stuff so I will
start from your patched 

[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-02-27 Thread Michael Bayer

phrrn...@googlemail.com wrote:

 I want to automatically set IDENTITY_INSERT for a table if the
 identity column is explicitly listed. Likewise, after execution of an
 insert on a table with an identity column we want to retrieve the
 identity value.

 Any idea why the following code would cause the connection to be
 checked in between the pre_exec() and the actual execution of the
 statement?  I have enabled high levels of debugging on the python-
 sybase driver and can see that a new connection is made *after* the
 'SET IDENTITY_INSERT foo ON'  and the actual command runs on that new
 connection (and, of course, fails as IDENTITY_INSERT is not enabled on
 that table for the new connection).

Assuming you took that code from the MSSQL dialect, it should be fine. 
that looks like an older version of it, though...in 0.6 take a look in
mssql/base.py for the latest version of the IDENTITY_INSERT dance.

but no there's nothing in there in any case that would cause a second
connection to be checked out. throw a pdb into the Connection constructor,
or perhaps in pool.connect(), to track where that's coming from.

sorry you're in deep  :)




 pjjH


 class SybaseSQLExecutionContext(default.DefaultExecutionContext):
     def _table_identity_column(self, t):
         """Return the name of this table's identity column"""
         # negative caching
         if not hasattr(t, '_identity_column'):
             t._identity_column = None
             s = r"SELECT cols.name FROM syscolumns AS cols JOIN sysobjects AS o ON (cols.id = o.id) WHERE o.name = '%s' AND cols.status & 0x80 = 0x80" % (t.name)
             self.cursor.execute(s)
             r = self.cursor.fetchone()
             if r:
                 t._identity_column = r[0]
         return t._identity_column

     def pre_exec(self):
         self.HAS_IDENTITY = False
         self.IDENTITY_INSERT = False
         # What about UPDATE statements? Is this even possible in Sybase?
         if self.compiled.isinsert:
             if self._table_identity_column(self.compiled.statement.table):
                 self.HAS_IDENTITY = True
                 identity_column = self._table_identity_column(self.compiled.statement.table)
                 if identity_column in self.compiled_parameters[0].keys():
                     self.IDENTITY_INSERT = True
                     self.cursor.execute("SET IDENTITY_INSERT %s ON" %
                         self.dialect.identifier_preparer.format_table(self.compiled.statement.table))

     def post_exec(self):
         if self.HAS_IDENTITY:
             self.cursor.execute("SELECT @@identity AS lastrowid")
             lastrowid = self.cursor.fetchone()[0]
             if lastrowid > 0:
                 if not hasattr(self, '_last_inserted_ids') or self._last_inserted_ids is None:
                     self._last_inserted_ids = [lastrowid]
                 else:
                     self._last_inserted_ids = [lastrowid] + self._last_inserted_ids[1:]

         if self.IDENTITY_INSERT:
             self.cursor.execute("SET IDENTITY_INSERT %s OFF" %
                 self.dialect.identifier_preparer.format_table(self.compiled.statement.table))


 On Feb 27, 2:02 pm, phrrn...@googlemail.com
 phrrn...@googlemail.com wrote:
 OK. I will do the development work against the 0.6 tree. I may end up
 backporting it to 0.5 as I want to get it into use at work as soon as
 is reasonable (which may be prior to the 0.6 release)

 pjjH

 On Feb 27, 11:29 am, Michael Bayer mike...@zzzcomputing.com wrote:

  phrrn...@googlemail.com wrote:

   How does one deal with driver-specific unit tests? I am running into
   difficulties in testing the pyodbc and python-sybase drivers for the
   sybase dialect. For example, test_raw_qmark works with the pyodbc
   driver (as it supports that style) but not with the python-sybase
   driver. Is there some decorator available that can help with skipping
   certain tests for a given DBAPI driver? Any suggestions on how to
   handle this?

  most tests make use of decorators like @testing.fails_on to mark
 various
  databases as unsupported.  That test in particular is very specific to
  certain DBAPIs, i.e. those that support qmark bind parameters.  For
 the
  lesser databases like MSSQL and Firebird, i.e. those which have lots
 of
  missing features, hundreds of decorators are configured to exclude
 them.
  You would have a similar task in the case of sybase.

  But to be specific regarding pyodbc vs. python-sybase, that is exactly
  what's addressed in SQLA 0.6.  If you look there you'll see the
 decorators
  can differentiate among multiple DBAPIs for the same dialect, i.e.
  sybase+pyodbc vs. sybase+python-sybase in this case.  There is also a
  coherent non-guesswork system of using specific drivers.

  just so you know we'd really like SQLA 0.6 to be released soon after
  pycon.   There's not that much work to be done on it for a release.
 The
  only reason it's a major number is because the API for dialects does
  change 

[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-02-27 Thread phrrn...@googlemail.com

Yes, it is based off the mssql code but I made some modifications to
it to take care of situations like, for example, where there is an
identity column but it is not the primary key (and hence not a
'sequence'). This means a read off the catalog to find the identity
column (I believe that only one identity column is permitted per
table). I was wondering if some 'bad thing' happens if you execute a
select on the cursor and retrieve results when you are in the
pre_exec.

I don't know what you are referring to when you say 'throw a pdb' .. I
hope it has something to do with the debugger!

As for being in deep, I am afraid we are only starting: Sybase has
enough 'special' stuff to keep us busy for a long time, e.g. cross-
database referential integrity constraints, database-specific default
schemas (e.g. login foo may have schema 'dbo' in database apple but
schema 'guest' in database pear and schema 'prod' in database banana).
Then what does one do about remote objects mapped in via CIS (e.g.
REMOTE.production.dbo.very_important_table)? (Actually this is a
problem with SQL Server also.)

pjjH


On Feb 27, 4:05 pm, Michael Bayer mike...@zzzcomputing.com wrote:
 phrrn...@googlemail.com wrote:

  I want to automatically set IDENTITY_INSERT for a table if the
  identity column is explicitly listed. Likewise, after execution of an
  insert on a table with an identity column we want to retrieve the
  identity value.

  Any idea why the following code would cause the connection to be
  checked in between the pre_exec() and the actual execution of the
  statement?  I have enabled high levels of debugging on the python-
  sybase driver and can see that a new connection is made *after* the
  'SET IDENTITY_INSERT foo ON'  and the actual command runs on that new
  connection (and, of course, fails as IDENTITY_INSERT is not enabled on
  that table for the new connection).

 Assuming you took that code from the MSSQL dialect, it should be fine.
 that looks like an older version of it, though...in 0.6 take a look in
 mssql/base.py for the latest version of the IDENTITY_INSERT dance.

 but no there's nothing in there in any case that would cause a second
 connection to be checked out. throw a pdb into the Connection constructor,
 or perhaps in pool.connect(), to track where that's coming from.

 sorry you're in deep  :)



  pjjH

  class SybaseSQLExecutionContext(default.DefaultExecutionContext):
      def _table_identity_column(self, t):
          """Return the name of this table's identity column"""
          # negative caching
          if not hasattr(t, '_identity_column'):
              t._identity_column = None
              s = r"SELECT cols.name FROM syscolumns AS cols JOIN sysobjects AS o ON (cols.id = o.id) WHERE o.name = '%s' AND cols.status & 0x80 = 0x80" % (t.name)
              self.cursor.execute(s)
              r = self.cursor.fetchone()
              if r:
                  t._identity_column = r[0]
          return t._identity_column

      def pre_exec(self):
          self.HAS_IDENTITY = False
          self.IDENTITY_INSERT = False
          # What about UPDATE statements? Is this even possible in Sybase?
          if self.compiled.isinsert:
              if self._table_identity_column(self.compiled.statement.table):
                  self.HAS_IDENTITY = True
                  identity_column = self._table_identity_column(self.compiled.statement.table)
                  if identity_column in self.compiled_parameters[0].keys():
                      self.IDENTITY_INSERT = True
                      self.cursor.execute("SET IDENTITY_INSERT %s ON" %
                          self.dialect.identifier_preparer.format_table(self.compiled.statement.table))

      def post_exec(self):
          if self.HAS_IDENTITY:
              self.cursor.execute("SELECT @@identity AS lastrowid")
              lastrowid = self.cursor.fetchone()[0]
              if lastrowid > 0:
                  if not hasattr(self, '_last_inserted_ids') or self._last_inserted_ids is None:
                      self._last_inserted_ids = [lastrowid]
                  else:
                      self._last_inserted_ids = [lastrowid] + self._last_inserted_ids[1:]

          if self.IDENTITY_INSERT:
              self.cursor.execute("SET IDENTITY_INSERT %s OFF" %
                  self.dialect.identifier_preparer.format_table(self.compiled.statement.table))

  On Feb 27, 2:02 pm, phrrn...@googlemail.com
  phrrn...@googlemail.com wrote:
  OK. I will do the development work against the 0.6 tree. I may end up
  backporting it to 0.5 as I want to get it into use at work as soon as
  is reasonable (which may be prior to the 0.6 release)

  pjjH

  On Feb 27, 11:29 am, Michael Bayer mike...@zzzcomputing.com wrote:

   phrrn...@googlemail.com wrote:

How does one deal with driver-specific unit tests? I am running into
difficulties in testing the pyodbc and python-sybase drivers for the
sybase dialect. For example, test_raw_qmark works with the pyodbc

[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-02-27 Thread Michael Bayer

phrrn...@googlemail.com wrote:

 Yes, it is based off the mssql code but I made some modifications to
 it to take care of situations like, for example, where there is an
 identity column but it is not the primary key (and hence not a
 'sequence'). This means a read off the catalog to find the identity
 column (I believe that only one identity column is permitted per
 table). I was wondering if some 'bad thing' happens if you execute a
 select on the cursor and retrieve results when you are in the
 pre_exec.

 I don't know what you are referring to when you say 'throw a pdb' .. I
 hope it has something to do with the debugger!

 As for being in deep, I am afraid we are only starting: Sybase has
 enough 'special' stuff to keep us busy for a long time, e.g. cross-
 database referential integrity constraints, database-specific default
 schemas (e.g. login foo may have schema 'dbo' in database apple but
 schema 'guest' in database pear and schema 'prod' in database banana).
 Then what does one do about remote objects mapped in via CIS (e.g.
 REMOTE.production.dbo.very_important_table)? (Actually this is a
 problem with SQL Server also.)

OK yeah there's a lot of stuff, but to just have a dialect we should
focus first on the basics - that an INSERT works, etc.  all of our
dialects leave a lot of database-specific functionality uncovered.  Cross
DB constraints in particular is something we already support for most
systems, since from our end it's just a prefix on a table name.
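
That "just a prefix" point can be sketched as follows. The format_table below is a toy stand-in for the dialect's identifier preparer, and the double-quote quoting style is an assumption for illustration:

```python
# Toy stand-in for an identifier preparer, illustrating that a cross-database
# reference is just a dotted prefix on the table name. The double-quote
# identifier quoting is an assumption for illustration only.
def format_table(name, schema=None):
    parts = (schema.split('.') if schema else []) + [name]
    return '.'.join('"%s"' % p for p in parts)
```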

by throw a pdb i mean add pdb.set_trace() to the code to set a
breakpoint.
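
Concretely, that looks like the following; the Connection class here is a stand-in, not sqlalchemy.engine.Connection:

```python
# "Throw a pdb": add pdb.set_trace() where the mystery checkout happens, e.g.
# the Connection constructor, then inspect the call stack interactively to
# see who is constructing the second connection.
import pdb

class Connection(object):  # stand-in, not sqlalchemy.engine.Connection
    def __init__(self, engine, raw_connection):
        # pdb.set_trace()  # uncomment to break here and walk the stack ('w')
        self.engine = engine
        self.raw_connection = raw_connection
```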



 pjjH


 On Feb 27, 4:05 pm, Michael Bayer mike...@zzzcomputing.com wrote:
 phrrn...@googlemail.com wrote:

  I want to automatically set IDENTITY_INSERT for a table if the
  identity column is explicitly listed. Likewise, after execution of an
  insert on a table with an identity column we want to retrieve the
  identity value.

  Any idea why the following code would cause the connection to be
  checked in between the pre_exec() and the actual execution of the
  statement?  I have enabled high levels of debugging on the python-
  sybase driver and can see that a new connection is made *after* the
  'SET IDENTITY_INSERT foo ON'  and the actual command runs on that new
  connection (and, of course, fails as IDENTITY_INSERT is not enabled on
  that table for the new connection).

 Assuming you took that code from the MSSQL dialect, it should be fine.
 that looks like an older version of it, though...in 0.6 take a look in
 mssql/base.py for the latest version of the IDENTITY_INSERT dance.

 but no there's nothing in there in any case that would cause a second
 connection to be checked out. throw a pdb into the Connection
 constructor,
 or perhaps in pool.connect(), to track where that's coming from.

 sorry you're in deep  :)



  pjjH

  class SybaseSQLExecutionContext(default.DefaultExecutionContext):
      def _table_identity_column(self, t):
          """Return the name of this table's identity column."""
          # negative caching
          if not hasattr(t, '_identity_column'):
              t._identity_column = None
              s = ("SELECT cols.name FROM syscolumns AS cols "
                   "JOIN sysobjects AS o ON (cols.id = o.id) "
                   "WHERE o.name = '%s' AND cols.status & 0x80 = 0x80"
                   % t.name)
              self.cursor.execute(s)
              r = self.cursor.fetchone()
              if r:
                  t._identity_column = r[0]
          return t._identity_column

      def pre_exec(self):
          self.HAS_IDENTITY = False
          self.IDENTITY_INSERT = False
          # What about UPDATE statements? Is this even possible in Sybase?
          if self.compiled.isinsert:
              identity_column = self._table_identity_column(
                  self.compiled.statement.table)
              if identity_column:
                  self.HAS_IDENTITY = True
                  if identity_column in self.compiled_parameters[0].keys():
                      self.IDENTITY_INSERT = True
                      self.cursor.execute(
                          "SET IDENTITY_INSERT %s ON" %
                          self.dialect.identifier_preparer.format_table(
                              self.compiled.statement.table))

      def post_exec(self):
          if self.HAS_IDENTITY:
              self.cursor.execute("SELECT @@identity AS lastrowid")
              lastrowid = self.cursor.fetchone()[0]
              if lastrowid > 0:
                  if not hasattr(self, '_last_inserted_ids') or \
                          self._last_inserted_ids is None:
                      self._last_inserted_ids = [lastrowid]
                  else:
                      self._last_inserted_ids = \
                          [lastrowid] + self._last_inserted_ids[1:]

          if self.IDENTITY_INSERT:
              self.cursor.execute(
                  "SET IDENTITY_INSERT %s OFF" %
                  self.dialect.identifier_preparer.format_table(
                      self.compiled.statement.table))
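Stripped of the execution-context plumbing, the bracketing that pre_exec()/post_exec() perform amounts to the following; a stand-alone sketch (function and table names are mine, not part of the dialect):

```python
def bracket_identity_insert(table, insert_sql):
    # Emit the statements in the order Sybase requires: enable
    # IDENTITY_INSERT for the table, run the INSERT that supplies an
    # explicit identity value, then disable it again -- all three must
    # run on the same connection or the setting is lost.
    return ["SET IDENTITY_INSERT %s ON" % table,
            insert_sql,
            "SET IDENTITY_INSERT %s OFF" % table]
```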

  On Feb 27, 2:02 pm, phrrn...@googlemail.com
  phrrn...@googlemail.com wrote:
  OK. I will do the development work against the 0.6 tree. 

[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-02-26 Thread Michael Bayer

On Feb 26, 2009, at 3:55 PM, phrrn...@googlemail.com wrote:


 I am doing some work on a SA engine for Sybase Adaptive Server
 Enterprise (ASE) on top of both pyodbc and the Sybase DB-API driver.
 The existing sybase engine for SA only works with Sybase Anywhere
 (ASA).

That is correct; I've recently had to take a look at this driver and
realized that it was not really written for Sybase at all, and the
original author's whereabouts are unknown.  To that end I would like it
to be replaced with an actual Sybase driver.

 There is a problem with named parameters with the Sybase driver in
 that the placeholders are prepended with an '@' *and* the execute
 method expects any dict parameters to have keys that also have an
 '@'.  I was able to get the placeholders generated correctly by
 subclassing the compiler.  Any suggestions on how to get the execute
 method to work nicely, or do I have to do so much mucking around with
 copying parameters or monkeypatching the Sybase module with an
 implementation of execute that will work with 'ordinary' dictionaries?

the attached patch, which represents my partial progress, addresses
this.  Unfortunately I was not able to continue, since I was developing
from a Mac against a development server, and it turns out that connecting
with the Sybase driver using FreeTDS renders bind parameters
inoperable.  After several days of attempting to get the developer
edition of Sybase ASE running in a virtual Linux environment
(apparently it only works on older versions of Ubuntu/Fedora, but even
after installing those, I was unsuccessful), I gave up.

If you have access to a working Sybase ASE environment, you can have
free rein over the sybase.py dialect - anything specific to SQL
Anywhere can be removed, since it's an obsolete product and if it were
supported, it would be in its own dialect.  The Sybase driver may
be targeted towards the 0.6 release of SQLAlchemy.  Version 0.6 is
oriented around a dialect refactor and schema expression refactor
(there are no ORM changes) and would be a much better place to start
building out new drivers - there are some significant differences in
how dialects are constructed between 0.5 and 0.6.




--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google Groups 
sqlalchemy group.
To post to this group, send email to sqlalchemy@googlegroups.com
To unsubscribe from this group, send email to 
sqlalchemy+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/sqlalchemy?hl=en
-~--~~~~--~~--~--~---



sybase.patch
Description: Binary data



[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-02-26 Thread phrrn...@googlemail.com

Thanks Michael. I have a sybase.py passing *some* unit tests with both
pyodbc and the Sybase driver, both running on Solaris 10 x86 against
ASE 15. This is a hack that seems to work for the Sybase DBAPI module.
I do have access to lots and lots of different Sybase stuff so I will
start from your patched version and reintegrate my schema
introspection and other stuff. Do you have a ticket open for the
sybase driver yet? Where should I send the patches?

pjjH

    def do_execute(self, cursor, statement, parameters, context=None,
                   **kwargs):
        if self.paramstyle == 'named':
            # prepend the parameter names with an '@'
            hacked_args = dict(('@' + n, v) for n, v in parameters.items())
            super(SybaseSQLDialect_Sybase, self).do_execute(
                cursor, statement, hacked_args, context=context, **kwargs)
        else:
            super(SybaseSQLDialect_Sybase, self).do_execute(
                cursor, statement, parameters, context=context, **kwargs)

    def create_connect_args(self, url):
        opts = url.translate_connect_args()
        opts.update(url.query)

        self.autocommit = False
        if 'autocommit' in opts:
            self.autocommit = bool(int(opts.pop('autocommit')))

        dictArgs = {
            # stop the annoying diagnostics from the module
            'datetime': 'python',
            # the named argument is 'auto_commit', not 'autocommit'
            'auto_commit': self.autocommit,
        }

        if 'database' in opts:
            dictArgs['database'] = opts['database']

        return ([opts['host'], opts['username'], opts['password']],
                dictArgs)
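The '@' key rewrite above can be exercised in isolation; a minimal sketch (helper name is mine).  The str() coercion also guards against unicode keys, which make the Sybase module's buf.name assignment raise TypeError:

```python
def adapt_named_params(parameters):
    # The Sybase DB-API wants dict keys spelled like the @name
    # placeholders in the compiled statement; str() coerces any
    # unicode keys, which the module otherwise rejects.
    return dict(('@' + str(k), v) for k, v in parameters.items())
```

For example, `adapt_named_params({'id': 1, 'name': 'x'})` produces `{'@id': 1, '@name': 'x'}`, ready to hand to cursor.execute alongside the @-style placeholders.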


On Feb 26, 4:30 pm, Michael Bayer mike...@zzzcomputing.com wrote:
 On Feb 26, 2009, at 3:55 PM, phrrn...@googlemail.com wrote:



  I am doing some work on a SA engine for Sybase Adaptive Server
  Enterprise (ASE) on top of both pyodbc and the Sybase DB-API driver.
  The existing sybase engine for SA only works with Sybase Anywhere
  (ASA).

 That is correct; I've recently had to take a look at this driver and
 realized that it was not really written for Sybase at all, and the
 original author's whereabouts are unknown.  To that end I would like it
 to be replaced with an actual Sybase driver.

  There is a problem with named parameters with the Sybase driver in
  that the placeholders are prepended with an '@' *and* the execute
  method expects any dict parameters to have keys that also have an
  '@'.  I was able to get the placeholders generated correctly by
  subclassing the compiler.  Any suggestions on how to get the execute
  method to work nicely, or do I have to do so much mucking around with
  copying parameters or monkeypatching the Sybase module with an
  implementation of execute that will work with 'ordinary' dictionaries?

 the attached patch, which represents my partial progress, addresses
 this.  Unfortunately I was not able to continue, since I was developing
 from a Mac against a development server, and it turns out that connecting
 with the Sybase driver using FreeTDS renders bind parameters
 inoperable.  After several days of attempting to get the developer
 edition of Sybase ASE running in a virtual Linux environment
 (apparently it only works on older versions of Ubuntu/Fedora, but even
 after installing those, I was unsuccessful), I gave up.

 If you have access to a working Sybase ASE environment, you can have
 free rein over the sybase.py dialect - anything specific to SQL
 Anywhere can be removed, since it's an obsolete product and if it were
 supported, it would be in its own dialect.  The Sybase driver may
 be targeted towards the 0.6 release of SQLAlchemy.  Version 0.6 is
 oriented around a dialect refactor and schema expression refactor
 (there are no ORM changes) and would be a much better place to start
 building out new drivers - there are some significant differences in
 how dialects are constructed between 0.5 and 0.6.

  sybase.patch (12K attachment)





[sqlalchemy] Re: Sybase DB-API driver uses @foo for placeholder names and expect parameter dict keys to be similarly named

2009-02-26 Thread Michael Bayer

we have ticket 785 for this:

http://www.sqlalchemy.org/trac/ticket/785


On Feb 26, 2009, at 4:45 PM, phrrn...@googlemail.com wrote:


 Thanks Michael. I have a sybase.py passing *some* unit tests with both
 pyodbc and the Sybase driver, both running on Solaris 10 x86 against
 ASE 15. This is a hack that seems to work for the Sybase DBAPI module.
 I do have access to lots and lots of different Sybase stuff so I will
 start from your patched version and reintegrate my schema
 introspection and other stuff. Do you have a ticket open for the
 sybase driver yet? Where should I send the patches?

 pjjH

    def do_execute(self, cursor, statement, parameters, context=None,
                   **kwargs):
        if self.paramstyle == 'named':
            # prepend the parameter names with an '@'
            hacked_args = dict(('@' + n, v) for n, v in parameters.items())
            super(SybaseSQLDialect_Sybase, self).do_execute(
                cursor, statement, hacked_args, context=context, **kwargs)
        else:
            super(SybaseSQLDialect_Sybase, self).do_execute(
                cursor, statement, parameters, context=context, **kwargs)

    def create_connect_args(self, url):
        opts = url.translate_connect_args()
        opts.update(url.query)

        self.autocommit = False
        if 'autocommit' in opts:
            self.autocommit = bool(int(opts.pop('autocommit')))

        dictArgs = {
            # stop the annoying diagnostics from the module
            'datetime': 'python',
            # the named argument is 'auto_commit', not 'autocommit'
            'auto_commit': self.autocommit,
        }

        if 'database' in opts:
            dictArgs['database'] = opts['database']

        return ([opts['host'], opts['username'], opts['password']],
                dictArgs)



