The first time you run the script, it'll work fine since the objects are cached.
After that, it'll hang on the last line.
justin
#!/usr/bin/python
from sqlobject import *

# Connect to an on-disk SQLite database with query debugging turned on
sqlhub.processConnection = connectionForURI(
    "sqlite:///tmp/test.db?debug=true")

class Container(SQLObject):
    name = StringCol()
Being able to specify an ID of BIGINT (bigserial in Postgres) would
be one less thing to worry about.
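A rough, untested sketch of the workaround I have in mind: create the table by
hand with a BIGSERIAL id and point the class at it through sqlmeta, so SQLObject
never emits its default integer id column. The postgres URI, table name and class
name below are made up, and I'm assuming the connection's query() method is fine
for running raw DDL:

from sqlobject import SQLObject, StringCol, sqlhub, connectionForURI

sqlhub.processConnection = connectionForURI("postgres://user@localhost/test")

# Hand-written DDL instead of letting createTable() pick the id type
sqlhub.processConnection.query("""
    CREATE TABLE big_container (
        id   BIGSERIAL PRIMARY KEY,
        name TEXT
    )
""")

class BigContainer(SQLObject):
    class sqlmeta:
        table = 'big_container'   # use the hand-created table
    name = StringCol()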
Justin
On 1 Apr 2006, at 08:17, Jeremy Fitzhardinge wrote:
Jaime Wyant wrote:
Currently I have about 1.7 million rows in there. SQLObject
created the `id' column for the table as
Should the ability to add and remove columns and indexes be
present in the sqlapi layer? I'd suggest that it's needed so that we
can write tools that sit on top of SQLObject without having to write an
independent database abstraction library (repeating a lot of code no doubt).
Justin
Jonathan Ellis wrote:
There's already support to create and drop tables - it just needs a bit of
fleshing out.
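For whole tables, something along these lines already works (just a sketch,
reusing the Container class and connection from the script further up):

# Table-level DDL that SQLObject classes already expose
Container.createTable(ifNotExists=True)   # CREATE TABLE for the class
# ... populate / query ...
Container.dropTable(ifExists=True)        # DROP TABLE again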
Cheers,
Justin
Autocommit would switch back and forth depending on which
database connection you landed on.
Using transactions for all reads will get around this problem, but the
root cause is that autocommit was getting turned off and you would get a
transaction without asking for one.
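Roughly what I mean by using transactions for reads (untested sketch, assuming
a Container class like the one in the earlier script):

# Do the reads inside an explicit Transaction object instead of relying
# on whatever autocommit state the pooled connection happens to be in.
conn = sqlhub.processConnection
trans = conn.transaction()
try:
    for obj in Container.select(connection=trans):
        print obj.name
    trans.commit()
except:
    trans.rollback()
    raise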
--
- Justin
You'll need the attached patch in order for
transactions to work correctly. It was posted on the list but I'm not
sure if it was ever applied.
Without it, autocommit gets stuck turned off, and new transactions are
started without being requested.
--
- Justin
--- dbconnection.py 2005-09-29 02:22:20
You'd have to use a more
complicated query involving an AND() or something.
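Something like this is the kind of AND() query I mean (sketch; the Container
class and the values are just for illustration):

from sqlobject.sqlbuilder import AND

# Combine two conditions on the same table in one select
rows = Container.select(AND(Container.q.name == "foo",
                            Container.q.id > 1000))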
--
- Justin