[EMAIL PROTECTED]:
>> Sounds like a valid approach. Is anyone aware of a problem? I only
>> remember that there were strange bugs if you don't set it up with each
>> connection. Please go ahead and file a ticket for it.
>
> I'll file a ticket with the patch. We applied a patch yesterday for
> this to our local copy and it's been humming along all day.
Since the effect of SET TIME ZONE is reverted when the transaction
aborts, you need to COMMIT after setting it. Regarding SET TRANSACTION
ISOLATION LEVEL, this only works for the current transaction and thus
needs to be repeated. Of course, it wouldn't be necessary at all if the
default isolation level in the database server had been set to READ
COMMITTED ...

> [ Regarding "transaction isolation level AUTOCOMMIT" ]
> Not in postgresql, but in psycopg2 it is. You can set the isolation
> level to 3 different settings: autocommit, read committed, and
> serializable. Currently django's psycopg2 adaptor is using read
> committed, which is also the default psycopg2 setting. It's also
> postgresql's default setting. The difference between autocommit and
> read committed for psycopg2 is that in read committed mode every
> cursor created in psycopg2 does:
>
>   BEGIN; SET TRANSACTION ISOLATION LEVEL READ COMMITTED
>
> whenever the first execute on the cursor happens, and then it does an
> END/COMMIT/ROLLBACK when the cursor goes out of scope, gets committed
> or gets rolled back. You won't see this in django's query log (you
> won't see set timezone either afaik), but you will see it in the query
> logs if you turn those on in postgresql. In autocommit mode psycopg2
> does not do these transaction boundaries for you. Unless you need
> transaction boundaries (sometimes you do) this is pure overhead.

So AUTOCOMMIT means "don't create BEGIN/COMMIT statements", which
probably leads to an autocommit mode, since without BEGIN, each
statement will immediately cause a COMMIT. But isn't it strange to
subsume it under the isolation levels? Well, never mind.

> I also have a patch that switches psycopg2 to autocommit mode, which
> is essentially the same read committed mode on the postgresql side,
> but psycopg2 won't do these transaction boundaries for you. The patch
> has been running in production for us very well for a day so far.
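For what it's worth, the same two behaviours are visible in Python's
stdlib sqlite3 driver, which makes the overhead easy to demonstrate
without a PostgreSQL installation. This is only an analogy (sqlite3's
isolation_level parameter stands in for psycopg2's mode switch, the
semantics are not identical): with the default setting the driver opens
an implicit BEGIN before the first DML statement, with
isolation_level=None it runs in autocommit mode and issues no
transaction boundaries at all.

```python
import sqlite3

def in_transaction_after_insert(isolation_level):
    """Run a CREATE + INSERT and report whether the driver still has an
    open transaction afterwards (i.e. whether it issued a BEGIN)."""
    conn = sqlite3.connect(":memory:", isolation_level=isolation_level)
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.execute("INSERT INTO t VALUES (1)")
    state = conn.in_transaction
    conn.close()
    return state

# Default (deferred) mode: the driver wrapped the INSERT in a BEGIN,
# so a COMMIT/ROLLBACK is still pending -- the boundaries described above.
print(in_transaction_after_insert(""))    # True

# Autocommit mode: every statement commits immediately, nothing pending.
print(in_transaction_after_insert(None))  # False
```

The point carries over: in the default mode the driver pays for a
transaction per cursor whether you need one or not.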
> Database load is less than a third of what it was with the SET
> TIMEZONE and the read committed mode of psycopg2.

I don't know. Django can do transaction management (see e.g. the
transaction middleware), and to do this something needs to issue BEGIN
TRANSACTION statements. Django seems not to do this, so I guess it
relies on the psycopg backend. I doubt that this would work in
autocommit mode. If you want to dig into this, try to use the
transaction middleware with autocommit mode.

I'm a bit cautious with the "works fine for me" argument in these
cases. This all might be no problem for *you*. If you don't use any
transaction management, your default isolation level is READ COMMITTED
anyway, and the default timezone matches the one in your Django
settings, then you don't need any statements to set them at all. If you
want to make sure, test it with transaction management, set the default
database isolation level to SERIALIZABLE and the TIMEZONE to something
far away. Then make "problematic" queries and see if it still works ...

An optimization that could find out if the default settings of the
database server are alright and then skip the statements would solve
the problem better, IMO.

> Perhaps this is a bug in psycopg2 or an undocumented feature. I've
> copied Federico on this message so he can chime in.

Let's see :-)

>>> Also, modelobj.save() seems to do an existence test (SELECT 1
>>> FROM...) followed by an UPDATE or an INSERT. This will fail even in
>>> READ COMMITTED isolation mode in some cases, as another transaction
>>> can complete during the call to SELECT that creates the row. This is
>>> probably an edge case. See
>>> http://www.postgresql.org/docs/8.1/static/sql-update.html for an
>>> example of how to do this correctly.

>> This case isn't treated properly in the generic views, but how you
>> treat it is ultimately your own responsibility. You can simply
>> re-evaluate the validation when you get an error from the database.
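The pattern the linked PostgreSQL page describes is roughly "do the
UPDATE first and check the row count; INSERT only if nothing matched".
A minimal sketch of that shape, using the stdlib sqlite3 module and a
made-up kv table (under real concurrency you would additionally loop
and catch the duplicate-key error, as the docs show):

```python
import sqlite3

def upsert(conn, key, value):
    # Try the UPDATE first; cursor.rowcount tells us whether any row
    # matched. Only fall back to INSERT when nothing was updated.
    cur = conn.execute("UPDATE kv SET value = ? WHERE key = ?",
                       (value, key))
    if cur.rowcount == 0:
        conn.execute("INSERT INTO kv (key, value) VALUES (?, ?)",
                     (key, value))
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kv (key TEXT PRIMARY KEY, value TEXT)")
upsert(conn, "a", "1")   # takes the INSERT path
upsert(conn, "a", "2")   # takes the UPDATE path, no duplicate row
```

This avoids the SELECT-then-INSERT window entirely, because the UPDATE
itself is the existence test.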
>> I guess this is a web developer attitude ... users are used to
>> resending their post in case of strange error messages from the
>> server ;-) People coming from the GUI are used to treating such
>> collisions more carefully, since the average GUI client doesn't have
>> a refresh button.

> I'll file a bug for it then. Even if it's a wishlist type item, it
> probably shouldn't be forgotten about.

Fine!

>>> And finally, it would be really nice if django's DatabaseWrapper
>>> passed cursor parameters to the underlying engine. Then at least one
>>> could pass dict cursor factories into psycopg2 when using the low
>>> level interface.

>> I lost you there. What are you trying to do?

> from django.db import connection
> c = connection.cursor(cursor_factory=psycopg2.extras.DictCursor)
> c.execute(BLAH)
> print c.fetchone()['key']
>
> For example. Django's DatabaseWrapper does not pass args and kwargs
> downstream to the underlying dbapi adaptor, so the above call will
> fail, even though psycopg2 will accept such a parameter. There really
> should be a standard dict cursor in django which abstracts this, but
> since that seemed unimplemented I tried to use psycopg2's directly.
>
> Also, when using cursors directly, one often has to call
> connection._commit() and connection._rollback(). This feels very
> strange right now because of the prefix underscore, which usually
> signifies a private method.
>
> I should note here that we are using django in a server application
> built on Twisted (we also use it on the web app portion), so we use a
> lot of these low-level calls, because the ORM doesn't really seem
> sophisticated enough to abstract complex queries where the resulting
> data is not data for one of the model objects, and looping through the
> models generates really inefficient SQL. For lots of queries it works
> beautifully, though, and has helped us get code running faster, even
> if we have to go back and change some things to raw SQL later on.
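On the dict-cursor point: the stdlib sqlite3 module shows what such an
abstraction could look like. sqlite3.Row is only an analogue of
psycopg2's DictCursor here, not a claim that the two APIs match:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row   # rows now support access by column name
conn.execute("CREATE TABLE t (key TEXT, n INTEGER)")
conn.execute("INSERT INTO t VALUES ('hello', 42)")

row = conn.execute("SELECT key, n FROM t").fetchone()
print(row["key"], row["n"])      # hello 42 -- no positional indexing needed
```

A row factory set once on the connection covers every cursor, which is
the kind of abstraction a built-in Django dict cursor could provide
instead of passing cursor_factory through on each call.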
> I don't know that the _rollback/_commit functions would really be
> needed by people using the ORM inside of the web app framework, but
> certainly if you're using the ORM on its own, those are commonly used.

There's nothing wrong with using custom queries; Django does not try to
put everything under the hood of its ORM. If you can propose a way to
extend Django so that you can use the features of psycopg2 more
directly without sacrificing usability for other backends, it would be
a nice contribution!

You can control transactions in detail with Django's ORM, too, for
example with the @transaction.commit_manually decorator. See
http://www.djangoproject.com/documentation/transactions/
Or have you already tried this?

Cheers,
Michael
--
noris network AG - Deutschherrnstraße 15-19 - D-90429 Nürnberg
Tel +49-911-9352-0 - Fax +49-911-9352-100
http://www.noris.de - The IT-Outsourcing Company

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google
Groups "Django developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at
http://groups.google.com/group/django-developers?hl=en
-~----------~----~----~----~------~----~------~--~---
