From:                   "Peter J. Schoenster" <[EMAIL PROTECTED]>
Organization:           Schoenster
To:                     [EMAIL PROTECTED]
Date sent:              Sun, 6 May 2001 19:31:31 -0600
Subject:                RE: executing atomic transactions in DBI
Send reply to:          [EMAIL PROTECTED]
Priority:               normal

Sorry to interject so late in this thread, but

> Ummm ... maybe a difference in environment requires different
> methods? I got my code from below (or from the book) and tested
> it with PostgreSQL ... it worked: successful changes in the eval
> were rolled back when one update/insert failed in the same eval.
> 
> I didn't mention the need to have RaiseError on in my example 
> (crucial oversight).
> 
> I'd like to see code that implements multiple inserts/updates
> differently than the example below.
> 
> http://theoryx5.uwinnipeg.ca/CPAN/data/DBI/DBI.html
> 
> > The recommended way to implement robust transactions in Perl
> > applications is to use RaiseError and eval { ... } (which is very
> > fast, unlike eval "..."). For example: 
> > 
> >   $dbh->{AutoCommit} = 0;  # enable transactions, if possible
> >   $dbh->{RaiseError} = 1;
> >   eval {
> >       foo(...)        # do lots of work here
> >       bar(...)        # including inserts
> >       baz(...)        # and updates
> >       $dbh->commit;   # commit the changes if we get this far
> >   };

If the commit() is placed here, won't it always be executed, since
there are no die statements before it to stop code execution on
errors? Shouldn't the commit be made dependent on the value of $@,
like this:

if ($@) {
    warn "Transaction aborted because $@";
    $dbh->rollback;  # undo the incomplete changes
    # add other application on-error-clean-up code here
}
else {
    $dbh->commit();
}
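
For illustration, the complete pattern would then look something
like this (an untested sketch; the DSN, credentials and the
'accounts' table are placeholders I made up, not real code from
the thread):

use strict;
use warnings;
use DBI;

# placeholder DSN and credentials -- substitute your own database
my $dbh = DBI->connect("dbi:Pg:dbname=testdb", "user", "password",
                       { AutoCommit => 0, RaiseError => 1 })
    or die $DBI::errstr;

eval {
    # the 'accounts' table is made up for illustration
    $dbh->do("INSERT INTO accounts (name, balance) VALUES ('alice', 100)");
    $dbh->do("UPDATE accounts SET balance = balance - 10 WHERE name = 'alice'");
};
if ($@) {
    warn "Transaction aborted because $@";
    $dbh->rollback;  # undo the incomplete changes
    # add other application on-error-clean-up code here
}
else {
    $dbh->commit();  # commit only if no statement died inside the eval
}

$dbh->disconnect;

Note that $dbh->rollback can itself die (for example, if the
connection was lost), so it may be worth wrapping the rollback in
its own eval as well.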


Bodo

[EMAIL PROTECTED]

Dr. med. Bodo Eing
Institut fuer Medizinische Mikrobiologie
Klinische Virologie
v.-Stauffenbergstr. 36
48151 Muenster
Germany

Phone: +49 251 7793-111  Fax: +49 251 7793-104
