On 23/10/2014 01:07, Robin Sheat wrote:
Paul Poulain wrote on Tue 21-10-2014 at 09:56 [+0200]:
mmm... I agree with this point, BUT, for large changes, MySQL really
does not cope well with a single huge transaction. In that case, the
only way is to use smaller transactions.

I've never noticed that, but perhaps the things I'm usually doing aren't
quite big enough. However, it's not a big deal to batch up (say) 1,000
operations and then commit or skip the commit as needed. It does cost
you some error recovery, but at least you can still do a real dry-run
test and it will fail safely if something is wrong.
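The batching idea above can be sketched roughly as follows. This is a minimal illustration, not Koha code: it uses Python's stdlib sqlite3 in place of MySQL, and the table name, id column, and batch size are stand-ins.

```python
import sqlite3

BATCH_SIZE = 1000  # commit every 1,000 operations instead of one huge transaction

def batched_delete(conn, ids, dry_run=False):
    """Delete rows in batches, committing (or rolling back) each batch separately."""
    cur = conn.cursor()
    deleted = 0
    for i in range(0, len(ids), BATCH_SIZE):
        batch = ids[i:i + BATCH_SIZE]
        cur.executemany("DELETE FROM biblio WHERE biblionumber = ?",
                        [(x,) for x in batch])
        # executemany sums affected rows; fall back to the batch size if unknown
        deleted += cur.rowcount if cur.rowcount != -1 else len(batch)
        if dry_run:
            conn.rollback()  # dry run: undo this batch, keep the data intact
        else:
            conn.commit()    # small transaction instead of one giant one
    return deleted

# usage: a dry run deletes nothing but reports what it would have done
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE biblio (biblionumber INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO biblio VALUES (?, ?)",
                 [(n, "Title %d" % n) for n in range(2500)])
conn.commit()
n = batched_delete(conn, list(range(2500)), dry_run=True)
remaining = conn.execute("SELECT COUNT(*) FROM biblio").fetchone()[0]
```

Each batch is its own transaction, so an error mid-run only loses that batch, and a dry run can roll back batch by batch without ever holding a million-row transaction open.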
Yep, 1,000 is not a big deal. I'm talking here about 1,000,000 MySQL operations (on a large catalogue that's easy to reach, with constraints: something like DELETE FROM biblio WHERE title LIKE "A%" will also have an impact on biblioitems, which has an impact on items, which has an impact on issues, which has ...)
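The cascade being described can be demonstrated with a toy schema. This is a simplified stand-in for Koha's real tables (again sqlite3 rather than MySQL; the column names are illustrative only): one DELETE on biblio ripples through biblioitems into items via ON DELETE CASCADE constraints.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # sqlite disables FK enforcement by default
conn.executescript("""
CREATE TABLE biblio (biblionumber INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE biblioitems (
    biblioitemnumber INTEGER PRIMARY KEY,
    biblionumber INTEGER REFERENCES biblio ON DELETE CASCADE);
CREATE TABLE items (
    itemnumber INTEGER PRIMARY KEY,
    biblioitemnumber INTEGER REFERENCES biblioitems ON DELETE CASCADE);
""")
conn.execute("INSERT INTO biblio VALUES (1, 'A tale')")
conn.execute("INSERT INTO biblioitems VALUES (10, 1)")
conn.execute("INSERT INTO items VALUES (100, 10)")

# one statement on biblio, but three tables' rows disappear
conn.execute("DELETE FROM biblio WHERE title LIKE 'A%'")
items_left = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
```

This is why the row count of the original DELETE badly underestimates the size of the transaction: every cascaded child row is part of the same transaction too.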

Or we could run postgres :)
Troll detected :D :D


--
Paul Poulain, Associé-gérant / co-owner
BibLibre, expert du logiciel libre pour les bibliothèques
BibLibre, Open Source software for libraries expert
_______________________________________________
Koha-devel mailing list
[email protected]
http://lists.koha-community.org/cgi-bin/mailman/listinfo/koha-devel
website : http://www.koha-community.org/
git : http://git.koha-community.org/
bugs : http://bugs.koha-community.org/
