Hi,
I have a Berkeley DB (BDB) table containing about 50,000 rows, on which I run the
following transaction (pseudocode follows):

BEGIN WORK;
DELETE FROM mytable WHERE myfield = 'boo';  /* deletes about 100 rows */
for (i = 0; i < 100; i++) {
    INSERT INTO mytable VALUES (...);
}
COMMIT;

During the INSERT statements I get the following error:
"Lock table is out of available locks"

I tried to resolve the problem by starting mysqld with -O
bdb_max_lock=60000, and later with -O bdb_max_lock=120000, but I still
get the same error.
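(In case it matters, here is how I set the variable persistently instead of on the
command line; the file path is an assumption, adjust for your installation. This
assumes a MySQL version that still ships the BDB engine:)

```ini
# /etc/my.cnf (assumed location)
[mysqld]
bdb_max_lock = 120000
```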

The BDB-related variables follow:

mysql> show variables like "%bdb%";
+---------------------+--------------------------------------------------------+
| Variable_name       | Value                                                  |
+---------------------+--------------------------------------------------------+
| bdb_cache_size      | 8388600                                                |
| bdb_home            | /var/lib/mysql/                                        |
| bdb_log_buffer_size | 32768                                                  |
| bdb_logdir          |                                                        |
| bdb_max_lock        | 120000                                                 |
| bdb_shared_data     | OFF                                                    |
| bdb_tmpdir          | /tmp/                                                  |
| have_bdb            | YES                                                    |
| version_bdb         | Sleepycat Software: Berkeley DB 4.1.24: (May 13, 2005) |
+---------------------+--------------------------------------------------------+
9 rows in set (0.00 sec)

Any hint is welcome.

Thanks in advance for the help,
Marco

-- 
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]