Re: Insert problems with InnoDB (big table)

2004-08-06 Thread Luc Charland
Harrison wrote: Hi, sorry about the long delay in the reply. I will be away for the next 2 weeks, but I will follow this thread if anything new comes up. Hi, A few more ideas you can try: 1. SET UNIQUE_CHECKS=0; You have a unique key that is quite large (model, id, name). If you know the da

Re: Insert problems with InnoDB (big table)

2004-08-05 Thread Harrison
Hi, A few more ideas you can try: 1. SET UNIQUE_CHECKS=0; You have a unique key that is quite large (model, id, name). If you know the data is already unique (i.e. importing from another data source), then this can speed up the import *a lot*. 2. SET FOREIGN_KEY_CHECKS=0; You didn't mention i
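For reference, a minimal sketch of the session settings described above, issued on the import connection only (these variables affect just the current session; the table being loaded is whatever the import targets):

    SET UNIQUE_CHECKS=0;        -- skip duplicate-key checking; only safe if the data is known to be unique
    SET FOREIGN_KEY_CHECKS=0;   -- skip foreign key validation during the load
    -- ... run the bulk INSERT or LOAD DATA statements here ...
    SET UNIQUE_CHECKS=1;        -- restore normal checking once the import is finished
    SET FOREIGN_KEY_CHECKS=1;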

Re: Insert problems with InnoDB (big table)

2004-08-05 Thread Luc Charland
[EMAIL PROTECTED] wrote: Dear Luc, On Thursday, August 5, 2004, at 11:07:23, you wrote: Did you try disabling the table's indexes? When you import millions of records there is a lot of overhead from indexing them. First import the data and then create your indexes, or: ALTER TABLE tb_name DISABLE KEYS; import data... ALTER
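Spelled out, the approach suggested here looks roughly like the sketch below (tb_name is the placeholder from the message; note that DISABLE KEYS affects only nonunique indexes and is ignored on InnoDB tables, so for the InnoDB table in this thread the "create the indexes after the import" variant is the one that applies):

    ALTER TABLE tb_name DISABLE KEYS;   -- stop maintaining nonunique indexes while loading (MyISAM)
    -- ... import the data here ...
    ALTER TABLE tb_name ENABLE KEYS;    -- rebuild the indexes in a single pass afterwards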

Re: Insert problems with InnoDB (big table)

2004-08-05 Thread Luc Charland
Hi Frank, you actually got me doubting here. We don't use the mysql client, but I made sure that autocommit was turned off. I double checked (with select count(*) from smalltest) to see that the inserts were in fact committed in chunks of 100,000 and not one by one, and they were. We still see exactly the s

Re: Insert problems with InnoDB (big table)

2004-08-05 Thread Luc Charland
any change, but I don't expect it. Luc -Original Message- From: Luc Charland [mailto:[EMAIL PROTECTED] Sent: Tuesday, August 03, 2004 7:54 PM To: [EMAIL PROTECTED] Subject: Insert problems with InnoDB (big table) We are evaluating the replacement of a Sybase database with MySQL

Re: Insert problems with InnoDB (big table)

2004-08-04 Thread David Griffiths
inserts are complete? -Original Message- From: Luc Charland [mailto:[EMAIL PROTECTED] Sent: Tuesday, August 03, 2004 7:54 PM To: [EMAIL PROTECTED] Subject: Insert problems with InnoDB (big table) We are evaluating the replacement of a Sybase database with MySQL. The databases are 60+GB, cont

RE: Insert problems with InnoDB (big table)

2004-08-04 Thread Amit_Wadhwa
Are you disabling autocommit before doing the inserts? And committing after all inserts are complete? -Original Message- From: Luc Charland [mailto:[EMAIL PROTECTED] Sent: Tuesday, August 03, 2004 7:54 PM To: [EMAIL PROTECTED] Subject: Insert problems with InnoDB (big table) We are

Re: Insert problems with InnoDB (big table)

2004-08-04 Thread Dr. Frank Ullrich
Luc, do you use the mysql client for the insert operations? And is autocommit set to "yes"? If so, the answer is: turn off autocommit mode and commit in batches of rows, every 100,000 rows for example. This speeds up the whole thing a lot because there is no need for a disk flush after every record
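A minimal sketch of the batching described above, using the column names mentioned elsewhere in the thread (model, id, name); bigtable is a hypothetical stand-in for the real table:

    SET autocommit=0;                   -- no disk flush after every single row
    INSERT INTO bigtable (model, id, name) VALUES ('m1', 1, 'row 1');
    INSERT INTO bigtable (model, id, name) VALUES ('m1', 2, 'row 2');
    -- ... keep inserting ...
    COMMIT;                             -- commit once per batch, e.g. every 100,000 rows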

Insert problems with InnoDB (big table)

2004-08-03 Thread Luc Charland
We are evaluating the replacement of a Sybase database with MySQL. The databases are 60+GB, containing more than 100 tables. Since we need transactions, that implies InnoDB. We were happy with the early results, but we hit a major roadblock when trying to import the biggest table (20+GB, with 4

Re: insert problems

2002-03-07 Thread BD
At 01:40 PM 3/7/2002, you wrote: >Are there any documented reasons why mysql would only enter certain fields >and "skip" others? >I am having some strange problems where it seems that data are randomly >"dropped" from inserts via web forms. > >Have posted before stating a possible cache or proxy

insert problems

2002-03-07 Thread Petre Agenbag
Are there any documented reasons why mysql would only enter certain fields and "skip" others? I am having some strange problems where it seems that data are randomly "dropped" from inserts via web forms. I have posted before stating a possible cache or proxy problem, but I am still at a loss...

Re: DBI/Perl insert problems

2001-11-24 Thread Duncan Hill
On Sat, 24 Nov 2001, Vic wrote: > #!/usr/bin/perl -w > use CGI::Carp qw(fatalsToBrowser); > use CGI; > use DBI; # Version 1.20 > use DBD::mysql; # Version 2.09 $q = new CGI; $first = $q->param('first'); $last = $q->param('last'); > print "Content-type: text/html\n\n "; > > $dbh = DBI-