RE: MaxDB performance versus Microsoft SQL performance

2004-05-25 Thread Samir Mishra
…looking into using MaxDB ourselves. Thanks. Samir. -----Original Message----- From: Arsen Pereymer [mailto:[EMAIL PROTECTED]] Sent: Thursday, May 20, 2004 19:16 To: [EMAIL PROTECTED] Subject: MaxDB performance versus Microsoft SQL performance Hello, Currently, we have a large Microsoft SQL…

RE: MaxDB performance versus Microsoft SQL performance

2004-05-21 Thread Becker, Holger
Arsen Pereymer wrote: > Currently, we have a large Microsoft SQL 2000 database containing over 100 tables with over 100 million rows. The purpose of the database is to house a data warehouse. It currently takes about 20 hours to run the ETL to populate the database. We have been doing…

RE: MaxDB performance versus Microsoft SQL performance

2004-05-20 Thread Zabach, Elke
Arsen Pereymer wrote: > Hello, Currently, we have a large Microsoft SQL 2000 database containing over 100 tables with over 100 million rows. The purpose of the database is to house a data warehouse. It currently takes about 20 hours to run the ETL to populate the database.…

Re: MaxDB performance versus Microsoft SQL performance

2004-05-20 Thread Mark Johnson
Another thing to try is to turn off auto-commit and commit the transaction at the end. With this number of records, though, a single transaction may fill the transaction log, so the inserts may need to be committed in smaller chunks. Our application (which does far fewer inserts) improved greatly by turning off auto-commit and committing…
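A minimal sketch of the chunked-commit pattern described above, written against Python's generic DB-API. It uses sqlite3 as a stand-in connection since no MaxDB driver is assumed to be available, and the chunk size of 1,000 is an illustrative guess, not a measured value; the same shape applies to any DB-API-compliant driver.

```python
import sqlite3

# Stand-in for the real warehouse connection; with MaxDB you would open
# a connection through its Python DB-API driver instead.
conn = sqlite3.connect(":memory:")
conn.isolation_level = None          # take manual control of transactions
cur = conn.cursor()
cur.execute("CREATE TABLE facts (id INTEGER, val TEXT)")

rows = [(i, "row-%d" % i) for i in range(10_000)]
CHUNK = 1_000                        # commit every 1,000 rows so the
                                     # transaction log never fills up

cur.execute("BEGIN")
for i, row in enumerate(rows, 1):
    cur.execute("INSERT INTO facts VALUES (?, ?)", row)
    if i % CHUNK == 0:
        conn.commit()                # end the current chunk...
        cur.execute("BEGIN")         # ...and immediately start the next
conn.commit()                        # commit any final partial chunk

print(cur.execute("SELECT COUNT(*) FROM facts").fetchone()[0])  # 10000
```

The trade-off is the one the post notes: one commit at the very end is fastest but risks exhausting the log; committing per chunk caps log growth at the cost of a few extra log flushes.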

Re: MaxDB performance versus Microsoft SQL performance

2004-05-20 Thread John L. Singleton
Hi Arsen, For any sort of huge import operation like that, it would be a good idea to turn off your logs temporarily. You can do this with a series of commands like this (the lines after the first are entered at the dbmcli prompt):

    dbmcli -d -u
    > db_admin
    > util_connect
    > util_execute SET LOG WRITER OFF
    > util_release

Obviously then, you can turn your logs back on…

MaxDB performance versus Microsoft SQL performance

2004-05-20 Thread Arsen Pereymer
Hello, Currently, we have a large Microsoft SQL 2000 database containing over 100 tables with over 100 million rows. The purpose of the database is to house a data warehouse. It currently takes about 20 hours to run the ETL to populate the database. We have been doing research and found that using…