Hardware is a Dell PE2650 with dual Xeons (2GHz) and 4GB RAM.
I have a RAID 10 array for the data volumes.

The bottleneck would be disk. That's the problem I had, so I went RAID 10 on
the box.

The most I push is about 10k inserts, and it's not too bad. Complex queries may
suffer during insert times, but you can get around a lot of that with the right
table layout. I was using it to test live web stats in a web farm. I developed
the application, then ran into caching issues and haven't looked at it since.
MySQL did the right thing, though.
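To give you an idea of the layout I mean, here's a rough sketch only (the table
and column names are made up, and it assumes MyISAM tables, since MERGE only
works over identical MyISAM tables):

-- One MyISAM table per day, created on the fly by the entry code.
-- All inserts go to the current day's table only.
CREATE TABLE hits_20040120 (
    hit_time  DATETIME NOT NULL,
    url       VARCHAR(255) NOT NULL,
    client_ip VARCHAR(15) NOT NULL,
    KEY (hit_time)
) TYPE=MyISAM;

-- A MERGE table built over whatever daily tables the query needs, so the
-- complex reporting queries read from here instead of fighting the table
-- that is taking the inserts.
CREATE TABLE hits_report (
    hit_time  DATETIME NOT NULL,
    url       VARCHAR(255) NOT NULL,
    client_ip VARCHAR(15) NOT NULL,
    KEY (hit_time)
) TYPE=MERGE UNION=(hits_20040118, hits_20040119, hits_20040120);

-- Reports hit the MERGE table while the live feed keeps inserting into
-- the newest daily table.
SELECT url, COUNT(*) AS hits
FROM hits_report
WHERE hit_time >= '2004-01-19'
GROUP BY url;

The entry code just creates the new daily table and redefines the UNION list as
it goes.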

P

-----"STE-MARIE, ERIC" <[EMAIL PROTECTED]> wrote: -----

To: "Peter J Milanese" <[EMAIL PROTECTED]>
From: "STE-MARIE, ERIC" <[EMAIL PROTECTED]>
Date: 01/20/2004 02:41PM
cc: [EMAIL PROTECTED]
Subject: Re: Advice needed for high volume of inserts


On January 20, 2004 02:31 pm, Peter J Milanese wrote:
> It'll work.
>
> I do slightly less in the way of inserts. What I do is dynamically generate
> the tables within my entry code, and merge tables based on the query. Good
> for large log parsers. Be aware that this can break badly if it's a
> non-redundant live feed (to MySQL). I think that's a problem anywhere,
> though. MySQL should not hold you back.
> Peter J. Milanese

Thanks, Peter... Out of curiosity, what kind of hardware do you use, and what
kind of I/O do you have?

Thanks again.


