On Mon, Jul 19, 2004 at 10:49:26PM -0400, Stephen Gran wrote:
This one time, at band camp, Michelle Konzack said:
On 2004-07-19 10:01:06, Russell Coker wrote:
On Mon, 19 Jul 2004 05:59, Michelle Konzack [EMAIL PROTECTED] wrote:
Thinking of the expected 50KB/sec download rate I calculated
On Tue, 20 Jul 2004 10:39, Michelle Konzack [EMAIL PROTECTED] wrote:
Other people get 10MB/s. I've benchmarked some of my machines at 9MB/s.
I do not believe it!
http://www.uwsg.iu.edu/hypermail/linux/kernel/9704.1/0257.html
See the above message from David S. Miller [EMAIL PROTECTED]
On Tue, 20 Jul 2004 20:05, Brett Parker [EMAIL PROTECTED] wrote:
(create large file)
[EMAIL PROTECTED]:~$ dd if=/dev/urandom of=public_html/large_file bs=1024 count=5
5+0 records in
5+0 records out
(get large file)
[EMAIL PROTECTED]:~$ wget
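The archive cuts Brett's commands short, so here is a sketch of the same dd + wget approach with a file large enough to give a meaningful rate; the hostname and paths below are placeholders, not from the thread:

```shell
# Create a 10 MiB file of incompressible data (urandom defeats any
# on-the-wire compression that would inflate the measured rate).
dd if=/dev/urandom of=/tmp/large_file bs=1024 count=10240 2>/dev/null

# From another machine, fetch it and read the average transfer rate
# that wget prints when the download finishes (URL is a placeholder):
# wget -O /dev/null http://server.example.org/large_file
```

Running the fetch from a second host keeps the test honest: pulling the file from the serving machine itself only measures loopback.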
This one time, at band camp, Michelle Konzack said:
Thanks for your advice -- seems I have been too chicken-hearted.
Summary: Don't bother with tuning the server and don't even think about
setting up a cluster for something like this - definitely overkill. ;o)
That's what I'll do ;-)
However the 50/150 concurrent requests are a guess (best I can
Henrik Heil [EMAIL PROTECTED] wrote:
However the 50/150 concurrent requests are a guess (best I can get for now)
What do you think is the request-limit with a
Pentium IV 2 GHz, 1GB RAM, 100Mbit, IDE-disk?
Since all your files could be cached in RAM, with a fast webserver
like thttpd a
On 2004-07-18 13:37:03, Henrik Heil wrote:
However the 50/150 concurrent requests are a guess (best I can get for now)
What do you think is the request-limit with a
Pentium IV 2 GHz, 1GB RAM, 100Mbit, IDE-disk?
Thinking of the expected 50KB/sec download rate I calculated a theoretical
On Mon, 19 Jul 2004 05:59, Michelle Konzack [EMAIL PROTECTED] wrote:
Thinking of the expected 50KB/sec download rate I calculated a
theoretical maximum of ~250 simultaneous downloads -- am I right?
With a 100 MBit NIC you can have a maximum of 7 MByte/sec
What makes you think so?
Other
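The figures being traded here are easy to check; this is just back-of-the-envelope shell arithmetic on the numbers from the thread, nothing measured on anyone's machines:

```shell
# 100 Mbit/s expressed in KB/s (1 KB = 1000 bytes, matching the
# 50KB/sec figure used throughout the thread):
link=$((100 * 1000 * 1000 / 8 / 1000))   # 12500 KB/s, i.e. 12.5 MB/s raw

# 250 simultaneous downloads at 50 KB/s each:
demand=$((250 * 50))                     # 12500 KB/s

echo "link: ${link} KB/s, demand: ${demand} KB/s"

# So ~250 downloads at 50 KB/s exactly saturates a raw 100 Mbit link,
# before any Ethernet/TCP overhead. At the ~7 MByte/sec figure quoted
# above, the target rate only holds for about 7000/50 downloads:
echo "at 7 MB/s: $((7000 / 50)) downloads"
```

Which is why the ~250 estimate is a theoretical ceiling: whether real throughput lands nearer 7 MB/s or the 9-10 MB/s others report decides how many users actually stay above 50 KB/s.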
Hello,
please excuse my general questions.
A customer asked me to set up a dedicated webserver that will offer ~30
files (each ~5MB) for download and is expected to receive a lot of
traffic. Most of the users will have cable modems and their download
speed should not drop below 50KB/sec.
My
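The sizing behind this question works out neatly; a rough sketch using only the numbers given in the thread (the 150-request peak is Henrik's own upper guess from later in the discussion):

```shell
# ~30 files of ~5 MB each:
working_set=$((30 * 5))        # 150 MB total

# Peak demand if 150 concurrent downloads all run at the 50KB/sec floor:
peak=$((150 * 50))             # 7500 KB/s, i.e. 7.5 MB/s

echo "working set: ${working_set} MB, peak demand: ${peak} KB/s"

# 150 MB of files fits easily in 1 GB of RAM, so after warm-up every
# request is served from page cache and the IDE disk stops mattering.
```

That is the arithmetic behind the replies below: the working set lives in cache, so the network link, not CPU or disk, is the binding constraint.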
On Fri, Jul 16, 2004 at 08:53:21PM +0200, Henrik Heil wrote:
Hello,
please excuse my general questions.
A customer asked me to set up a dedicated webserver that will offer ~30
files (each ~5MB) for download and is expected to receive a lot of
traffic. Most of the users will have cable
On Fri, 2004-07-16 at 20:53, Henrik Heil wrote:
Hello,
please excuse my general questions.
A customer asked me to set up a dedicated webserver that will offer ~30
files (each ~5MB) for download and is expected to receive a lot of
traffic. Most of the users will have cable modems and
On Sat, 17 Jul 2004 05:42, Skylar Thompson [EMAIL PROTECTED] wrote:
As long as we're not talking about 486-class machines, the processor is not
going to be the bottleneck; the bandwidth is. Multiplying 150 peak users by
50kB/s gives 7.5MB/s, so your disks should be able to spit out at least
On Sat, 17 Jul 2004 10:39, Nate Duehr [EMAIL PROTECTED] wrote:
On Jul 16, 2004, at 1:43 PM, Markus Oswald wrote:
Summary: Don't bother with tuning the server and don't even think about
setting up a cluster for something like this - definitely overkill. ;o)
Unless there's a business
On Jul 16, 2004, at 8:28 PM, Russell Coker wrote:
Installing a single machine and hoping for the best often gives better
results.
I agree in most cases.
One possible better solution that is one step short of creating a
cluster is installing a single machine, and making sure that rock-solid
On Sat, 17 Jul 2004 14:09, Nate Duehr [EMAIL PROTECTED] wrote:
Other good ways to do this include a shared RAID'ed network filesystem
on a central box and two front-end boxes that are load-balanced with a
hardware load-balancer. That gets into the "must be up 24/7" realm, or
close to it. I