So, I should be more specific with my question: to the end of determining strictly which webserver is more efficient, do you see any problems with this type of setup?

More 'efficient', I think you mean.

Is that not exactly what I said?


And another question: how would you do it differently? Sure, in an ideal world I could assemble my own botnet and then blast my corporate network with a gigabit of distributed traffic multiple times for each webserver -- but obviously in the real world that's not going to happen.

The question you have to ask yourself is: what are you actually trying to test? A completely artificial metric with no real-world correlation? If so, then you're on the right track.

Instead of dancing around the issue, can you please provide some suggestions? Or do you just like to be contrary?

---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscr...@httpd.apache.org
  "   from the digest: users-digest-unsubscr...@httpd.apache.org
For additional commands, e-mail: users-h...@httpd.apache.org
