i have a tiny little cgi script (one line of code) that will be
called at least once a minute by at least 1000 users of a piece of
software that i'm developing.
the user can, if they want, set the refresh time to up to an hour,
but let's consider the worst-case scenario for the purpose of this
question.
is there any benefit to creating multiple, identical copies of the
same cgi script and have the software randomly select which one it
will fetch?
something like:
/cgi-bin/myscript01.cgi
/cgi-bin/myscript02.cgi
/cgi-bin/myscript03.cgi
/cgi-bin/myscript04.cgi
/cgi-bin/myscript05.cgi
/cgi-bin/myscript06.cgi
/cgi-bin/myscript07.cgi
/cgi-bin/myscript08.cgi
/cgi-bin/myscript09.cgi
/cgi-bin/myscript10.cgi
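on the client side, the random selection i have in mind would be something like this (just a sketch — the script names match the list above, and the fetch itself is left out):

```python
import random

# the ten identical cgi endpoints listed above
SCRIPTS = ["/cgi-bin/myscript%02d.cgi" % n for n in range(1, 11)]

def pick_script():
    # each client picks one copy at random before fetching,
    # spreading requests evenly across the ten paths
    return random.choice(SCRIPTS)
```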
i see this sort of thing from time to time on the internet, but i've
never heard anyone explain why someone might do it. i always just
assumed the implication was that a single file has some limit on
the number of times it can be opened simultaneously.
i don't mind setting it up like this, but it would of course be
simpler to leave it as a single script.
can someone elaborate on this subject please?
thanks.
- chase
---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.