Actually, I would characterize it as:

Before:

The programmer had no control over the buffer size, and the user of
the code had no way of adjusting the buffer to a particular system.

Currently:

The programmer has control over the buffer size, and the user of the
code can adjust the buffer to a particular system.


Setting the buffer size is better done by the user, not the
programmer. Often the user and the programmer are one and the same, in
which case the programmer knows the environment and can set the
environment variables or change the code, whichever makes better
sense.
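To make the user-side option concrete, here's a minimal Raku sketch.
I'm assuming the default read size can be seeded from an environment
variable (I'm calling it RAKUDO_DEFAULT_READ_ELEMS here; treat the
exact name as my assumption) and that IO::Handle.read with no argument
reads $*DEFAULT-READ-ELEMS bytes per chunk:

    # User runs, for example:
    #   RAKUDO_DEFAULT_READ_ELEMS=131072 raku copy.raku big.iso
    # The program itself says nothing about buffer size:
    sub MAIN(Str $path) {
        my $fh = $path.IO.open(:bin);
        while my $chunk = $fh.read {      # one default-sized chunk per read
            # ... process $chunk ...
        }
        $fh.close;
    }

The program stays silent about the buffer, and the user tunes it from
outside.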

If you're writing code for others to use, then the optimal buffer size
isn't known to you. The programmer can either leave it alone and let
the user set an environment variable; hard-code it to some fixed value
(for example, a multiple of a fixed record size); or have the code
scale the dynamic variable (which is either the default or set via the
user's environment), for example because the code forks N times and
you've noticed it performs better with a buffer 1/Nth the usual size
(purely hypothetical).
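As a sketch of that third option in Raku (the worker count, the 1/Nth
rule, and the 4096 floor are all hypothetical, and I'm assuming
$*DEFAULT-READ-ELEMS can simply be assigned at run time, as the INIT
example quoted below suggests):

    my $workers = 4;                      # hypothetical fork count
    # Scale the process-wide default down before spawning the workers:
    $*DEFAULT-READ-ELEMS = max(4096, $*DEFAULT-READ-ELEMS div $workers);
    # A `my $*DEFAULT-READ-ELEMS = ...` inside a block would instead
    # confine the change to that block's dynamic scope.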

When I hear "Every program that was made under the previous paradigm
now needs to be modified to check the environment to avoid undesired
side effects", what I think is "no, every program that cares can say
INIT $*DEFAULT-READ-ELEMS=65336, thus ignoring the environment. But if
someone gave me a module or program that ignored my wishes, I'd edit
it away."


There are times when you want to ignore the environment - like in Perl
5's taint mode, which, if I recall correctly, clears $ENV{PATH} and a
few other things. But in general, code uses bits of the environment
because the user wants it that way. If the user is fiddling with
buffer size, then the user knows something or is debugging something
about the system which the programmer didn't need to think about.
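For the "ignore the environment" case, a rough Raku analogue (not
taint mode, just explicit scrubbing; the variable and the value are
illustrative):

    # Deliberately override what the caller's environment says about PATH:
    %*ENV<PATH> = '/usr/bin:/bin';        # pin it to a known value
    # ...or drop it entirely:
    # %*ENV<PATH>:delete;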
