On Sat, May 3, 2008 6:04 am, Paul G. Allen wrote:

> Good programming practice (and good security practice) dictates that
> when a variable is instantiated, it is initialized to some value. This
> includes large blocks of memory that might hold some big blob (to use a
> DB term).
>
> So, the way Linux does it is not bad at all, but the way programmers
> fail to initialize the memory as soon as it's allocated *is* bad.
> Allocate the memory and initialize it when it's allocated, not later on
> when you *might* use it. That way, it's there up front, before the long
> computation, and there are no surprises halfway through. (This is why the
> C or C++ compiler warns about uninitialized objects.)
>
>

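Right, and in C that's just the difference between malloc() followed by a
memset() you hope happens later, and zeroing the block the moment you get
it. A minimal sketch (the size and variable names here are made up, purely
for illustration):

    #include <stdlib.h>

    #define BLOB_SIZE (64L * 1024 * 1024)   /* hypothetical blob size */

    int main(void)
    {
        /* Allocate and zero the whole block up front, before the long
           computation, instead of malloc() now and memset() "later, maybe". */
        unsigned char *blob = calloc(1, BLOB_SIZE);
        if (blob == NULL)
            return 1;   /* failure shows up here, not halfway through the run */

        /* ... long computation that fills in parts of blob ... */

        free(blob);
        return 0;
    }

calloc() does the allocate-and-zero in one call, and if it fails you find
out before you've burned an hour of computation.
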
Heh ... just flashing back to long, long ago when a woman on staff at a
place I worked (very briefly) pre-allocated disk space (made sure she had
enough) by writing nulls to a file, one at a time, in a loop. It sure was
initialized! They asked the rest of us if we could punch up the
performance somehow.

Not much we could do. She had a case of ego and was the owner's wife.
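
For the record, the fix itself would have been trivial: write the zeros in
big chunks instead of one at a time. This is a reconstruction for
illustration, not her actual code, and the file name and sizes are made up.
(These days you could also reserve the space without writing anything at
all via posix_fallocate(), but that wasn't on the table back then.)

    #include <stdio.h>

    #define RESERVE_BYTES (10L * 1024 * 1024)   /* made-up reservation size */

    int main(void)
    {
        /* Roughly the byte-at-a-time version: */
        FILE *f = fopen("reserve.dat", "wb");
        if (f == NULL)
            return 1;
        for (long i = 0; i < RESERVE_BYTES; i++)
            fputc(0, f);                 /* one null per call: painfully slow */
        fclose(f);

        /* The same reservation, written in 64 KB chunks: */
        static unsigned char zeros[64 * 1024];  /* static, so already all zeros */
        FILE *g = fopen("reserve.dat", "wb");
        if (g == NULL)
            return 1;
        for (long done = 0; done < RESERVE_BYTES; done += (long)sizeof zeros)
            fwrite(zeros, 1, sizeof zeros, g);  /* may overshoot by part of a chunk */
        fclose(g);

        return 0;
    }

Overshooting by part of a chunk at the end doesn't matter when all you want
is to be sure the space is there.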

-- 
Lan Barnes

SCM Analyst              Linux Guy
Tcl/Tk Enthusiast        Biodiesel Brewer


