On Thu, 4 May 2006, Eric Ziegast wrote:

> I think a few people in the thread gave examples of how hackers and script
> kiddies don't really need compilers on the system to do damage.  Once an
> intruder is on your system, they can download pre-compiled tools to do what
> they need from a similar system where they (or the script kiddies' hacker
> pimps) maintain a compiler and toolbox for the target system.  In such an
> environment, it makes no difference whether the targeted sysadmin puts a
> compiler on their computer or not.  I agree.
> 
> ### Magic numbers helped protect systems
> 
> Some friends of mine with a BSD compile-and-maintain-everything-from-source
> background came up with the idea of using a different magic number for
> binaries when they built their production systems.  The binaries on the
> production systems would all carry the special magic number, and exec would
> require that magic number before running a binary.  If you tried to run a
> program that was compiled elsewhere, it would fail.  They also did cool
> things like embedding stealthy monitoring tools in commonly used programs
> (top, ps, shells), so that their production systems were like fishbowls
> where you could watch inept hackers swim.  Part of the policy of the
> production environment was not to have compilers on the production systems,
> so that hackers couldn't easily build a runnable program.
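> 
> As an illustration, here is a minimal sketch of the kind of check exec
> would perform.  This is not their original implementation, and SITE_MAGIC
> is a made-up value (stock ELF binaries start with "\x7f" "ELF"):
> 
> ```c
> #include <stdio.h>
> #include <string.h>
> 
> /* Hypothetical site-local magic number. */
> #define SITE_MAGIC "\x7f" "SIT"
> 
> /* Return nonzero iff the file carries the site-local magic number. */
> static int magic_ok(const char *path)
> {
>     unsigned char ident[4];
>     FILE *f = fopen(path, "rb");
>     if (f == NULL)
>         return 0;
>     size_t n = fread(ident, 1, sizeof(ident), f);
>     fclose(f);
>     return n == sizeof(ident) && memcmp(ident, SITE_MAGIC, 4) == 0;
> }
> 
> int main(int argc, char **argv)
> {
>     if (argc < 2)
>         return 2;
>     if (!magic_ok(argv[1])) {
>         fprintf(stderr, "%s: bad magic, refusing to run\n", argv[1]);
>         return 1;
>     }
>     puts("magic OK");  /* in the kernel, exec would proceed here */
>     return 0;
> }
> ```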
> 
> A 3l33t hacker might figure out that all he/she had to do was modify the
> magic number to get their program to run, but most people (including
> script kiddies) wouldn't figure it out; they'd give up and move on to
> softer targets.
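> 
> The bypass really is that small.  A sketch of it, again using the made-up
> SITE_MAGIC value from above:
> 
> ```c
> #include <stdio.h>
> 
> /* Stamp the expected site-local magic onto an existing binary. */
> int main(int argc, char **argv)
> {
>     if (argc < 2)
>         return 2;
>     FILE *f = fopen(argv[1], "r+b");
>     if (f == NULL)
>         return 1;
>     fwrite("\x7f" "SIT", 1, 4, f);  /* overwrite the first four bytes */
>     fclose(f);
>     return 0;
> }
> ```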
> 
> This is one historical case where I think not having compilers made the system
> more secure, but it's not your standard system.

I am not impressed by any system that achieves its results through some
variation of security by obscurity. There are many variations on your
theme (changing syscall numbers, for example), but they all cause a
maintenance nightmare.

> 
> ### Signed binaries
> 
> To kick it up a notch, one might replace the magic number hack with a
> mechanism that checks a hash or signature of a binary against a trusted
> source before the binary is allowed to run.  One would keep the
> signing-capable compiler away from the production systems and distribute
> signed binaries to them.  A program might start more slowly, but that's a
> trade-off a paranoid sysadmin might be willing to make.
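> 
> A minimal sketch of the verification half, assuming OpenSSL for the
> digest.  The fingerprint-as-argument interface and the refusal policy
> here are made up; a real design would also verify a signature over the
> fingerprint list itself:
> 
> ```c
> #include <stdio.h>
> #include <string.h>
> #include <openssl/evp.h>
> 
> /* Compute the SHA-256 digest of a file. */
> static int sha256_file(const char *path, unsigned char out[32])
> {
>     unsigned char buf[8192];
>     size_t n;
>     FILE *f = fopen(path, "rb");
>     if (f == NULL)
>         return -1;
>     EVP_MD_CTX *ctx = EVP_MD_CTX_new();
>     EVP_DigestInit_ex(ctx, EVP_sha256(), NULL);
>     while ((n = fread(buf, 1, sizeof(buf), f)) > 0)
>         EVP_DigestUpdate(ctx, buf, n);
>     EVP_DigestFinal_ex(ctx, out, NULL);
>     EVP_MD_CTX_free(ctx);
>     fclose(f);
>     return 0;
> }
> 
> int main(int argc, char **argv)
> {
>     /* argv[1]: binary; argv[2]: expected lowercase hex digest,
>      * distributed from the trusted build host. */
>     unsigned char md[32];
>     char hex[65];
>     if (argc < 3 || sha256_file(argv[1], md) != 0)
>         return 2;
>     for (int i = 0; i < 32; i++)
>         snprintf(hex + 2 * i, 3, "%02x", md[i]);
>     if (strcmp(hex, argv[2]) != 0) {
>         fprintf(stderr, "%s: fingerprint mismatch, refusing to run\n",
>             argv[1]);
>         return 1;
>     }
>     puts("fingerprint OK");
>     return 0;
> }
> ```
> 
> (Compile with -lcrypto.  Distributing and protecting the fingerprints
> is where the real work is.)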
> 
> It'd make a good Usenix paper (unless someone's already done it - anyone?).
> A Google search shows that FreeBSD "updates" are signed, but I don't see
> anything about the binaries themselves.  Over in Linux land, there are
> cryptographically signed kernel modules.  I hear murmurs about Microsoft
> having the system verify DRM before running programs someday.

I don't think you did your research very well. NetBSD has veriexec, for
example.  Java also has a very elaborate class signing mechanism: a kind
of built-in systrace, where the key used to sign a class determines which
operations are allowed and which are not.

Now I have quite extensive experience with Java and signed classes,
and I would say that anybody proposing a signed executable mechanism
without providing a description of how to do key management is wasting
his time.

Key management is the most important part: the part that will
continuously require time and attention from a lot of people, the part
that will cause the headaches, and the part where the errors will be
made.  System managers experiencing problems and needing to get systems
up and running will find ways to "make it work" and, as a result, kill
the protection.

        -Otto
