On Monday 28 July 2008 09:47:29 am Michael Torrie wrote:
> Alberto Treviño wrote:
> > "Security by obscurity" is a huge misnomer.  ALL forms of security must
> > rely on obscurity.  Even the venerable SSL standard and its
> > corresponding protocols rely on a SECRET or private key for their
> > security.  If that "obscure" bit of data is leaked, SSL security is
> > brought to naught.
>
> No, this is not a correct analogy.  SSL is based on the idea that the
> public key is known to the world and not secret.  The SSL private key
> may indeed be an "obscure" secret, but for SSL to function the private
> key need not be revealed at all, ever.  See the difference?

What you are describing is the difference between Symmetric Cryptography and 
Asymmetric Cryptography (or Public-key Cryptography).  Both of them require 
a secret.  The only difference is who needs to know the secret.  By having 
public/private key pairs, asymmetric cryptography makes it easier to manage 
the secrets, because clients use the public key to encrypt messages instead 
of the private key.
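The split can be sketched with a toy RSA example.  The numbers below are the 
tiny textbook ones (hopelessly insecure, chosen only for readability) and the 
point is just to show which key does what:

```python
# Toy RSA -- illustrative only, never use key sizes like this.
p, q = 61, 53
n = p * q                  # 3233; part of both the public and private key
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent; the public key is (n, e)
d = pow(e, -1, phi)        # private exponent; the private key is (n, d)

msg = 65
cipher = pow(msg, e, n)    # anyone in the world can encrypt with (n, e)
plain = pow(cipher, d, n)  # only the holder of d can decrypt
assert plain == msg
```

Everyone may know (n, e); the security rests entirely on d staying obscure.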

> With security
> by obscurity, you're relying on the fact that only the party you wish to
> communicate knows some obscure secret configuration or sequence.  But
> the other party has to know it in order to communicate.

That was exactly my point.  The term "Security by Obscurity" is used to 
describe poorly designed security kept secure because the design and flaws 
are hidden from the public eye.  Its counterpart is "Security by Design" 
which states that even if you know exactly how the system works, you can't 
break it unless you know the *key* elements which make it work (i.e., the 
secrets, whether symmetric or asymmetric).

However, the term "Security by Obscurity" also implies that anytime you keep 
something obscure (or secret) you are only pretending to be secure.  In my 
opinion, that is not the case.  If you don't hold a secret and everything is 
public, then there is no security.

> If we are to
> use your analogy, then security by obscurity is requiring a shared key
> that multiple parties may or may not know.  With PKI, it doesn't matter
> that the entire world knows your public key, because as long as you know
> your third party's public key, you can encrypt a message that is
> guaranteed to only be readable by him or her, and is guaranteed to be
> from you.  That is the basis of a secure algorithm.  Security really
> means that even if you know the mechanism and can see the communication,
> you cannot intercept it.

And why can't you intercept it?  Because you don't know the key being used 
to encrypt.  Yes, the client uses the public key to encrypt the message.  
But what good is an encrypted message that can never be decrypted?  That's 
where the (secret and obscure) private key comes in.  During the TLS 
handshake, the client and server negotiate the "master secret" that protects 
the rest of the session.
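That negotiation boils down to key agreement.  A toy Diffie-Hellman exchange 
(tiny numbers, no authentication, illustration only -- real TLS adds 
certificates and much larger parameters) shows how both sides arrive at the 
same secret even though an eavesdropper sees every message:

```python
# Toy Diffie-Hellman -- illustrative only.
p, g = 23, 5                  # public parameters, known to everyone
a, b = 6, 15                  # each side's private (obscure!) value

A = pow(g, a, p)              # client sends A over the wire
B = pow(g, b, p)              # server sends B over the wire

shared_client = pow(B, a, p)  # client derives the shared secret
shared_server = pow(A, b, p)  # server derives the same secret
assert shared_client == shared_server
```

An attacker who captures A and B still lacks a or b, so the shared secret 
stays obscure -- which is exactly the point: the mechanism is public, the 
secret is not.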

> Security by obscurity is an attempt to hide
> something that is fundamentally insecure such that the system appears,
> fraudulently, to protect users and data.

Yes, that is the correct definition of "Security by Obscurity".  However, it 
is a misconception that any time you apply obscurity to make something 
secure you are not truly providing security.  All security has to rely on 
some form of obscurity.  How secure something really is depends on how 
difficult it is to extract the secret.  The point I wanted to make is that 
adding extra obscurity to a process (like changing a port number) provides 
an extra (although somewhat weak) level of security.
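A back-of-the-envelope comparison (the 128-bit figure below is an assumed 
typical session-key size) shows why that extra layer is weak but not zero: a 
hidden port number is simply a very small secret.

```python
import math

# How much "secret" does each hiding place actually hold?
port_bits = math.log2(65536)  # a nonstandard port adds at most ~16 bits
key_bits = 128                # a typical modern symmetric session key

# 2**16 ports can be scanned in minutes; 2**128 keys cannot be searched.
print(port_bits, key_bits)
```

So moving sshd off port 22 is real obscurity, just sixteen bits' worth of it.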

-- 
Alberto Treviño
BYU Testing Center
Brigham Young University

--------------------
BYU Unix Users Group 
http://uug.byu.edu/ 

The opinions expressed in this message are the responsibility of their
author.  They are not endorsed by BYU, the BYU CS Department or BYU-UUG. 
___________________________________________________________________
List Info: http://uug.byu.edu/mailman/listinfo/uug-list
