On Tue, Jul 13, 1999 at 03:01:58PM -0500, Bruno Wolff III wrote:
> On Tue, Jul 13, 1999 at 01:41:19PM -0400,
>   "Adam D . McKenna" <[EMAIL PROTECTED]> wrote:
> > 
> > I seriously doubt that a majority of users will be using public key
> > encryption anytime soon.  Encryption went from being something hard to use to
> > something you have to pay to use.  Only the users that demand secure e-mail 
> > will be using encryption.
> 
> PGP for personal use has been free for a long time. RFC 2015 has been
> around for a few years as well. There is also an OpenPGP standard
> that is nearing finalization. The main reason we aren't already seeing
> most people using real encryption in their email is that the US Government
> is discouraging it, so they can continue to easily read people's email.

How are they "discouraging it"?  I haven't gotten any notices in the mail
from the government saying "please don't use strong encryption".  All of the
major e-mail clients have a way of integrating strong encryption; it's just
that the two most popular (Netscape and Outlook) support only S/MIME, not
PGP, and S/MIME requires buying a certificate.  Also, you hit upon the key
words: "personal use".  Business e-mail is not a personal use, and I think
the people on this list who are concerned with scanning e-mail are much more
concerned with scanning business e-mail than personal e-mail.
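
For anyone who hasn't looked at RFC 2015, it just wraps the PGP output in
standard MIME parts, so any MIME-capable client could support it.  A
simplified sketch of the message layout (boundary shortened, body elided):

    Content-Type: multipart/encrypted; boundary="bar";
        protocol="application/pgp-encrypted"

    --bar
    Content-Type: application/pgp-encrypted

    Version: 1

    --bar
    Content-Type: application/octet-stream

    -----BEGIN PGP MESSAGE-----
    ...
    -----END PGP MESSAGE-----
    --bar--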

> > > In the shorter run, viruses will be developed that use a simple encryption
> > > each time they transmit themselves, keeping the fixed part of the
> > > virus small in order to make virus detection more difficult. They may
> > > also use a number of variant codes to do the decryption part so that even
> > > that may vary with each copy.
> > 
> > There are already many variants of many common viruses.
> 
> We are talking about 'many's that are orders of magnitude apart. With
> encryption, each copy of a virus will be different. There will have to
> be a small, relatively constant part, but even that can be given a large
> amount of variability by having alternate code that does the same thing for
> small pieces of the bootstrap part of the program. This is a lot different
> than having just a few thousand viruses to check for.

Yes, scanning engines are going to have to get smarter and smarter to
maintain their usefulness.  Is there a point I'm missing?
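
To make the problem concrete, here is a toy sketch (Python, with a harmless
string standing in for a virus body, and names that are mine rather than
anything a real scanner uses) of why per-copy encryption breaks
fixed-signature scanning:

    import random

    BODY = b"HARMLESS-STAND-IN-FOR-VIRUS-BODY"

    def encode_copy(body: bytes) -> bytes:
        """Emit a new 'copy': a random nonzero XOR key followed by the
        encoded body.  Only a small decoder stub would stay constant, and
        even that can vary by substituting equivalent code sequences."""
        key = random.randrange(1, 256)
        return bytes([key]) + bytes(b ^ key for b in body)

    def naive_scan(data: bytes, signatures: list[bytes]) -> bool:
        """Classic signature scan; note the cost also grows with the
        number of signatures it has to check."""
        return any(sig in data for sig in signatures)

    signatures = [BODY[:16]]   # a fixed signature taken from the plain body
    copies = [encode_copy(BODY) for _ in range(5)]
    print([naive_scan(c, signatures) for c in copies])   # all False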

> > > Another problem is that virus checking is going to take more and more time
> > > as the number of viruses that have ever been written increases. Virus
> > > scanning just can't work in the long run.
> > 
> > How do you propose viruses be detected, then?  What will "work in the long
> > run"?  I suppose we should just ask the malicious hackers out there to
> > "stop" making and distributing viruses.
> 
> What will work in the long run is real security, such as capability systems.
> In the short run, teaching people not to run programs given to them
> by people who are either clueless or untrustworthy is a good start.

Yes, but it's not realistic.  No matter what you tell someone, if their
best friend sends them an email with an executable in it saying "this is
cooool!!!!", the person is probably going to run it.

Perhaps you could explain what a "capability system" is.

> > > The other question is why this is being done on the mail server instead of
> > > on the end-user machines, where there is likely to be a lot of underused
> > > CPU power.
> > 
> > Where I work we run VirusScan on the workstations and NetShield on the
> > servers.  Guess what, the servers catch way more viruses than the
> > workstations do.  Why?  Because it's a hell of a lot easier to upgrade 10
> > servers than it is to upgrade 800 workstations every time there is an update
> > from McAfee.  Yes, we could start AutoUpdate on every workstation if we had
> > the manpower.  But there will always be some machines that fall through the
> > cracks.
> 
> The antivirus people need to improve the way they do things. Viruses are
> spreading much faster now than they used to, and having people go and
> look for a new update once a week or so isn't good enough.
> Probably the best solution is a distributed one, where information is
> pushed to a local server when there is a change and all local machines
> check with that server for updates every time they are about to do a scan.
> 

Yes, we do that here, but like I said, all of the local machines need to be
configured to use this repository.  They also have to be equipped with the
latest software.  This is all well and good until the VP calls at 7pm on a
Friday saying he needs a laptop because he's leaving in 2 hours for a
conference.  (Yes, this has happened to me.)
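
For what it's worth, the client side of that check-before-scan scheme is
tiny.  A rough sketch (Python; the server address and file paths are
hypothetical, not anything McAfee ships):

    import urllib.request

    SERVER = "http://avserver.internal"      # hypothetical local repository
    VERSION_FILE = "/var/lib/av/version"     # hypothetical local state
    SIGNATURE_FILE = "/var/lib/av/signatures"

    def installed_version() -> str:
        """Version of the signature set already on this machine."""
        try:
            with open(VERSION_FILE) as f:
                return f.read().strip()
        except FileNotFoundError:
            return ""

    def update_if_needed() -> None:
        """Called at the top of every scan, so a machine is never more
        than one check behind the local server."""
        latest = urllib.request.urlopen(SERVER + "/version").read().decode().strip()
        if latest != installed_version():
            data = urllib.request.urlopen(SERVER + "/signatures").read()
            with open(SIGNATURE_FILE, "wb") as f:
                f.write(data)
            with open(VERSION_FILE, "w") as f:
                f.write(latest)

Of course, that still leaves the bootstrapping problem above: something like
this has to be installed and pointed at the right server on every machine
first.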

--Adam
