Re: [Cryptography] The Case for Formal Verification

2013-09-24 Thread Derek Jones
Tim,

> With all due respect, most of the points you make are ridiculous.

Could you please explain why you think they are ridiculous?

> For example, you point out that the certified C compiler will not
> make any guarantees about code that relies on undefined behavior.
> Well, of course! Being certified doesn't magically fix your specification!
> Certified just says the implementation matches the specification!

Where did the word "certified" come from?  I have not said anything
about magically fixing a specification.

The CompCert people are claiming a formally verified compiler, and I
think this claim is very misleading, to say the least.

> And to suggest that such a project is misguided because it places
> blind trust in coq (and because it is written in ocaml?!) shows a

I did not say that this project was misguided.

> misunderstanding of the proof tools. There is a very small core of
> trust that you need to have faith in and it is backed by solid theory

Yes, the axioms.  These need to be small in number and 'obviously'
correct; something that cannot be said of the source code of coq.
The nearest I can think of is the Lisp subset written in itself in
under 100 lines (my memory is vague on this point).

> and is a much more solid foundation for building on than just about
> any other software we have in computer science. I don't see how in
> any way you can compare the f2c translator to this effort.

Why not?  What work has been done on the coq+other tools that has not
been done on f2c?

I describe work that was done to try and show the correctness of
a Model Implementation for C here:
http://shape-of-code.coding-guidelines.com/2011/05/02/estimating-the-quality-of-a-compiler-implemented-in-mathematics/

[My original post to this list bounced, so I am reposting it here now;
it came before the message it is replying to.]


___
The cryptography mailing list
cryptography@metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography


Re: [Cryptography] The Case for Formal Verification

2013-09-22 Thread Tim Newsham
> The CompCert people are claiming a formally verified compiler and I
> think this claim is very misleading to say the least.

I don't find it misleading at all and I think perhaps the
confusion is your notion of what formally verified means.

>> And to suggest that such a project is misguided because it places
>> blind trust in coq (and because it is written in ocaml?!) shows a
>> misunderstanding of the proof tools. There is a very small core of
>> trust that you need to have faith in and it is backed by solid theory
>
> Yes, the axioms.  These need to be small in number and 'obviously'
> correct; something that cannot be said of the source code of coq.
> The nearest I can think of is the Lisp subset written in itself in
> under 100 lines (my memory is vague on this point).

What I am saying is that you do not need to trust the coq code.
There is actually very little of the coq tool that you do need to
trust. That is the part of coq that performs type checking. The
type checker is what verifies proofs.  It is a small part of coq and
most of coq is not trusted and generates terms that still must
pass the type checker. Neither do you need to put a lot of faith
in the ocaml compiler, the cpu or spend a lot of time worrying about
cosmic rays flipping the bits during computation.  Yes, some flaw
could potentially lead to an incorrect computation on a certain cpu
or a certain memory bit in a certain machine during a certain computation.
But the coq type checker has been compiled on many machines and
with many versions of the ocaml library, and the odds of a random error
behaving in a way that just happens to validate an invalid proof term
are minuscule (as in, I would be more worried about you spontaneously
tunneling into my living room due to quantum effects to resume the rest
of this conversation).

There is a small part of coq that you DO have to trust. If this part
is incorrect it could invalidate all of the proofs. However, this part
of coq is backed by strong theory and has had many eyes on it.
And while it is possible that an error here could lead to a bad proof,
it is by no means assured.

> Derek M. Jones  tel: +44 (0) 1252 520 667

-- 
Tim Newsham | www.thenewsh.com/~newsham | @newshtwit | thenewsh.blogspot.com


Re: [Cryptography] The Case for Formal Verification

2013-09-22 Thread Tim Newsham
> The claim of CompCert being a formally verified C compiler is wildly 
> overstated:
> http://shape-of-code.coding-guidelines.com/2013/03/10/verified-compilers-and-soap-powder-advertising/

With all due respect, most of the points you make are ridiculous.
For example, you point out that the certified C compiler will not
make any guarantees about code that relies on undefined behavior.
Well, of course! Being certified doesn't magically fix your specification!
Certified just says the implementation matches the specification!

And to suggest that such a project is misguided because it places
blind trust in coq (and because it is written in ocaml?!) shows a
misunderstanding of the proof tools. There is a very small core of
trust that you need to have faith in and it is backed by solid theory
and is a much more solid foundation for building on than just about
any other software we have in computer science. I don't see how in
any way you can compare the f2c translator to this effort.

-- 
Tim Newsham | www.thenewsh.com/~newsham | @newshtwit | thenewsh.blogspot.com


Re: [Cryptography] The Case for Formal Verification

2013-09-19 Thread Derek Jones
Perry E. Metzger  piermont.com> writes:

> CompCert is a fine counterexample, a formally verified C compiler:
> http://compcert.inria.fr/
> It works by having a formal spec for C, and a formal spec for the
> machine language output. The theorem they prove is that the

The claim of CompCert being a formally verified C compiler is wildly overstated:
http://shape-of-code.coding-guidelines.com/2013/03/10/verified-compilers-and-soap-powder-advertising/

It is a good start, but they still have lots of work to do.



Re: [Cryptography] The Case for Formal Verification

2013-08-30 Thread Jerry Leichter
On Aug 29, 2013, at 7:00 PM, Phillip Hallam-Baker wrote:
> ...The code synthesis scheme I developed was an attempt to address the 
> scaling problem from the other end. The idea being that to build a large 
> system you create a very specific programming language that is targeted at 
> precisely that class of problems. Then you write a back end that converts the 
> specification into code for that very restricted domain. If you want a formal 
> proof you have the synthesizer generate it from the specification as well. 
> That approach finesses the problem of having to validate the synthesizer 
> (which would likely take category theory) because only the final target code 
> need be validated
Many years back, I did some work with Naftaly Minsky at Rutgers on his idea of 
"law-governed systems".  The idea was that you have an operational system that 
is wrapped within a "law enforcement" system, such that all operations relevant 
to the "law" have to go through the enforcer.  Then you write "the law" to 
specify certain global properties that the implementation must always exhibit, 
and leave the implementation to do what it likes, knowing that the enforcer 
will force it to remain within the law.

You can look at this in various ways in modern terms:  As a generalized 
security kernel, or as the reification of the attack surface of the system.

Minsky's interests were more on the software engineering side, and he and a 
couple of grad students eventually put together a law-governed software 
development environment, which could control such things as how modules were 
allowed to be coupled.  (The work we did together was on an attempt to add a 
notion of obligations to the law, so that you could not just forbid certain 
actions, but also require others - e.g., if you receive message M, you must 
within t seconds send a response; otherwise the law enforcer will send one for 
you.  I'm not sure where that went after I left Rutgers.)

While we thought this kind of thing would be useful for specifying and proving 
security properties, we never looked at formal proofs.  (The law of the system 
was specified in Prolog.  We stuck to a simple subset of the language, which 
could probably have been handled easily by a prover.  Still, hardly transparent 
to most programmers!)
-- Jerry



Re: [Cryptography] The Case for Formal Verification

2013-08-29 Thread Phillip Hallam-Baker
On Thu, Aug 29, 2013 at 4:46 PM, Perry E. Metzger wrote:

> Taking a break from our discussion of new privacy enhancing protocols,
> I thought I'd share something I've been mumbling about in various
> private groups for a while. This is almost 100% on the security side
> of things, and almost 0% on the cryptography side of things. It is
> long, but I promise that I think it is interesting to people doing
> security work.
>

Whit Diffie was meant to be working on formal methods when he came up with
public key crypto...

My D.Phil thesis was on applying formal methods to a large, non-trivial
real-time system (raw input bandwidth was 6 Tb/sec, the immediate
forerunner to the LHC data acquisition scheme). My college tutor was Tony
Hoare, but I was in the nuclear physics dept because they had the money to
build the machine.

The problem I saw with formal methods was that the skills required were
already at the limit of what Oxford University grad students were capable
of, and building systems large enough to matter looked like it would take
tools like category theory, which are even more demanding.

The code synthesis scheme I developed was an attempt to address the scaling
problem from the other end. The idea being that to build a large system you
create a very specific programming language that is targeted at precisely
that class of problems. Then you write a back end that converts the
specification into code for that very restricted domain. If you want a
formal proof you have the synthesizer generate it from the specification as
well. That approach finesses the problem of having to validate the
synthesizer (which would likely take category theory) because only the
final target code need be validated.


That is the code I re-implemented in C after leaving VeriSign and released
onto SourceForge earlier this year, and the tool I used to build the JSON
Schema tool.

I would probably have released it earlier only I met this guy at CERN who
had some crazy idea about hypertext.


-- 
Website: http://hallambaker.com/

[Cryptography] The Case for Formal Verification

2013-08-29 Thread Perry E. Metzger
Taking a break from our discussion of new privacy enhancing protocols,
I thought I'd share something I've been mumbling about in various
private groups for a while. This is almost 100% on the security side
of things, and almost 0% on the cryptography side of things. It is
long, but I promise that I think it is interesting to people doing
security work.


When I was a student the first time, in the early to mid 1980s, formal
verification was clearly a dead end that would never get anywhere. A
boss of mine once asserted (circa 1988) that there would never be a
verified program that did anything terribly interesting, and at the time
he seemed right.

Today, there is a formally verified microkernel called seL4, a
formally verified C compiler called CompCert, a formally verified
experimental web browser called Quark, and lots of other stuff, much
of which I doubtless don't even know about.

_Things have changed_.

Much of what has changed is proof technology, and it is a
technology. The tools for doing formal verification are now, for the
first time, just barely usable for real work on interesting programs,
and getting better all the time. Over the last twenty-five years, we
figured out a lot of stuff people didn't know before about how to
write verification tools and how to verify programs, and the results
have been impressive.


There are usually several arguments against formal verification:

1) We don't know what to specify, so what help does proving a buggy
specification do us?

2) Who would bother writing a proof vastly larger than their program?

3) You can't prove programs of interesting size anyway.

So, taking these in reverse order:

For 3 ("you can't prove anything big enough to be useful!"), the Quark
project:
http://goto.ucsd.edu/quark/
showed you don't need to prove a program of interesting size. You can
defend millions of lines of buggy code with a "software firewall" made
of formally verified code. Verify the right thousand lines of code
that the rest needs to use to talk to anything else, and you have very
strong security properties for the rest of the code. seL4 and CompCert
are clearly also quite useful programs.

For 2 ("Who would bother with all that work?"), we have libraries in
daily use like sqlite:
https://www.sqlite.org/testing.html
where the system has a fairly small amount of production code and
literally 1000 times more lines of test code than production code. If
you're willing to write ninety million lines of test to defend ninety
thousand lines of code, formal verification is totally doable.

Sure, it might not be worth it for throw away code or for your new
video game or conference room scheduler where failure isn't a big
deal, but it is *very* clear why you would want to do this for
foundational code of all sorts.

For 1 ("We'll never write a correct spec anyway, so what good is the
proof?"), I think we've been suffering from sour grapes. We didn't
have the ability to prove big things anyway, so why not tell ourselves
that there's nothing interesting and large we could prove that would
be worth proving?

CompCert is a fine counterexample, a formally verified C compiler:
http://compcert.inria.fr/
It works by having a formal spec for C, and a formal spec for the
machine language output. The theorem they prove is that the
compilation process preserves observational equivalence between the
behavior of the C program and the output, which, given correct
notation, is a very small theorem to write down.
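A simplified form of that theorem (my paraphrase, eliding CompCert's actual treatment of nondeterminism and "going wrong" behaviors):

```latex
% Simplified semantic-preservation statement (a paraphrase, not
% CompCert's exact theorem): if compilation succeeds, every observable
% behavior of the target program C is a behavior of the source S.
\forall S\, C.\;\ \mathrm{compile}(S) = \mathrm{OK}(C)
  \;\Rightarrow\; \forall b.\;\ C \Downarrow b \;\Rightarrow\; S \Downarrow b
```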

You might claim "so what, it is probably actually buggy as hell, the
spec probably isn't really correct anyway, etc." -- except when John
Regehr's group built tools to torture test C compilers, the only
compiler they did *not* find bugs in was CompCert. They found hundreds
of bugs in every other compiler tested. (They did eventually find one
bug attributed to CompCert, but arguably it was a bug in a Linux
include file, not in the CompCert compiler.)

Similarly, one might claim "there is no way to formally specify a web
browser that won't be just as buggy as the web browser!", but Quark's
formal verification doesn't try to show that the entire Web browser is
correct, and doesn't need to -- it shows that some insecure behaviors
are simply impossible. *Those* are much simpler to describe.

Certainly there may be other properties that turn out to be important
that no one has considered yet. However, unlike testing, if people
discovered a hole in the set of theorems being proven -- some property
that was important but which had not been considered -- then that
could be added to what was proved, and _then the problem would be gone
forever_. Verification means you get actual incremental progress that
you can trust. Testing is much less powerful. (Furthermore, future
systems can learn from what you did and add the needed theorem to what
they prove about their own system.)

I don't think the technology is up to proving huge systems correct --
a fairly unambitious C compiler or a microkernel is the current limit
-- but it is