Aleph1 wrote:
> A flaw in the standard not on the stack. RFC 1122 "Requirements for
> Internet Hosts -- Communication Layers" covers this issue although
> without pointing out its security consequences.

In the case that a host is not routing, it is abundantly clear that the
strong model is the only correct one. Similarly, I would argue that in the
case that a host is routing, the weak model is clearly correct. In more
complex cases, one should use packet filtering to enforce requirements.

You'll note that RFC 1122 is completely silent on the difference between
routing and non-routing hosts, which makes it so broken as to seem almost
irrelevant on this issue.

But the really important point in our advisory, which is independent of
these considerations, is that packets can be delivered to localhost, err,
remotely.

RFC 1122 is actually somewhat flawed in this respect, too, in that it
makes it a requirement that 127/8 packets should not appear on the
network, without specifying the behaviour of hosts upon receipt of such an
illegal packet (which, by the default "be liberal in what you accept"
rule, they should handle gracefully, rather than sensibly discarding).

But the bottom line is this: regardless of RFCs, it is clear how the
systems should be expected to behave, and if they don't behave that way,
they are broken. If the RFCs also make it "legal" for them to be broken,
then they, in turn, are also broken.

So, yes, RFC 1122 "covers" this issue, but so hopelessly that it is hardly
worth considering. I have to wonder what the authors were thinking of!

Cheers,

Ben.

--
http://www.apache-ssl.org/ben.html

"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff

ApacheCon 2001! http://ApacheCon.com/
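[Postscript, for illustration only: the check being argued for above -- that a host or packet filter should discard loopback-addressed packets arriving from the network -- can be sketched roughly as follows. This Python sketch is not from the original post; the function name, interface naming convention ("lo"), and address-string inputs are all assumptions.]

```python
import ipaddress

# RFC 1122 reserves this block for internal host loopback; such
# addresses should never legitimately appear on the wire.
LOOPBACK_NET = ipaddress.ip_network("127.0.0.0/8")

def should_drop(src: str, dst: str, iface: str) -> bool:
    """Return True if a packet claiming a 127/8 source or destination
    arrives on a non-loopback interface -- i.e. the sensible behaviour
    is to discard it rather than deliver it "gracefully"."""
    if iface == "lo":
        # Genuine loopback traffic never leaves the host; allow it.
        return False
    src_ip = ipaddress.ip_address(src)
    dst_ip = ipaddress.ip_address(dst)
    return src_ip in LOOPBACK_NET or dst_ip in LOOPBACK_NET

# A spoofed packet addressed to 127.0.0.1 arriving on eth0 is dropped:
print(should_drop("10.0.0.1", "127.0.0.1", "eth0"))   # True
# Real loopback traffic on lo passes:
print(should_drop("127.0.0.1", "127.0.0.1", "lo"))    # False
```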