In October 2001, we discussed a security bug policy for mozilla.org, which resulted in the current policy. I was quite unhappy with the policy; the worst problems are listed in the attached post. I have also included Mitch's reply.

However, the policy very much reflected Netscape's interests, probably because Netscape was such a big contributor back then and employed the security module owner.

As I understood from later private comments, I wasn't alone in my opinion even within mozilla.org, and definitely not at large, although I was pretty much alone in the public discussion. The secrecy with which we handle bugs may have encouraged Linux distributors and other vendors to be incredibly careless about updating the browser (Debian stable still ships Mozilla 1.0.0 (!!!), with more holes than swiss cheese stolen by a bunch of mice). There has also been public punishment for that; see the attached mail.

The policy isn't working. Some problems and facts:
  • Users have no idea about the security of their browser; most assume there are no holes.
  • There is no announce mailing list for critical security problems and fixes to alert users.
  • Public security bug lists are generally not current, due to neglect and lack of process, and per policy they only list *fixed* bugs anyway.
  • Bugs known (to mozilla.org) but unfixed are in Bugzilla, but per policy they can't be seen by normal mortals and most developers. The same goes for fixed ones, until a bigger release is out and 'important distributors' (which used to be a codename for 'Netscape') have issued updates.
  • By my observation, the ratio of critical, hidden bugs to security bugs reported by the press is about 10:1.
  • The known, hidden security bugs are usually not fixed in a timely manner (contrary to assertions by Mitch during the policy discussion, IIRC). Some critical ones rotted for years until they were driven out. There are currently 59 hidden, unfixed bugs. The oldest one by far, a spoofing bug, is from 1999; none are from 2000/2001; about 40% are from 2002; 90% are from 2003 or earlier.
  • Bugs reported on by the press are usually fixed quickly (in Bugzilla, then CVS), often within 1-2 days.
  • The releases recommended by mozilla.org are usually *not* supported with security fixes and thus contain known security bugs later in their lifetime. New versions (1.x.1) of the browser are released only for the most high-profile bugs reported widely by the press, and even then often with a huge delay.
  • Meanwhile, the situation seems to have improved, so much so that mstoltz could announce in (IIRC) September that no more unfixed, critical security bugs are known (including hidden ones). I confirmed that (back then), with a few open questions.
I would define "critical" as allowing arbitary code execution (allowing full access and control of computer) or reading local files (allowing full access to all data on computer). Maybe also same-origin bugs (allowing to use login to online banking site from a third party site).


So, given that Netscape is no more, can we use full disclosure now?

In case that isn't accepted, and partially in addition to it, I propose the following changes to the policy and procedure:
  • Response team: At any given time, somebody from the security group is responsible for evaluating and handling incoming critical security bug reports via Bugzilla and email, within hours or at most one day, and for adding that information to the Bugzilla bug. This includes
    • assessing the severity
      • worst-case threat
      • mitigating factors preventing exploits
      • potentially vulnerable userbase
    • reproducing the bug
    • possible cause and culprit
    • workaround
    • assigning the bug to a developer and alerting him (preferably by phone)
    • writing an announcement to the public mailing list (see below)
    Stuff like "I think I found a security bug: ZoneAlarm reports about Mozilla opening ports" can be ignored ;-).
  • Assignee: In the past, security bugs were often assigned to mstoltz, the security module owner, I think even when the bug wasn't in his code. I'd propose that bugs are generally assigned to the developer who caused them, requiring him to do no other Mozilla development until the bug is completely fixed. Security bugs are far worse than breaking the tree! This puts pressure on developers to fix security bugs quickly and to prevent them from appearing in the first place. As for the caps module, we'd have to find a solution to distribute the workload (and knowledge, if needed).
  • Checkin: Review requirements for critical and immediate security bugs are much more relaxed than the normal criteria (the code can hardly get worse - especially no nits about style, speed, etc.), to allow fast checkin of available patches, but reviews should still carefully check the correctness of the fix from a security standpoint. A more thorough review against the normal coding standards can happen later, after checkin. Compare burning trees - no formal reviews are needed there either. Any checkin limitations (freeze etc.) are lifted.
  • Fire drill: Once a week, the list of security bugs is reviewed for any critical bugs, and pressure is put on the assignees of unfixed, critical bugs. No releases are made with known critical security bugs (I think that's feasible now, given that we should be at or close to zero critical bugs).
  • Forced disclosure: If a bug is left unfixed for a certain time (e.g. 2 weeks) without a really good reason ("developer was busy" is none; "extraordinarily hard to fix" is one), the Bugzilla bug is disclosed. This encourages fast fixes, prevents non-critical bugs from rotting in secrecy, and removes most of the blame we get for hiding bugs.
  • Release team:
    • Official releases are kept on the official release build machine as a full build tree, so that security patches can be quickly applied and depend-compiled, and minimal updates in the form of a single DLL/jar in an XPI can be created.
    • Skeleton XPIs for DLLs and for jars can be created (simple, I can do that), and it should be simple to fill them with a single file (see the install.js sketch after this list).
    • An update page similar to what I use at Beonex (automatically determining the right XPIs based on browser version and platform, see [1] and *; a rough sketch of the detection logic follows the footnotes below) can be used to give users an easy and quick 2-click update with all applicable XPIs at once. (I can create that page as well.)
    • The Update Notification feature (Prefs|Advanced|Software Installation) is used (in addition to the announce mailing list) to alert users about the XPIs.
    • Full builds with the fixes are pushed, with a version number 1.x.x.n, where 1.x.x is the version used generally to identify a release and n is a build number used only in the filenames and on the install page. This avoids new users having to install XPIs right after a fresh install, or missing the updates entirely, while keeping the workload for mozilla.org low.
    • Testing coverage on these security releases is only minimal and basic, to be done by the release engineer within an hour, not days or weeks.
    • XPIs should be ready within hours of the checkin of the fix.
  • Announce mailing list: A mailing list, to which everybody can subscribe (but to which only the security group members and the release team can post), is created and advertised in prominent places where all users should see it. It announces
    1. critical security bugs and workarounds as they are found
      (to be done by the response team right after the evaluation, with disclaimers about the correctness of the information)
    2. binary fixes for those bugs
      (to be written by the response team and sent by the response or release team)
  • Bugzilla flags: A flag for critical security bugs, which can be applied to both hidden and disclosed bugs.
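To make the skeleton XPI idea a bit more concrete, here is a rough sketch of what the install.js inside such a one-file XPI could look like, using the classic XPInstall calls (initInstall, getFolder, addFile, performInstall, cancelInstall). The display name, registration key, version and file path below are placeholders for illustration, not taken from any real fix:

  // install.js - skeleton for a one-DLL security-fix XPI (sketch only;
  // the display name, registration key, version and path are placeholders)
  var err = initInstall("Security fix (example)",     // display name
                        "/mozilla.org/security-fix",  // registration key
                        "1.4.0.1");                   // version of this fix
  if (err == SUCCESS) {
    // Drop the single rebuilt DLL into the components directory;
    // a patched chrome jar would go to getFolder("Chrome") instead.
    var componentsDir = getFolder("Components");
    err = addFile("/mozilla.org/security-fix",  // registered name
                  "1.4.0.1",                    // version
                  "components/fixed.dll",       // path inside the XPI
                  componentsDir,                // destination folder
                  null);                        // keep the original file name
    if (err == SUCCESS)
      performInstall();    // commit
    else
      cancelInstall(err);  // roll back on error
  } else {
    cancelInstall(err);
  }

Filling the skeleton for a concrete fix would then just mean dropping the freshly depend-compiled DLL (or re-zipped jar) into the XPI and bumping the version string.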
I would suggest putting this policy into place and operation within the next month.


I would probably volunteer for response and release teams.



[1] <http://www.beonex.com/communicator/version/0.8/add-ons/security/png-2002-08-11.html>
<http://www.beonex.com/communicator/version/0.8/add-ons/binaries/flash/>
* There's a small problem: I don't know how to determine which patches are already installed without exposing the patch status to attacker sites.
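As an illustration of the detection logic for such an update page, here is a sketch (not the actual Beonex script; the fix table and XPI file names are invented examples). It reads the Gecko version from the user agent and the OS from navigator.platform, then collects the applicable XPIs:

  // Sketch of the update page's client-side selection logic.
  var fixes = [
    { affects: ["1.4", "1.4.1"],           // releases containing the hole
      xpi: { win:   "example-fix-win.xpi",
             linux: "example-fix-linux.xpi",
             mac:   "example-fix-mac.xpi" } }
    // ... one entry per released binary fix
  ];

  function detectVersion() {
    // e.g. "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.4) Gecko/20030624"
    var m = navigator.userAgent.match(/rv:([\d.]+)/);
    return m ? m[1] : null;
  }

  function detectPlatform() {
    var p = navigator.platform;
    if (p.indexOf("Win") != -1) return "win";
    if (p.indexOf("Mac") != -1) return "mac";
    return "linux";  // treat everything else as Unix/Linux in this sketch
  }

  function applicableXPIs() {
    var version  = detectVersion();
    var platform = detectPlatform();
    var result   = [];
    for (var i = 0; i < fixes.length; i++) {
      for (var j = 0; j < fixes[i].affects.length; j++) {
        if (fixes[i].affects[j] == version) {
          result.push(fixes[i].xpi[platform]);
          break;
        }
      }
    }
    return result;
  }

The page would then offer a single link that hands all applicable XPIs to InstallTrigger.install() in one go - that's where the two clicks come from.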

------
Mail to security-group:
--
<http://translate.google.com/translate?hl=en&sl=de&u=http://www.heise.de/newsticker/meldung/45443>
<http://translate.google.com/translate?hl=en&sl=de&u=http://cert.uni-stuttgart.de/ticker/article.php?mid=1183>

The RUS-CERT at the University of Stuttgart, Germany, released an advisory warning that bugs in Mozilla are being fixed silently, without warning users beyond a useless "Several security-related bugs were fixed in 1.6". Linux distributors don't update their packages, leaving users exposed.

Quote heise:
"... comes to a devastating judgment about the security of the open-source browser"
then quoting the advisory:
"At the moment, at least where security is concerned, Mozilla is obviously no convincing alternative to the market leader."
which was later updated with:
"We merely wanted to point out that Mozilla isn't a solution either to the security problems that currently plague all clients"

A forum post (in response to the heise story) said that the background to the advisory is apparently a recent debate on the open Debian security mailing list about the fact that Debian stable still ships 1.0.0 (as I pointed out above).
--- Begin Message --- Thanks to Frank and Mitch for finally opening up the security bugs a bit more. It is certainly a big improvement over the current scheme, which didn't work at all. Also, many thanks for seriously considering my complaints about the policy and adjusting it in some places.

Nevertheless, the fact remains that the policy is far from my point of view on this matter. I'll summarize here, for the record. The main remaining problems are:

   * It is not guaranteed that users will be warned about all severe
     security bugs. In particular, there are classes of bugs which
     Mitch said he would not even want a warning about.
     (A "warning" here is a vague, public description of the bug
     (without reproduction info), which allows users to judge their
     risk and take counter-measures.)
   * It is unclear how much freedom distributors have when forwarding
     mozilla.org's warnings (those that *are* issued) to their users.
   * There is no guarantee that bugs will be fixed in a timely manner.
     My approach would have been to force the disclosure of unfixed bugs
     a certain time (e.g. 2 weeks) after reporting, with exceptions if
     it was not realistically possible to fix the bug during that time.
   * The time between a bug being fixed and fully disclosed might be
     regularly very long (half a year or more).


Although I am not comfortable with the policy, I will participate in the security bug group, if allowed to, because I don't have much to lose* by doing so and more to gain.
If my time and energy permit, I will try to act as a connection between the security-conscious people not in the security bug group and the group, and to act as a voice for openness within the group. However, my time and energy are limited, so please do not rely on me.


I invite everyone seriously interested in security to apply for membership in the security bug group, to help fix and evaluate the bugs, and to make a case for openness.

Ben



--- End Message ---
--- Begin Message --- Ben,
Thanks for your input and for agreeing to participate. Believe me when I say we did take your comments seriously. Let me sum up my responses to some of your points, for the record.



   * It is not guaranteed that users will be warned about all severe
     security bugs. In particular, there are classes of bugs which
     Mitch said he would not even want a warning about.

     (A "warning" here is a vague, public description of the bug
     (without reproduction info), which allows users to judge their
     risk and take counter-measures.)

I'm afraid we're going to have to agree to disagree on this point for now. Let's see what happens with the current policy for a few months. If I'm not getting pressure from Netscape to release less information, then maybe we can move towards more warnings. No promises, though. Loyalty to our respective organizations aside, I honestly think that releasing even a vague warning for every single bug that goes into the security group, even if there's no workaround, even if the bug isn't exploitable on its own, is not in the best interests of the vast majority of our users. That's my story and I'm stickin' to it.


We should at least agree that any disclosure by one distributor or security group member is the same as disclosure by all. Your earlier comments support this.


   * It is unclear how much freedom distributors have when forwarding
     mozilla.org's warnings (those that *are* issued) to their users.

My apologies - I thought that point was clear. You can use the warning posted to www.mozilla.org/projects/security/known-vulnerabilities.html. You can change the wording as you see fit, but you can't add any information. Again, disclosure by one is disclosure by all.



   * There is no guarantee that bugs will be fixed in a timely manner.
     My approach would have been to force the disclosure of unfixed bugs
     a certain time (e.g. 2 weeks) after reporting, with exceptions if
     it was not realistically possible to fix the bug during that time.

There is no guarantee that any bug will be fixed in a timely manner, period. Any fixed time limit is simply not reflective of reality. The beauty of open source is that you don't have to wait for Netscape to fix a bug - if that bug is important enough to you, you can fix it yourself or pay someone to fix it, in two weeks or two hours.


You might as well drop this point because it's completely unrealistic and we will never agree to it.


   * The time between a bug being fixed and fully disclosed might be
     regularly very long (half a year or more).

As you said to me on the phone, if we set an arbitrary time limit of one year, someone will come along and say "well, why not six months?" "why not one month?" It's a slippery slope. A fixed and arbitrary time limit is simply not necessary. Bugs will be opened to the public in a time frame that you and I both consider reasonable. As module owner I will see to that.



If my time and energy permit, I will try to act as a connection between the security-conscious people not in the security bug group and the group


Just make sure your 'connection' doesn't violate any confidentiality. When in doubt, ask the group first or ask that these "security-conscious people" be added to the group.


I invite everyone seriously interested in security to apply for membership in the security bug group and help fix and evaluate the bugs


I second that.
     -Mitch


--- End Message ---

