
A Time to Patch II: Mozilla



#1
coachwife6

    SuperStar

  • Retired Staff
  • 11,413 posts
A Time to Patch II: Mozilla

Brian Krebs on Computer Security
Washington Post
Feb. 7, 2006

A few weeks back, Security Fix published figures showing just how long it takes for Microsoft to issue security updates for flaws it learned about through private or public disclosure. I don't recall receiving as much feedback from readers after any other blog post, and many who wrote in wondered whether we were going to conduct the same analysis for other major software vendors.

Today's post marks the second in what I hope will be a series of similar analyses. This one looks back over a similar three-year period to see how long it took Mozilla to issue patches for self-assigned "critical" security holes in its various open source products, including the Mozilla Suite, the Firefox Web browser, and its Thunderbird e-mail software.

I went into this project well aware of the difficulties that could arise in trying to compare patching processes of very different software developers, but a couple of thoughts kept me going. First off, the data were there, and I couldn't find evidence of anyone having gone to the trouble of pulling it all together neatly. Also, I wanted to better understand the patch process from the open-source community's perspective.

At first, I thought this would be a simple project compared with compiling the data for the Microsoft time-to-patch charts (which took about six weeks). In all, my correspondence with Mozilla staff and examination of Mozilla's various vulnerabilities pages, release announcements and individual bug reports took about four weeks.

I found that over the past three years, Mozilla took an average of about 37 days to issue patches for critical security problems in its products. Below are three spreadsheets detailing our findings for the past three years. The data is downloadable either as a Microsoft Excel file or a regular HTML file.

Download Mozilla.htm
Download mozilla.xls

I must insert a strong caveat here, however. The 37-day average is skewed mightily by a flaw found in various Mozilla products that potentially allowed malicious Web sites to trick users into accepting security dialog boxes -- a flaw which Mozilla took 674 days to fully remedy. This was a vulnerability that apparently existed in all browsers. (Microsoft got around to fixing an identical flaw -- which it labeled a "moderate" security risk -- in December.)

According to Chris Hofmann, Mozilla's director of engineering, the fix was delayed in part by concern that it could cause the browser to constantly pop up annoying alert dialog boxes. But Hofmann noted that the early beta releases of Firefox in March 2004 closed off the problem as originally defined by the researcher who discovered the flaw (Jesse Ruderman, who has since been hired as a full-time researcher at Mozilla).

With that flaw left out of the data, Mozilla took an average of 23 days to develop and incorporate fixes. Even this lower average does not give a clear picture of Mozilla's typical response time: over the past three years, Mozilla produced roughly one-third of its critical security updates less than 10 days after being notified of a potential problem.
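To illustrate how a single long-running bug can drag an average that far, here is a minimal sketch in Python using made-up, hypothetical day counts (not the actual Security Fix spreadsheet data) to compare the mean and median with and without a 674-day outlier:

    from statistics import mean, median

    # Hypothetical days-to-patch figures, chosen only to show how one
    # extreme value skews an average; these are NOT the Security Fix data.
    days_to_patch = [5, 8, 9, 12, 14, 21, 25, 30, 40, 55, 674]

    print("mean with outlier:   ", round(mean(days_to_patch), 1))      # ~81.2
    print("median with outlier: ", median(days_to_patch))              # 21
    print("mean without outlier:", round(mean(days_to_patch[:-1]), 1)) # ~21.9

The median barely moves, which is why quoting the average alone can overstate or understate a vendor's typical turnaround.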

Even with the 674-day-old flaw included in its time-to-patch figures, Mozilla fixed critical issues in its products much faster than did Microsoft, which -- over the past three years -- took more than three times as long on average to fix critical flaws in its Windows software.

Look back through the bug reports for each of the critical vulnerabilities we listed and you will see that Mozilla developers generally verified critical flaws and offered fixes for them just days after they were reported. But most of these fixes were useful only to experienced computer users and researchers lending a hand to improve the open-source software, people who typically have no trouble recompiling software fixes on the fly.

Tell the average Microsoft Windows user who has migrated from Internet Explorer over to Firefox that he needs to recompile the program in order to accommodate new security fixes and you’re likely to be met with a blank stare. Mozilla recognized this, and understood that the majority of these users would get security updates only when the company made a new version of the browser available.

In recognition that 2004 was most likely the first year in which a significant share of the company's new user base was coming from Windows users, Security Fix based each "date patch issued" entry for 2004 and 2005 on the release date of the next product update that incorporated the fix for that critical security vulnerability -- not the date on which a fix was available to developers. For 2003 critical Mozilla flaws, Security Fix relied on the times listed in the "date fixed" field for each critical flaw listed on the "Older Vulnerabilities in Mozilla Products" page.
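As a rough sketch of that bookkeeping (in Python, with hypothetical dates and field names; the actual spreadsheets may be laid out differently), the elapsed time for each flaw could be computed like this:

    from datetime import date

    def days_to_patch(reported: date, fix_landed: date, next_release: date) -> int:
        # 2004-2005 flaws: count until the next end-user release that shipped the fix.
        # 2003 flaws: count until the "date fixed" listed for that flaw.
        patched = next_release if reported.year >= 2004 else fix_landed
        return (patched - reported).days

    # Hypothetical example: reported 2004-06-01, fix landed 2004-06-10,
    # shipped in a point release on 2004-07-15 -> 44 days.
    print(days_to_patch(date(2004, 6, 1), date(2004, 6, 10), date(2004, 7, 15)))

The point is simply that the 2004 and 2005 figures measure the wait until ordinary users could get the fix, not the wait until a fix existed in the source tree.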

Mozilla also had relatively few cases of security researchers disclosing critical flaws to the public instead of privately to Mozilla; this happened only a couple of times in 2005 (Security Fix will release more data on Thursday that addresses this issue in much greater detail). Mozilla took an average of 16 days to release critical software updates after flaws were publicly reported, while Microsoft averaged two months in the same situations.

I can only speculate as to why Mozilla has had to deal with fewer flaws reported via full disclosure, but the folks over at Belgian security-auditing company Scanit.be pointed to a couple of possibilities last year when they posted "A Year of Bugs," which included some interesting statistics comparing the number of days Firefox and IE were vulnerable to known security flaws and malicious code in 2004:

"Mozilla is enjoying some advantages concerning the public disclosure of vulnerabilities. Security researchers seem to be more inclined to report vulnerabilities privately to the Mozilla development team rather than publish them immediately. This might be because the Mozilla project produces free open-source software, and being nice to it is considered A Good Thing, or possibly also because of Mozilla's Security Bug Bounty Program that pays $500 to users reporting critical security bugs."

Dan Veditz, another security researcher at Mozilla, said problems discovered by open-source community members are less at risk for early disclosure, but that "unconscionable delays in fixing bugs will get criticized in public, which is both embarrassing and may discourage future reporters from going the 'responsible disclosure' route with us."

"These reporters obviously fall into the 'responsible disclosure' camp, or they'd have gone public to start. But if they're not seeing progress that indicates we're upholding our end of the bargain, they could well go public, and then we've got a full-blown emergency," Veditz said.

Veditz and other Mozilla researchers found themselves in emergency mode in September when researcher Tom Ferris published his findings just four days after notifying Mozilla about a critical flaw in Firefox 1.0.6 that could let attackers take complete control of computers using the browser. The exploit code Ferris released was laughably simple (basically just a URL followed by a string of dashes), but the public disclosure nonetheless forced Mozilla to rapidly accelerate its fix process.

Microsoft's market reach (Windows is the operating system for roughly 90 percent of the world's computers, and IE still commands at least 85 percent of the browser market) has always made it Target #1 for virus writers and online criminal groups. Naturally, security researchers can also make a big splash by discovering and reporting problems in such widely used software.

But I wondered if there was something in the data we collected to suggest that open-source vendors react more nimbly than those that do not open their blueprints to researchers. These two time-to-patch data sets hardly represent an exhaustive search for a definitive answer to that question, but the differences between the two data sets are certainly stark.

While many open source advocates may consider it a truism that open source software vendors fix flaws faster than closed-source companies, I could find surprisingly little empirical data or studies to back that up.

While I was wrapping up this post, I came upon a study published last month by several researchers at Carnegie Mellon University in Pittsburgh that examined the degree to which public and private disclosure of software vulnerabilities affects how quickly vendors fix the problems. The researchers examined some 438 vulnerabilities in programs made by 325 software vendors and found the patching speed of open-source vendors was roughly 60 percent faster than that of the closed-source vendors looked at in the study.

One final note: This analysis is something of a work in progress. I have double-checked the facts and figures in this table several times (as have a few of the Mozilla team members) but that does not mean the charts are completely devoid of mistakes that I have somehow overlooked. If you find a discrepancy with any of the data linked to here, please do not hesitate to let me know, either through e-mail or via a comment on the blog. If I can verify your findings, I will not only fix it but publicly acknowledge your contribution on the blog.

http://blogs.washing...?referrer=email