Re: [SC-L] Comparing Scanning Tools
Hi James,

The point is -- going back to your original question, "I wonder if budgets and the tools themselves are really causing more harm than helping in that enterprises will now think about trading off such tools vs the expense they cost" -- that the economic comparison needs to take into account not just the expense of the tool, developer productivity, and early-versus-late bug remediation, but also the cost of the breach itself when the bugs that are not dealt with are eventually exercised. So I don't care if you don't like the Gartner numbers; you can use others to weigh the cost of the breach (Ponemon's are actually higher), but whatever number you choose should be factored into your model to account for this. It is not much help if skipping scrubbing in lets your surgeons operate on more patients if they are killing them faster through the infections they cause.

-gp

On 6/8/06 9:15 AM, "McGovern, James F (HTSC, IT)" <[EMAIL PROTECTED]> wrote:

> Several thoughts:
>
> 1. I love it when industry analysts are perceived as being credible by
> throwing out financial costs for things they really don't have visibility
> into.
>
> 2. The VA lost data not due to insecure coding techniques but because an
> employee did not follow the rules on what data may be taken out of the
> building.
>
> 3. No industry analyst has ever attempted to quantify the cost vs. benefit
> of secure coding compared with other constraints. The quantification to
> date has only been the cliche that it is cheaper to fix X earlier in the
> lifecycle rather than later, in which X could be pretty much any system
> quality.
> -----Original Message-----
> From: Gunnar Peterson [mailto:[EMAIL PROTECTED]
> Sent: Thursday, June 08, 2006 9:28 AM
> To: McGovern, James F (HTSC, IT)
> Cc: Secure Mailing List
> Subject: Re: [SC-L] Comparing Scanning Tools
>
> Hi James,
>
> I think you are right to look at it as an economic issue, but the other
> factor to add into your model is not just the short-term impact on
> developer productivity (which is non-trivial), but also the long-term
> effects of deciding *not* to deal with the bugs you find.
>
> "Cleaning up data breach costs more than encryption
>
> Protecting customer records is much less expensive than paying for
> cleanup after a data breach or massive records loss, research company
> Gartner said. Gartner analyst Avivah Litan testified on identity theft
> at a Senate hearing held after the Department of Veterans Affairs lost
> 26.5 million vet identities. "A company with at least 10,000 accounts to
> protect can spend, in the first year, as little as $6 per customer
> account for just data encryption, or as much as $16 per customer account
> for data encryption, host-based intrusion prevention, and strong
> security audits combined," Litan said. "Compare [that] with an
> expenditure of at least $90 per customer account when data is
> compromised or exposed during a breach," she added. Litan recommended
> encryption as the first step enterprises and government agencies should
> take to protect customer/citizen data. If that's not feasible,
> organizations should deploy host-based intrusion prevention systems, she
> said, and/or conduct security audits to validate that the company or
> agency has satisfactory controls in place."
> http://www.techweb.com/wire/security/188702019
>
> Or, as Brian Chess once pointed out:
>
> "My favorite historical analogy this month is from medicine: it took
> *decades* between the time that researchers knew that fewer people died if
> surgeons washed their hands and the time that antisepsis was common in the
> medical community. That lag was entirely due to social factors: if it's
> 1840 and you've been successfully practicing medicine for decades, why
> would you want to change your routine? And yet imagine a modern-day
> surgeon who says "I'm really busy today, so I'm going to save time by not
> scrubbing in before I start the operation." It's simply unthinkable.
> Hopefully software development is headed in the same direction, but on an
> accelerated timetable."
>
> -gp
>
> *
> This communication, including attachments, is for the exclusive use of
> addressee and may contain proprietary, confidential and/or privileged
> information. If you are not the intended recipient, any use, copying,
> disclosure, dissemination or distribution is strictly prohibited. If you
> are not the intended recipient, please notify the sender immediately by
> return e-mail, delete this communication and destroy all copies.
> *
>
> ___
> Secure Coding mailing list (SC-L)
> SC-L@securecoding.org
> List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
> List charter available at - http://www.securecoding.org/list/charter.php
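The per-account figures from Litan's testimony quoted above lend themselves to a simple break-even sketch. This is a hypothetical model, not part of the testimony: the annual breach probabilities below are invented for illustration, and real models would need real incident data.

```python
# Break-even sketch using the per-account figures quoted above.
# The breach probabilities are made-up assumptions for illustration only.

COST_ENCRYPTION = 6   # $/account/year, data encryption only (Litan)
COST_FULL = 16        # $/account/year, encryption + HIPS + audits (Litan)
COST_BREACH = 90      # $/account cleanup cost when data is exposed (Litan)

def expected_cost(prevention_per_account, breach_probability):
    """Expected annual cost per account: prevention spend plus expected breach loss."""
    return prevention_per_account + breach_probability * COST_BREACH

# Hypothetical: 20% annual breach probability with no prevention,
# dropping to 2% once controls are in place.
do_nothing = expected_cost(0, 0.20)
encrypt    = expected_cost(COST_ENCRYPTION, 0.02)
full_stack = expected_cost(COST_FULL, 0.02)

print(f"do nothing: ${do_nothing:.2f}/account")   # $18.00
print(f"encryption: ${encrypt:.2f}/account")      # $7.80
print(f"full stack: ${full_stack:.2f}/account")   # $17.80
```

Under these invented assumptions even the full $16/account program roughly breaks even against doing nothing, and encryption alone wins clearly -- the point being that the breach cost term belongs in the model, whatever numbers you plug in.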
[SC-L] SOA and security
Hi all,

You may recall the article that I wrote with Scott Matsumoto and Jeremy Epstein about SOA security for IEEE S&P magazine. I recently did an interview with a SOA guy that stemmed from that article. It's available here:

http://soasecurityarchitect.com/2006/06/08/interview-with-gary-mcgraw-cto-of-cigital-inc.aspx

gem

Cigital www.cigital.com
Software Security www.swsec.com
Silver Bullet Podcast www.cigital.com/silverbullet

This electronic message transmission contains information that may be confidential or privileged. The information contained herein is intended solely for the recipient and use by any other party is not authorized. If you are not the intended recipient (or otherwise authorized to receive this message by the intended recipient), any disclosure, copying, distribution or use of the contents of the information is prohibited. If you have received this electronic message transmission in error, please contact the sender by reply email and delete all copies of this message. Cigital, Inc. accepts no responsibility for any loss or damage resulting directly or indirectly from the use of this email or its contents. Thank You.
RE: [SC-L] Comparing Scanning Tools
Several thoughts:

1. I love it when industry analysts are perceived as being credible by throwing out financial costs for things they really don't have visibility into.

2. The VA lost data not due to insecure coding techniques but because an employee did not follow the rules on what data may be taken out of the building.

3. No industry analyst has ever attempted to quantify the cost vs. benefit of secure coding compared with other constraints. The quantification to date has only been the cliche that it is cheaper to fix X earlier in the lifecycle rather than later, in which X could be pretty much any system quality.

-----Original Message-----
From: Gunnar Peterson [mailto:[EMAIL PROTECTED]
Sent: Thursday, June 08, 2006 9:28 AM
To: McGovern, James F (HTSC, IT)
Cc: Secure Mailing List
Subject: Re: [SC-L] Comparing Scanning Tools

Hi James,

I think you are right to look at it as an economic issue, but the other factor to add into your model is not just the short-term impact on developer productivity (which is non-trivial), but also the long-term effects of deciding *not* to deal with the bugs you find.

"Cleaning up data breach costs more than encryption

Protecting customer records is much less expensive than paying for cleanup after a data breach or massive records loss, research company Gartner said. Gartner analyst Avivah Litan testified on identity theft at a Senate hearing held after the Department of Veterans Affairs lost 26.5 million vet identities. "A company with at least 10,000 accounts to protect can spend, in the first year, as little as $6 per customer account for just data encryption, or as much as $16 per customer account for data encryption, host-based intrusion prevention, and strong security audits combined," Litan said. "Compare [that] with an expenditure of at least $90 per customer account when data is compromised or exposed during a breach," she added. Litan recommended encryption as the first step enterprises and government agencies should take to protect customer/citizen data.
If that's not feasible, organizations should deploy host-based intrusion prevention systems, she said, and/or conduct security audits to validate that the company or agency has satisfactory controls in place."

http://www.techweb.com/wire/security/188702019

Or, as Brian Chess once pointed out:

"My favorite historical analogy this month is from medicine: it took *decades* between the time that researchers knew that fewer people died if surgeons washed their hands and the time that antisepsis was common in the medical community. That lag was entirely due to social factors: if it's 1840 and you've been successfully practicing medicine for decades, why would you want to change your routine? And yet imagine a modern-day surgeon who says "I'm really busy today, so I'm going to save time by not scrubbing in before I start the operation." It's simply unthinkable. Hopefully software development is headed in the same direction, but on an accelerated timetable."

-gp
RE: [SC-L] Comparing Scanning Tools
Hi All,

Just a quick reminder that there is a chapter on code scanning technology and its application in "Software Security" (www.swsec.com). Don't forget that these tools are best used as aids to make a smart human more efficient. They do not replace the human, nor are they of much use among the clueless. Every commercial tool has its issues, but the free tools ITS4, RATS, and flawfinder are no longer worth using at all given recent tool evolution. The chapter in "Software Security" discusses the history of these tools and how they actually work, and points to research in academia so you know where they're headed. There are also pointers to most of the commercial tools.

We have found in our practice at Cigital that the most powerful applications of these tools involve developing specific, tailored coding guidelines for a given platform (say J2EE), building those guidelines so that they just happen to cohere with security policy (shhh, tell no one), and then enforcing the guidelines by adding rules to a static analysis tool.

Another tip: don't use the tools with all of the default rules turned on at once. Carefully turn rules on and off, and feed the results into dev along with training. Use the tools as part of awareness and enforcement activities.

gem

Cigital www.cigital.com
Software Security www.swsec.com
Silver Bullet www.cigital.com/silverbullet

-----Original Message-----
From: [EMAIL PROTECTED] on behalf of [EMAIL PROTECTED]
Sent: Wed 6/7/2006 4:34 PM
To: [EMAIL PROTECTED]
Cc: sc-l@securecoding.org
Subject: Re: [SC-L] Comparing Scanning Tools

| Date: Mon, 5 Jun 2006 16:50:17 -0400
| From: "McGovern, James F (HTSC, IT)" <[EMAIL PROTECTED]>
| To: sc-l@securecoding.org
| Subject: [SC-L] Comparing Scanning Tools
|
| The industry analyst take on tools tends to be slightly different than
| that of software practitioners at times. Curious if anyone has looked at
| Fortify and has formed any positive / negative / neutral opinions on this
| tool and others...
We evaluated a couple of static code scanning tools internally. The following is an extract from an analysis I did. I've deliberately omitted comparisons - you want to know about Fortify, not how it compares to other products (which raises a whole bunch of other issues) - and included the text below. Standard disclaimers: this is not EMC's position; it's my personal take.

Caveats: This analysis is based on a 3-hour vendor presentation. The presenter may have made mistakes, and I certainly don't claim that my recall of what he said is error-free. A later discussion with others familiar with Fortify indicated that the experience we had is typical, but is not necessarily the right way to evaluate the tool. Effective use of Fortify requires building a set of rules appropriate to a particular environment, method of working, constraints, etc. This takes significant time (6 months to a year) and effort, but it was claimed that once you've put in the effort, Fortify is a very good security scanner. I am not in a position to evaluate that claim myself.

BTW, one thing not called out below is that Fortify can be quite slow. Our experience in testing was that a Fortify scan took about twice as long as a C++ compile/link cycle, unless you add "data flow" analysis - in which case the time is much, much larger.

The brief summary: in my personal view, Fortify is a worthwhile tool, but it would not be my first choice. (Given the opportunity to choose two tools, it would probably be my second.) Others involved in the evaluation reached the opposite conclusion and rated Fortify first.

-- Jerry

Fortify

Fortify is aimed at use in security audits. It is deliberately biased in the direction of flagging all potential security issues. It provides two kinds of analysis - what they call "semantic" and "data flow". Neither use of terminology is consistent with industry practice.
Their "semantic" analysis is better described as "syntactic" analysis: it looks at surface features of the program (use of certain calls, for example) and mainly ignores context. Fortify's own representative described this analysis as a "super grep". This analysis is driven by a large database of rules, which can be extended. (In industry practice, a semantic analysis would look deeply at the meaning of the program.) "Data flow" analysis is better called "taint analysis". It tr
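The distinction Jerry draws - a rule-driven "super grep" versus data-flow (taint) tracking - can be illustrated with a minimal sketch. This is not Fortify's engine; the rules, source names, and sink names below are invented for illustration:

```python
import re

# 1. "Semantic" (really syntactic, "super grep") pass: flag calls by
#    surface pattern, ignoring all context -- cheap, extensible via a
#    rule database, and prone to over-reporting.
SYNTACTIC_RULES = {
    r"\bgets\s*\(": "gets(): unbounded read",
    r"\bstrcpy\s*\(": "strcpy(): no bounds check",
}

def syntactic_scan(source):
    findings = []
    for lineno, line in enumerate(source.splitlines(), 1):
        for pattern, message in SYNTACTIC_RULES.items():
            if re.search(pattern, line):
                findings.append((lineno, message))
    return findings

# 2. "Data flow" (taint) pass: track variables assigned from untrusted
#    sources and warn when one reaches a sensitive sink. Real taint
#    analysis is interprocedural and far more expensive, consistent with
#    the much longer scan times noted above when data flow is enabled.
SOURCES = {"read_input", "getenv"}   # invented source names
SINKS = {"system", "exec_query"}     # invented sink names

def taint_scan(source):
    tainted, findings = set(), []
    for lineno, line in enumerate(source.splitlines(), 1):
        m = re.match(r"\s*(\w+)\s*=\s*(\w+)\(", line)
        if m and m.group(2) in SOURCES:
            tainted.add(m.group(1))  # this variable now carries tainted data
        for sink in SINKS:
            call = re.search(rf"\b{sink}\s*\(\s*(\w+)", line)
            if call and call.group(1) in tainted:
                findings.append((lineno, f"tainted data reaches {sink}()"))
    return findings

sample = "cmd = read_input()\nsystem(cmd)\nstrcpy(buf, cmd)\n"
print(syntactic_scan(sample))  # [(3, 'strcpy(): no bounds check')]
print(taint_scan(sample))      # [(2, 'tainted data reaches system()')]
```

The syntactic pass flags strcpy regardless of whether the copy is actually dangerous; only the taint pass connects the untrusted input on line 1 to the sink on line 2.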
Re: [SC-L] Comparing Scanning Tools
Hi James,

I think you are right to look at it as an economic issue, but the other factor to add into your model is not just the short-term impact on developer productivity (which is non-trivial), but also the long-term effects of deciding *not* to deal with the bugs you find.

"Cleaning up data breach costs more than encryption

Protecting customer records is much less expensive than paying for cleanup after a data breach or massive records loss, research company Gartner said. Gartner analyst Avivah Litan testified on identity theft at a Senate hearing held after the Department of Veterans Affairs lost 26.5 million vet identities. "A company with at least 10,000 accounts to protect can spend, in the first year, as little as $6 per customer account for just data encryption, or as much as $16 per customer account for data encryption, host-based intrusion prevention, and strong security audits combined," Litan said. "Compare [that] with an expenditure of at least $90 per customer account when data is compromised or exposed during a breach," she added. Litan recommended encryption as the first step enterprises and government agencies should take to protect customer/citizen data. If that's not feasible, organizations should deploy host-based intrusion prevention systems, she said, and/or conduct security audits to validate that the company or agency has satisfactory controls in place."

http://www.techweb.com/wire/security/188702019

Or, as Brian Chess once pointed out:

"My favorite historical analogy this month is from medicine: it took *decades* between the time that researchers knew that fewer people died if surgeons washed their hands and the time that antisepsis was common in the medical community. That lag was entirely due to social factors: if it's 1840 and you've been successfully practicing medicine for decades, why would you want to change your routine?
And yet imagine a modern-day surgeon who says "I'm really busy today, so I'm going to save time by not scrubbing in before I start the operation." It's simply unthinkable. Hopefully software development is headed in the same direction, but on an accelerated timetable."

-gp

On 6/7/06 4:08 PM, "McGovern, James F (HTSC, IT)" <[EMAIL PROTECTED]> wrote:

> Thanks for the response. One of the things that I have been struggling to
> understand is not the importance of using such a tool, as I believe they
> provide value, but whether these tools are financially sustainable.
>
> Many large enterprises nowadays outsource development to third parties.
> Likewise, the mindset in terms of budgeting tends to eschew "per developer
> seat" tool purchases. Nowadays it is rare to find an enterprise that is
> not using free tools such as Eclipse instead of paying for IDEs.
>
> I have yet to find a large enterprise that has made a significant
> investment in such tools. I wonder if budgets and the tools themselves are
> really causing more harm than helping, in that enterprises will now think
> about trading off such tools vs. the expense they cost.
>
> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
> Sent: Wednesday, June 07, 2006 4:34 PM
> To: McGovern, James F (HTSC, IT)
> Cc: sc-l@securecoding.org
> Subject: Re: [SC-L] Comparing Scanning Tools
>
> | Date: Mon, 5 Jun 2006 16:50:17 -0400
> | From: "McGovern, James F (HTSC, IT)" <[EMAIL PROTECTED]>
> | To: sc-l@securecoding.org
> | Subject: [SC-L] Comparing Scanning Tools
> |
> | The industry analyst take on tools tends to be slightly different than
> | software practitioners at times. Curious if anyone has looked at Fortify
> | and has formed any positive / negative / neutral opinions on this tool
> | and others...
>
> We evaluated a couple of static code scanning tools internally. The
> following is an extract from an analysis I did.
> I've deliberately omitted comparisons - you want to know about Fortify,
> not how it compares to other products (which raises a whole bunch of
> other issues) - and included the text below. Standard disclaimers: this
> is not EMC's position; it's my personal take.
>
> Caveats: This analysis is based on a 3-hour vendor presentation. The
> presenter may have made mistakes, and I certainly don't claim that my
> recall of what he said is error-free. A later discussion with others
> familiar with Fortify indicated that the experience we had is typical,
> but is not necessarily the right way to evaluate the tool. Effective
> use of Fortify requires building a set of rules appropriate to a
> particular environment, method of working, constraints, etc.
> This takes significant time (6 months to a year) and effort, but
> it was claimed that once you've put in the effort, Fortify is a
> very good security scanner. I am not in a position to evaluate that
> claim myself.
>
> BTW, one thing not called out below is that Fortify can be quite slow.
> Our experience in testing was that a Fortify scan took about twice as
> long as a C++ compi