Re: [SC-L] QASEC Announcement: Writing Software Security Test Cases

2007-01-08 Thread J. M. Seitz
This is great, and something I have previously incorporated into our own
cycle, as carving out a spot on our team for a dedicated "security engineer"
didn't seem to work. But by creating a process for including security
testing, abuse cases, etc., I was able to incorporate security without a big
hit to the team. This sold management on the fact that it can be a simple
and seamless process, and it was soon adopted. The other half of it is that
you have to be the person on the team who is always thinking in terms of
corner cases and worst-case scenarios, the one who aggravates the
development team the most.

I still find that at times you have to raise concerns and demonstrate
vulnerabilities by actually writing the POC exploits. One example was a
portable encrypted filesystem used to protect the data an application
relied on. No one understood that no matter how long the password was, nor
how "bank-level" the 256-bit encryption, the filesystem was still insecure
in its _implementation_! By writing an exploit using DLL injection and some
exported-function hooking, and by outputting the password into a plaintext
file, the eyes of many were opened. Little did anyone know that developing
a secure application means securing not only the code, but also its build
process, deployment, etc. But once again, the next time a major flaw comes
to light, you find yourself in the wee hours of the morning writing a POC
to prove just how bad it is.

The question to the list is: Can we ever get away from costly exploit
development? Has anyone developed techniques in reporting and disclosure
that allowed them to avoid a massive caffeine addiction?

JS

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
On Behalf Of [EMAIL PROTECTED]
Sent: Sunday, January 07, 2007 11:49 AM
To: sc-l@securecoding.org
Subject: [SC-L] QASEC Announcement: Writing Software Security Test Cases

I've just released an article about how the Quality Assurance phase of the
development cycle can incorporate security testing into a standard test
plan and make it part of the regular testing cycle.

Writing Software Security Test Cases: Putting security test cases into your
test plan http://www.qasec.com/cycle/securitytestcases.shtml


- Robert
[EMAIL PROTECTED]
http://www.cgisecurity.com/
http://www.qasec.com/
http://www.webappsec.org/
___
Secure Coding mailing list (SC-L) SC-L@securecoding.org List information,
subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
___



Re: [SC-L] Code Analysis Tool Bakeoff

2007-01-08 Thread John Steven
I think Gunnar hit a lot of the important points. Bakeoffs do provide
interesting data. I have a few slide decks which I've created to help
companies with this problem, and would be happy to provide them to anyone
willing to email me off-list. Of the items Gunnar listed, I find that
baking off tools helps organizations understand where they're going to
have to apply horsepower and money.

For instance, companies that purchase Coverity's Prevent seem to have
little trouble gaining penetration into their dev teams, even beyond the
initial pilot. Model tuning makes it easy to keep 'mostly effective' rules
in play while still reducing false positives. However, with that ease of
adoption and developer-driven results interpretation, orgs buy some
inflexibility in terms of later extensibility: Java support, now only in
beta, is sorely lacking, and the mechanisms by which one writes custom
checkers pose a stiff learning curve. Whereas when one adopts Fortify's
Source Analyzer, developer penetration will be _the_ problem unless the
piloting team bakes a lot of rule tuning into the product's configuration,
and results pruning into the usable model, prior to rollout. However,
later customization seems the easiest of any of the tools I'm familiar
with, and language and rules coverage seems, at the macro level,
consistently the most robust.

In contrast, it takes real experience to illuminate each tool's
differences in the accuracy department. Only a bakeoff that contains
_your_ organization's code can cut through the fog of what each vendor's
account manager will promise. The reason seems to be that how a lot of
these tools behave relative to each other (especially Prexis, K7, and
Source Analyzer) depends greatly on minute details of how they implement
their rules. At the end of the day, though, their technologies remain
shockingly similar (at least as compared to products from Coverity, Secure
Software, or Microsoft's internal PREfix).

For instance, in one bakeoff we found that (with particular open source C
code) Fortify's tool found more unique instances of overflows on
stack-based, locally declared buffers with offending locally declared
length-specifiers. However, Klocwork's tool was profoundly more accurate
in cases in which the overflow had similar properties but represented an
'off-by-one' error within a buffer declared as a fixed-length array.

Discussing tradeoffs in tool implementation at this level leads bakers
down a bevy of rabbit holes. Looking at tools to the extent Cigital does,
for a deep understanding of our clients' code and exactly how the tool is
helping or hurting, isn't _your_ goal. But by collecting data on seven
figures of your own code base, you can start to see which trends in your
programmers' coding practices play to which tools. This can, in fact, help
you make a better tool choice.


John Steven
Technical Director; Principal, Software Security Group
Direct: (703) 404-5726 Cell: (703) 727-4034
Key fingerprint = 4772 F7F3 1019 4668 62AD  94B0 AE7F
http://www.cigital.com
Software Confidence. Achieved.


On Jan 6, 2007, at 11:27 AM, Gunnar Peterson wrote:

>> 1. I haven't gotten a sense that a bakeoff matters. For example, if I
>> wanted to write a simple JSP application, it really doesn't matter
>> whether I use Tomcat, Jetty, Resin or BEA from a functionality
>> perspective; while each may have stuff the others don't, at the end of
>> the day they are all good enough. So is there really that much
>> difference in comparing, say, Fortify to OunceLabs or whatever other
>> tools exist in this space, vs simply choosing whichever one wants to
>> cut me the best deal (e.g. a site license for $99 a year :-)?
>>
>
> I recommend that companies do a bakeoff to determine
>
> 1. ease of integration with dev process - everyone's dev/build  
> process is
> slightly different
>
> 2. signal to noise ratio - is the tool finding high priority/high  
> impact
> bugs?
>
> 3.  remediation guidance - finding is great, fixing is better, how
> actionable and relevant is the remediation guidance?
>
> 4. extensibility - say you have a particular interface, like MQ Series
> for example, which has homegrown authN and authZ foo that you want the
> static analysis to determine is used correctly. How easy is it to
> build/check/enforce these rules?
>
> 5. roles - how easy is it to separate out roles/reports/functionality,
> like developer, ant jockey, and auditor?
>
> 6. software architecture span - your high-risk/high-priority apps are
> probably multi-tier with lots of integration points; how much visibility
> into those integration points and tiers does the static analysis tool
> give you? How easy is it to correlate across tiers and interfaces?
>





[SC-L] Magazines

2007-01-08 Thread McGovern, James F (HTSC, IT)
I learned through the grapevine that folks from Network Computing will be
doing an upcoming article comparing tools in the secure coding space. If
you are a vendor, it would be wise to make sure your marketing folks are
participating. The funny thing is that I wouldn't have expected it to
appear in such a publication. Having written for Java Developers Journal
in the past, I wouldn't be averse to pitching a similar story if vendors
here would be game as well.





Re: [SC-L] QASEC Announcement: Writing Software Security Test Cases

2007-01-08 Thread bugtraq
> This is great, and something I have previously incorporated into our own
> cycle, as carving out a spot on our team for a dedicated "security
> engineer" didn't seem to work. But by creating a process for including
> security testing, abuse cases, etc., I was able to incorporate security
> without a big hit to the team. This sold management on the fact that it
> can be a simple and seamless process, and it was soon adopted. The other
> half of it is that you have to be the person on the team who is always
> thinking in terms of corner cases and worst-case scenarios, the one who
> aggravates the development team the most.


Proving to management that this isn't an expensive decision is something
that I think will start to catch on. By making this part of the process,
if an issue is discovered you have already scoped out the additional time
needed to research and address it. QA has always aggravated development;
this isn't new :)

Regards,
- Robert
http://www.cgisecurity.com/
http://www.qasec.com/


