Re: [SC-L] Static Vs. Binary

2009-08-04 Thread John Steven
Pravir,

HA!  :D

(Knowing me, you can predict what I’m about to say)

YES, explaining what the tools will need to do correctly as they evolve into 
their next generation isn't useful to a practitioner on this list today.

 ...

But it is very important to understand, as a practitioner, what your tools 
aren't taking into account accurately; many organizations do little else than 
triage and report on tool results. For instance, when a particular tool says 
it supports a technology (such as Spring, or Spring MVC), what does that mean? 
Weekly, our consultants augment a list of things the [commercial tool they're 
using that day] doesn't do because it doesn't 'see' a config file, a property, 
some aspect that would have been present in the binary (or even the source 
code), etc.
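To make that blind spot concrete, here is a minimal hypothetical sketch (the class name and property key are invented for illustration, not taken from any real tool's findings): a security-relevant decision driven entirely by an external properties file. A scanner that never parses that file cannot model the dangerous branch, whether it reads source or binary.

```java
import java.io.StringReader;
import java.util.Properties;

// Hypothetical illustration: the security-relevant decision lives in
// configuration data, not in the code the scanner parses.
public class ConfigBlindSpot {

    // Returns whether TLS certificates will be verified at runtime.
    static boolean verifyCertificates(Properties config) {
        return Boolean.parseBoolean(config.getProperty("ssl.verify", "true"));
    }

    public static void main(String[] args) throws Exception {
        Properties config = new Properties();
        // Stands in for an external app.properties file the tool never 'sees'.
        config.load(new StringReader("ssl.verify=false\n"));
        System.out.println(verifyCertificates(config)); // prints: false
    }
}
```

Nothing in the class itself is flagged as insecure; the insecurity is in the data file the tool skipped.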

I'll accept that my advice, being targeted at the tool vendors themselves, isn't 
very useful to consumers of this list (it is for your new company though, eh?). 
But I think it is important, as a security practitioner building an assurance 
program within your org., to understand what the tools/techniques you use are 
finding (or disproving) the existence of within your applications' code bases. 
Increasingly, this will include what their notion of 'binary' is, as list 
participants begin to consume vendors' SaaS static analysis of binary services.


John Steven
Senior Director; Advanced Technology Consulting
Direct: (703) 404-5726 Cell: (703) 727-4034
Key fingerprint = 4772 F7F3 1019 4668 62AD  94B0 AE7F EEF4 62D5 F908

Blog: http://www.cigital.com/justiceleague
Papers: http://www.cigital.com/papers/jsteven

http://www.cigital.com
Software Confidence. Achieved.


On 7/30/09 10:57 PM, "Pravir Chandra"  wrote:

First, I generally agree that there are many factors that make the true and 
factual fidelity of static analysis really REALLY difficult.

However, I submit that by debating this point, you're belaboring the correct 
angle of survivable Neptunian atmospheric entry with people that don't 
generally value the benefit of flying humans past the moon.

The point being, if you're debating the minutiae of static analysis vis-a-vis 
compile-time optimizations, you're convincing people to let perfect be the 
enemy of good. There are few (if any) perfect technologies, but we use them
because they're needed and provide a ton of great value. Anyone who doubts this 
should glance at the device they're reading this on and imagine refusing to use 
it because it doesn't have perfect security (or reliability, or usability, 
etc.).

-Original Message-
From: John Steven 

Something occurred to me last night as I pondered where this discussion's
tendrils are taking us.

A point I only made implicitly is this. My earlier message, quoted below, read:

> All,
>
> The question of "Is my answer going to be high-enough resolution to support
> manual review?" or "...to support a developer fixing the problem?" comes down
> to "it depends".  And, as we all know, I simply can't resist an "it depends"
> kind of subtlety.
>
> Yes, Jim, if you're doing a pure JavaSE application, and you don't care about
> non-standards compilers (jikes, gcj, etc.), then the source and the binary are
> largely equivalent (at least in terms of resolution); Larry mentioned gcj.
> Ease of parsing, however, is a different story (for instance, actual
> dependencies are way easier to pull out of a binary than the source code,
> whereas stack-local variable names are easiest in source).
>
> Where you care about "a whole web application" rather than a pure-Java module,
> you have to concern yourself with JSP and all the other MVC technologies.
> Placing aside the topic of XML-based configuration files, you'll want to know
> what (container) your JSPs were compiled to target. In this case, source code
> is different than binary. Similar factors sneak themselves in across the Java
> platform.
>
> Then you've got the world of Aspect Oriented programming. Spring and a broader
> class of packages that use AspectJ to weave code into your application will
> dramatically change the face of your binary. To get the same resolution out of
> your source code, you must in essence 'apply' those point cuts yourself...
> Getting binary-quality resolution from source code therefore means predicting
> what transforms will occur at what point-cut locations. I doubt highly any
> source-based approach will get this thoroughly correct.
>
> Finally, from the perspective of dynamic analysis, one must consider the
> post-compiler transforms that occur. Java involves both JIT and Hotspot (using
> two hotspot compilers: client and server, each of which conducts different
> transforms), which neither binary nor source-code-based static analysis are
> likely to correctly predict or account for. The binary image that runs is
> simply not that which is fed to ClassLoader.defineClass() as a bytestream.
>
> ...and  (actually) finally, one of my favorite code-review techniques is to
> ask for both a .war/ear/jar

Re: [SC-L] IBM Acquires Ounce Labs, Inc.

2009-08-04 Thread Chris Wysopal

I wouldn't say that NTO Spider is a "sort of" dynamic web scanner. It is a top 
tier scanner that can battle head to head on false negative rate with the big 
conglomerates' scanners: IBM AppScan and HP WebInspect.  Larry Suto published 
an analysis a year ago, that certainly had some flaws (and was rightly 
criticized), but genuinely showed all three to be in the same league. I haven't 
seen a better head-to-head analysis conducted by anyone. A little bird 
whispered to me that we may see a new analysis by someone soon. 

As a group of security practitioners, it is amazing to me that we don't have 
more quantifiable testing, and that tools/services are just dismissed with 
anecdotal data.  I am glad NIST SATE '09 will soon be underway and, at least for static 
analysis tools, we will have unbiased independent testing. I am hoping for a 
big improvement over last year.  I especially like the category they are using 
for some flaws found as "valid but insignificant". Clearly they are improving 
based on feedback from SATE '08.

Veracode was the first company to offer static and dynamic (web) analysis, and 
we have been offering it for two years (announced Aug 8, 2007).  We deliver it 
as a service. If you have a .NET or Java web app, you cannot find a comparable 
solution from a single vendor today.

-Chris

-Original Message-
From: sc-l-boun...@securecoding.org [mailto:sc-l-boun...@securecoding.org] On 
Behalf Of Arian J. Evans
Sent: Tuesday, July 28, 2009 1:41 PM
To: Matt Fisher
Cc: Kenneth Van Wyk; Secure Coding
Subject: Re: [SC-L] IBM Acquires Ounce Labs, Inc.

Right now, officially, I think that is about it. IBM, Veracode, and
AoD (in Germany) claim they have this too.

As Mattyson mentioned, Veracode only does static binary analysis (no
source analysis). They offer "dynamic scanning", but I believe it is
using NTO Spider IIRC, which is a simplified scanner that targets
unskilled users, last I saw it.

At one point I believe Veracode was in discussions with SPI to use WI,
but since the Veracoders haunt this list I'll let them clarify what
they use if they want.

So IBM: soon.

Veracode: sort-of.

AoD: on paper

And more to come in short order no doubt. I think we all knew this was
coming sooner or later. Just a matter of "when".

The big guys have a lot of bucks to throw at this problem if they want
to, and pull off some really nice integrations. Be interesting to see
what they do, and how useful the integrations really are to
organizations.

-- 
Arian Evans





On Tue, Jul 28, 2009 at 9:29 AM, Matt Fisher wrote:
> Pretty much. HP/SPI has integrations as well, but I don't recall DevInspect 
> ever being a big hit.  Veracode does both, as well as static binary, but as 
> a SaaS model. Watchfire had a RAD integration as well IIRC, but it clearly must 
> not have had the share Ounce does.
>
> -Original Message-
> From: Prasad Shenoy 
> Sent: July 28, 2009 12:22 PM
> To: Kenneth Van Wyk 
> Cc: Secure Coding 
> Subject: Re: [SC-L] IBM Acquires Ounce Labs, Inc.
>
>
> Wow indeed. Does that make IBM the only vendor to offer both Static
> and Dynamic software security testing/analysis capabilities?
>
> Thanks & Regards,
> Prasad N. Shenoy
>
> On Tue, Jul 28, 2009 at 10:19 AM, Kenneth Van Wyk wrote:
>> Wow, big acquisition news in the static code analysis space announced today:
>>
>> http://news.prnewswire.com/DisplayReleaseContent.aspx?ACCT=104&STORY=/www/story/07-28-2009/0005067166&EDATE=
>>
>>
>> Cheers,
>>
>> Ken
>>
>> -
>> Kenneth R. van Wyk
>> KRvW Associates, LLC
>> http://www.KRvW.com
>>
>> (This email is digitally signed with a free x.509 certificate from CAcert.
>> If you're unable to verify the signature, try getting their root CA
>> certificate at http://www.cacert.org -- for free.)
>>
>>
>>
>>
>>
>>
>> ___
>> Secure Coding mailing list (SC-L) SC-L@securecoding.org
>> List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
>> List charter available at - http://www.securecoding.org/list/charter.php
>> SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
>> as a free, non-commercial service to the software security community.
>> ___
>>
>>

Re: [SC-L] IBM Acquires Ounce Labs, Inc.

2009-08-04 Thread Arian J. Evans
Chris -- Good point with Larry's paper. NTO Spider is, by design, a
simplified scanner for unskilled users, and I do not think it was
designed to be an effective tool for deep dynamic analysis of a web
application. It is, however, probably the best scanner on the market
for people who don't have the time or skill to configure dynamic
testing tools for their applications!

Larry Suto's paper reinforces this use-case for desktop webapp scanners:

"Each scanner was run in default mode and not tuned in any capacity to the 
application. The importance of this lies in how effective the default mode 
[is], so that scalability of scanning is not limited by manual intervention 
and setup procedures, which can be very time consuming. Second, in most cases 
it is simply unrealistic to spend much time with many applications."


While I definitely look forward to more objective data on this
subject, I do not think Suto's report is a good example of how
consultants or SaaS providers deliver "expert security analysis" when
they use dynamic testing automation. (I know this is not how I
ever did things.)

I think anyone who has experience with deep dynamic testing knows they
need automation tools with custom configuration ability, the ability
to record workflow, a framework to create custom tests, etc. I do not
believe NTO Spider offers any of these essential features. I believe
it was explicitly designed for unskilled users' use-cases. (A valid
and important market to be sure.)
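For readers who haven't used that class of tool, here is a minimal sketch of what "a framework to create custom tests" means in practice. The probe string and check logic are invented for illustration; no vendor's API is implied. A configurable scanner would fetch the page over HTTP; here the check runs against canned response bodies.

```java
// Hypothetical custom check: inject a unique marker and verify whether the
// application reflects it back unencoded (a crude reflected-XSS probe).
public class CustomReflectionCheck {

    static final String MARKER  = "probe-7f3a";
    static final String PAYLOAD = "<i>" + MARKER + "</i>";

    // True when the raw payload appears in the response without encoding.
    static boolean reflectsUnencoded(String responseBody) {
        return responseBody.contains(PAYLOAD);
    }

    public static void main(String[] args) {
        String vulnerable = "<html>You searched for <i>probe-7f3a</i></html>";
        String encoded    = "<html>You searched for &lt;i&gt;probe-7f3a&lt;/i&gt;</html>";
        System.out.println(reflectsUnencoded(vulnerable)); // prints: true
        System.out.println(reflectsUnencoded(encoded));    // prints: false
    }
}
```

The value of the framework is in letting a skilled user wire checks like this into recorded workflows, which a point-and-shoot scanner cannot do.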

Admittedly I could be very wrong here. The NTO guys are sharp folks
and I haven't seen Spider in a while.


> As a group of security practitioners it is amazing to me that we don't have 
> more quantifiable testing and tools/services are just dismissed with 
> anecdotal data.

Completely agreed. I prefer to back up my statements about dynamic
tools with hard, quantifiable data. Deprived of that, I tend to rely
on historical experience, if it is a subject I have enough experience
on within that problem domain.

If you recall, I used to do extensive testing and benchmarking of
dynamic testing tools across custom widgets I wrote and production
enterprise applications, and publish the results at OWASP and NIST
conferences and in "HE: Webapps 2nd Ed". Back then the quality of the
tools was very volatile; it changed significantly every release, and
from application to application you would test, so by the time you
vetted all your data it was almost obsolete. Hence, again, the
importance of customizable, manually guided tools when dealing with
new and bespoke applications.

I tried to tackle static analysis but became overwhelmed by the
challenge of setting up effective labs, and the huge array of static
analysis tools that were available.

Given that I now work on a dynamic testing platform, it would be
completely fair to accuse me of being "non-objective" when discussing
various vendors' dynamic testing tools -- and I would have to agree
with you. It doesn't make my statements any less valid, but I have to
throw that out there to be fair.


Ultimately you hit the need for objective data spot-on. I would be
lying if I didn't say that I would LOVE to see more head-on
benchmarking between static analysis technology vendors like Veracode,
Fortify, Ounce, Coverity, Klocwork, etc.

The problem I had in the past with benchmarks was the huge degree of
customization in each application I would test. While patterns emerge
that are almost always automatable to some degree, the technologies
almost always require hand care-and-feeding to get them to an
effective place. I think this notion of combining the tools with
qualified users is the true potential power of the SaaS solutions that
are coming to market.

I look forward to seeing the release of more objective analysis by
smarter minds than I, and am very impressed with how far things have
come since the simple tests I tried to run over the years.

$0.02. Cheers,

-- 
Arian Evans




On Tue, Aug 4, 2009 at 5:54 PM, Chris Wysopal wrote:
>
> I wouldn't say that NTO Spider is a "sort of" dynamic web scanner. It is a 
> top tier scanner that can battle head to head on false negative rate with the 
> big conglomerates' scanners: IBM AppScan and HP WebInspect.  Larry Suto 
> published an analysis a year ago, that certainly had some flaws (and was 
> rightly criticized), but genuinely showed all three to be in the same league. 
> I haven't seen a better head-to-head analysis conducted by anyone. A little 
> bird whispered to me that we may see a new analysis by someone soon.
>
> As a group of security practitioners it is amazing to me that we don't have 
> more quantifiable testing and tools/services are just dismissed with 
> anecdotal data.  I am glad NIST SATE '09 will soon be underway and, at least 
> for static analysis tools, we will have unbiased independent testing. I am 
> hoping for a big improvement over last year.  I especially like the category 
> they are using for some flaws found as "valid but insignificant". Clearly 
> they ar

Re: [SC-L] IBM Acquires Ounce Labs, Inc.

2009-08-04 Thread Arian J. Evans
Great answer, John. I especially like your point about web.xml.

This goes doubly for black-box testing. There would be a lot of
advantage to being able to get (and compare) these types of config
files today for dialing in BBB (Better Black Box vs. blind black box)
testing. I don't think anyone is doing this optimally now. I know I am
eager to find static analysis that can provide/guide my BBB testing
with more context. I definitely think we will see more of these
combined-services evolve in the future. It only makes sense,
especially given some of the context-sensitive framing considerations
in your response.
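As a sketch of what "static config guiding black-box testing" could look like, harvesting the `<url-pattern>` entries from a deployment descriptor gives a scanner a seed list of declared attack surface instead of a blind crawl. The web.xml content and class name below are invented for illustration.

```java
import java.io.ByteArrayInputStream;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

// Hypothetical sketch: pull <url-pattern> entries out of a web.xml so a
// black-box scanner starts from the declared surface, not a blind crawl.
public class WebXmlSurface {

    static List<String> urlPatterns(String webXml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(webXml.getBytes("UTF-8")));
        NodeList nodes = doc.getElementsByTagName("url-pattern");
        List<String> patterns = new ArrayList<String>();
        for (int i = 0; i < nodes.getLength(); i++) {
            patterns.add(nodes.item(i).getTextContent().trim());
        }
        return patterns;
    }

    public static void main(String[] args) throws Exception {
        String webXml =
            "<web-app>" +
            "  <servlet-mapping><url-pattern>/admin/*</url-pattern></servlet-mapping>" +
            "  <servlet-mapping><url-pattern>/api/orders</url-pattern></servlet-mapping>" +
            "</web-app>";
        System.out.println(urlPatterns(webXml)); // prints: [/admin/*, /api/orders]
    }
}
```

Comparing the patterns against any `<security-constraint>` entries in the same file would then highlight mapped-but-unprotected paths worth probing first.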

Thanks for the solid thoughts,

-- 
Arian Evans





On Wed, Jul 29, 2009 at 5:44 AM, John Steven wrote:
> All,
>
> The question of "Is my answer going to be high-enough resolution to support 
> manual review?" or "...to support a developer fixing the problem?" comes down 
> to "it depends".  And, as we all know, I simply can't resist an "it depends" 
> kind of subtlety.
>
> Yes, Jim, if you're doing a pure JavaSE application, and you don't care about 
> non-standards compilers (jikes, gcj, etc.), then the source and the binary 
> are largely equivalent (at least in terms of resolution); Larry mentioned 
> gcj.  Ease of parsing, however, is a different story (for instance, actual 
> dependencies are way easier to pull out of a binary than the source code, 
> whereas stack-local variable names are easiest in source).
>
> Where you care about "a whole web application" rather than a pure-Java 
> module, you have to concern yourself with JSP and all the other MVC 
> technologies. Placing aside the topic of XML-based configuration files, 
> you'll want to know what (container) your JSPs were compiled to target. In 
> this case, source code is different than binary. Similar factors sneak 
> themselves in across the Java platform.
>
> Then you've got the world of Aspect Oriented programming. Spring and a 
> broader class of packages that use AspectJ to weave code into your 
> application will dramatically change the face of your binary. To get the same 
> resolution out of your source code, you must in essence 'apply' those point 
> cuts yourself... Getting binary-quality resolution from source code  
> therefore means predicting what transforms will occur at what point-cut 
> locations. I doubt highly any source-based approach will get this thoroughly 
> correct.
>
> Finally, from the perspective of dynamic analysis, one must consider the 
> post-compiler transforms that occur. Java involves both JIT and Hotspot 
> (using two hotspot compilers: client and server, each of which conducts 
> different transforms), which neither binary nor source-code-based static 
> analysis are likely to correctly predict or account for. The binary image 
> that runs is simply not that which is fed to ClassLoader.defineClass() as a 
> bytestream.
>
> ...and  (actually) finally, one of my favorite code-review techniques is to 
> ask for both a .war/ear/jar file AND the source code. This almost invariably 
> gets a double-take, but it's worth the trouble. How many times do you think 
> the web.xml files match between the two? What exposure might you report if 
> they were identical? ... What might you test for if they're dramatically different?
>
> Ah... Good times,
> 
> John Steven
> Senior Director; Advanced Technology Consulting
> Direct: (703) 404-5726 Cell: (703) 727-4034
> Key fingerprint = 4772 F7F3 1019 4668 62AD  94B0 AE7F EEF4 62D5 F908
>
> Blog: http://www.cigital.com/justiceleague
> Papers: http://www.cigital.com/papers/jsteven
>
> http://www.cigital.com
> Software Confidence. Achieved.
>
>
> On 7/28/09 4:36 PM, "ljknews"  wrote:
>
> At 8:39 AM -1000 7/28/09, Jim Manico wrote:
>
>> A quick note, in the Java world (obfuscation aside), the source and
>> "binary" are really the same thing. The fact that Fortify analyzes
>> source and Veracode analyzes class files is a fairly minor detail.
>
> It seems to me that would only be true for those using a
> Java bytecode engine, not those using a Java compiler that
> creates machine code.
>
