Pravir,

HA!  :D

(Knowing me, you can predict what I’m about to say)

YES, explaining what the tools will need to do correctly as they evolve into 
their next generation isn't useful to a practitioner on this list today.

 ...

But it is very important to understand, as a practitioner, what your tools 
aren't taking into account accurately; many organizations do little else than 
triage and report on tool results. For instance, when a particular tool says 
it supports a technology (such as Spring, or Spring MVC), what does that mean? 
Weekly, our consultants augment a list of things the [commercial tool they're 
using that day] doesn't do because it doesn't 'see' a config file, a property, 
some aspect that would have been present in the binary (or even the source 
code), etc...
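
To make that concrete, here's a tiny invented sketch (class names and wiring 
are hypothetical, not from any engagement): nothing in the compiled classes 
below is dangerous on its own; whether output encoding happens at all is 
decided by a config file the tool may never parse.

    // Which Sanitizer runs is decided by external wiring -- e.g. a Spring
    // XML file containing <bean id="sanitizer" class="NoOpSanitizer"/> --
    // so a scan that never reads that file can't know output goes unencoded.
    interface Sanitizer { String clean(String s); }

    class HtmlEncodingSanitizer implements Sanitizer {
        public String clean(String s) {
            return s.replace("<", "&lt;").replace(">", "&gt;");
        }
    }

    class NoOpSanitizer implements Sanitizer {      // the impl the XML picks
        public String clean(String s) { return s; }
    }

    class Page {
        private final Sanitizer sanitizer;          // injected by the container
        Page(Sanitizer sanitizer) { this.sanitizer = sanitizer; }
        String render(String userInput) {
            return "<p>" + sanitizer.clean(userInput) + "</p>";
        }
    }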

I'll accept that advice targeted at the tool vendors themselves isn't very 
useful to consumers of this list (it is for your new company though, eh?), 
but I think it is important, as a security practitioner building an assurance 
program within your org., to understand what your tools/techniques actually 
find (or disprove) the existence of within your applications' code bases. 
Increasingly, this will include understanding what their notion of 'binary' 
is, as list participants begin to consume vendors' SaaS static-analysis-of-
binaries services.

----
John Steven
Senior Director; Advanced Technology Consulting
Direct: (703) 404-5726 Cell: (703) 727-4034
Key fingerprint = 4772 F7F3 1019 4668 62AD  94B0 AE7F EEF4 62D5 F908

Blog: http://www.cigital.com/justiceleague
Papers: http://www.cigital.com/papers/jsteven

http://www.cigital.com
Software Confidence. Achieved.


On 7/30/09 10:57 PM, "Pravir Chandra" <chan...@list.org> wrote:

First, I generally agree that there are many factors that make true and 
factual fidelity in static analysis really, REALLY difficult.

However, I submit that by debating this point, you're belaboring the correct 
angle of survivable Neptunian atmospheric entry with people who don't 
generally value the benefit of flying humans past the moon.

The point being: if you're debating the minutiae of static analysis vis-a-vis 
compile-time optimizations, you're convincing people to let perfect be the 
enemy of good. There are few (if any) perfect technologies, but we use them 
because they're needed and provide a ton of great value. Anyone who doubts 
this should glance at the device they're reading this on and imagine refusing 
to use it because it doesn't have perfect security (or reliability, or 
usability, etc.).

-----Original Message-----
From: John Steven <jste...@cigital.com>

Something occurred to me last night as I pondered where this discussion's
tendrils are taking us.

A point I only made implicitly is this: the question, for years, has been
"conduct your SA on source code or binary?" You can see that there are
interesting subtleties even in those languages that target intermediate
representational formats (like Java and the .NET family of languages, which
compile to MSIL). The garbage-collection-optimization problems that plague
those asking "How do I assure password String cleanup in Java?" are of the
same ilk as the gcc optimizations that trouble the C/C++ realm.
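
For anyone who hasn't hit the String-cleanup problem first-hand, here's a
quick sketch of the usual best-effort workaround, and why it is only best
effort:

    // String is immutable, so a password held in a String can never be
    // zeroed; the standard advice is char[] plus an explicit wipe. Even
    // that is best-effort: the collector may already have copied the array
    // during a GC cycle, and JIT-optimized code may hold further copies.
    import java.io.Console;
    import java.util.Arrays;

    public class PasswordWipe {
        public static void main(String[] args) {
            Console console = System.console(); // null if no terminal attached
            char[] password = console.readPassword("Password: ");
            try {
                // authenticate(password);    // hypothetical use of the secret
            } finally {
                Arrays.fill(password, '\0');  // wipe our copy; others may survive
            }
        }
    }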

Yes, this question is still pertinent. It _is_ interesting to those looking
for thorough/sound analysis to consider fidelity and resolution at this
level. People are beginning to echo what I've been saying for years, "this
problem extends beyond the initial compile into the runtime optimizations
and runtime compilers". My previous post reiterates that there's a lot more
to it than most people consider.

I think I allowed that clarification to muddle my more strategic point:

   -----------------
    Whereas THE question used to be source code vs. binary representation,
    the question is NOW: "What set of IOC-container/XML combos,
    aspect weaver results, method/class-level annotations, and other such
    tomfoolery governs the execution of my application beyond what the
     compiler initially output?"
   -----------------
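
As a toy illustration of that question (invented for this post; not any real
framework's API): the compiled body of the method below never mentions an
access-control check. The check exists only because a "container" reflects on
the annotation at runtime.

    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.reflect.Method;

    public class MiniContainer {
        @Retention(RetentionPolicy.RUNTIME)
        @interface RequiresRole { String value(); }

        @RequiresRole("admin")
        public void deleteAllUsers() { System.out.println("deleted"); }

        // The "container": the security decision lives here, not in the
        // compiled bytes of deleteAllUsers().
        static void invoke(Object target, String method, String callerRole)
                throws Exception {
            Method m = target.getClass().getMethod(method);
            RequiresRole rr = m.getAnnotation(RequiresRole.class);
            if (rr != null && !rr.value().equals(callerRole))
                throw new SecurityException(callerRole + " lacks " + rr.value());
            m.invoke(target);
        }

        public static void main(String[] args) throws Exception {
            invoke(new MiniContainer(), "deleteAllUsers", "guest"); // throws
        }
    }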

As Fortify, Veracode, and others fight out this 'static analysis on binaries
via SaaS' battle, they and the organizations they serve would do well to
keep this question in mind... or risk the same failures that the current
crop of parser-based static-analysis tools face against dynamic approaches.

On 7/29/09 8:44 AM, "John Steven" <jste...@cigital.com> wrote:

> All,
>
> The question of "Is my answer going to be high-enough resolution to support
> manual review?" or "...to support a developer fixing the problem?" comes down
> to "it depends". And, as we all know, I simply can't resist an "it depends"
> kind of subtlety.
>
> Yes, Jim, if you're doing a pure JavaSE application, and you don't care about
> non-standard compilers (jikes, gcj, etc.), then the source and the binary are
> largely equivalent (at least in terms of resolution)... Larry mentioned gcj.
> Ease of parsing, however, is a different story (for instance, actual
> dependencies are way easier to pull out of a binary than the source code,
> whereas stack-local variable names are easiest in source).
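>
> (To make the dependency point concrete, a short sketch against the ASM
> bytecode library -- the class name is mine, not any tool's: it lists every
> class a compiled .class file calls into, a task that would need full
> name/type resolution to do from source.)
>
>     import java.nio.file.Files;
>     import java.nio.file.Paths;
>     import java.util.Set;
>     import java.util.TreeSet;
>     import org.objectweb.asm.ClassReader;
>     import org.objectweb.asm.ClassVisitor;
>     import org.objectweb.asm.MethodVisitor;
>     import org.objectweb.asm.Opcodes;
>
>     // Usage: java DepDump Foo.class -- prints the classes Foo calls into.
>     public class DepDump {
>         public static void main(String[] args) throws Exception {
>             Set<String> deps = new TreeSet<>();
>             new ClassReader(Files.readAllBytes(Paths.get(args[0])))
>                 .accept(new ClassVisitor(Opcodes.ASM9) {
>                     @Override
>                     public MethodVisitor visitMethod(int acc, String name,
>                             String desc, String sig, String[] exc) {
>                         return new MethodVisitor(Opcodes.ASM9) {
>                             @Override
>                             public void visitMethodInsn(int op, String owner,
>                                     String mname, String mdesc, boolean itf) {
>                                 deps.add(owner.replace('/', '.'));
>                             }
>                         };
>                     }
>                 }, 0);
>             deps.forEach(System.out::println);
>         }
>     }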
>
> Where you care about "a whole web application" rather than a pure-Java module,
> you have to concern yourself with JSP and all the other MVC technologies.
> Setting aside the topic of XML-based configuration files, you'll want to know
> which container your JSPs were compiled to target. In this case, source code
> is different from binary. Similar factors sneak in across the Java
> platform.
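>
> (A rough, hand-written stand-in for what a JSP compiler emits for a line
> like <%= request.getParameter("q") %> -- real generated code is messier and
> container-specific, but the point stands: the reflected-XSS sink below
> exists only in the generated/compiled form, never in the .jsp source.)
>
>     import java.io.IOException;
>     import javax.servlet.http.HttpServlet;
>     import javax.servlet.http.HttpServletRequest;
>     import javax.servlet.http.HttpServletResponse;
>
>     public class IndexJspStandIn extends HttpServlet {
>         @Override
>         protected void doGet(HttpServletRequest req, HttpServletResponse resp)
>                 throws IOException {
>             resp.setContentType("text/html");
>             resp.getWriter().print(req.getParameter("q")); // unescaped echo
>         }
>     }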
>
> Then you've got the world of Aspect-Oriented Programming. Spring and a broader
> class of packages that use AspectJ to weave code into your application will
> dramatically change the face of your binary. To get the same resolution out of
> your source code, you must in essence 'apply' those pointcuts yourself...
> Getting binary-quality resolution from source code therefore means predicting
> what transforms will occur at which pointcut locations. I highly doubt any
> source-based approach will get this thoroughly correct.
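>
> (For instance, a minimal annotation-style AspectJ aspect -- package and
> pointcut invented -- that splices an audit call into every public service
> method. The woven binary contains the call; the service source never will.)
>
>     import org.aspectj.lang.JoinPoint;
>     import org.aspectj.lang.annotation.Aspect;
>     import org.aspectj.lang.annotation.Before;
>
>     @Aspect
>     public class AuditAspect {
>         // Woven by ajc (or load-time weaving) into every public method of
>         // every type under the hypothetical com.example.service package.
>         @Before("execution(public * com.example.service..*.*(..))")
>         public void audit(JoinPoint jp) {
>             System.out.println("AUDIT: " + jp.getSignature());
>         }
>     }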
>
> Finally, from the perspective of dynamic analysis, one must consider the
> post-compiler transforms that occur. Java involves both JIT and HotSpot
> optimization (using two HotSpot compilers, client and server, each of which
> conducts different transforms), which neither binary- nor source-code-based
> static analysis is likely to correctly predict or account for. The binary
> image that runs is simply not that which was fed to ClassLoader.defineClass()
> as a bytestream.
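>
> (To watch that gap first-hand, a minimal java.lang.instrument agent -- the
> class name is mine -- run with -javaagent:agent.jar. The byte[] it sees,
> and could rewrite, is what the VM defines; the JIT then transforms that
> again before anything executes.)
>
>     import java.lang.instrument.ClassFileTransformer;
>     import java.lang.instrument.Instrumentation;
>     import java.security.ProtectionDomain;
>
>     public class DefineWatchAgent {
>         public static void premain(String args, Instrumentation inst) {
>             inst.addTransformer(new ClassFileTransformer() {
>                 @Override
>                 public byte[] transform(ClassLoader loader, String name,
>                         Class<?> redefined, ProtectionDomain pd, byte[] buf) {
>                     System.err.println("defining " + name
>                             + " (" + buf.length + " bytes)");
>                     return null; // null = unchanged; non-null replaces bytes
>                 }
>             });
>         }
>     }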
>
> ...and (actually) finally, one of my favorite code-review techniques is to
> ask for both a .war/.ear/.jar file AND the source code. This almost invariably
> gets a double-take, but it's worth the trouble. How many times do you think
> the web.xml files match between the two? What exposure might you report if
> they were identical? ...What might you test for if they're dramatically
> different?
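>
> (The check itself is a few lines -- paths below are illustrative:)
>
>     import java.nio.file.Files;
>     import java.nio.file.Paths;
>     import java.util.Arrays;
>     import java.util.zip.ZipFile;
>
>     // Usage: java WebXmlCompare app.war src/main/webapp/WEB-INF/web.xml
>     public class WebXmlCompare {
>         public static void main(String[] args) throws Exception {
>             try (ZipFile war = new ZipFile(args[0])) {
>                 byte[] deployed = war.getInputStream(
>                         war.getEntry("WEB-INF/web.xml")).readAllBytes();
>                 byte[] inSource = Files.readAllBytes(Paths.get(args[1]));
>                 System.out.println(Arrays.equals(deployed, inSource)
>                         ? "web.xml matches source"
>                         : "web.xml DIFFERS -- investigate");
>             }
>         }
>     }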

>
> On 7/28/09 4:36 PM, "ljknews" <ljkn...@mac.com> wrote:
>
>> At 8:39 AM -1000 7/28/09, Jim Manico wrote:
>>
>>> A quick note: in the Java world (obfuscation aside), the source and
>>> "binary" are really the same thing. The fact that Fortify analyzes
>>> source and Veracode analyzes class files is a fairly minor detail.
>>
>> It seems to me that would only be true for those using a
>> Java bytecode engine, not those using a Java compiler that
>> creates machine code.

_______________________________________________
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
_______________________________________________
