We have a good thread going on fuzzing, commercial tools, etc. on the fuzzing list. This is a long forward, but I thought some of you might want to weigh in, or at least take a look at the thread.
JS

Hello all,

Although we at Codenomicon do not "fuzz" in the true meaning of the word (that depends on the definition), I would like to comment on the issues Charlie brought up.

> Date: Tue, 07 Nov 2006 08:28:26 -0600
> From: Charlie Miller <[EMAIL PROTECTED]>
>
> My take on this is that any type of data that is read in and parsed by
> an application can be fuzzed.

Yes, and I suppose most of these have been tried. Fuzzing (or any type of black-box testing) is possible for any interface, whether it is an API, a network protocol, a wireless stack, a GUI, a file, a return value, ... We at Codenomicon alone already cover more than 100 different interfaces with robustness tests.

> I also think that fuzzing can only find certain types of
> vulnerabilities, i.e. relatively simple memory corruption bugs.

This is not true. You can easily make a study of this: take any protocol and all vulnerabilities found in implementations of that protocol, and map them against the test coverage of black-box tools such as fuzzers. That would be an interesting comparison!

> Luckily, there are plenty of these [bugs] around.

True, and that is why much intelligence is often not required from fuzzing tools. Heck, you can crash most network devices just by sending /dev/random at them. ;)

> Good luck finding a command injection vulnerability or a bug that
> requires three different simultaneous anomalies.

Well, this is a really good comment, and the reason why I could not resist commenting on this thread! Why would you want to involve luck in the equation? We at Codenomicon/PROTOS have noted that careful test design turns luck and skill into engineering practice. With file fuzzers, for example, it is easy to generate millions of "tests", but with systematic testing you will still find most of these flaws, and more. Being able to optimize millions of tests down to tens of thousands without compromising test coverage is the goal, and it is also a requirement for many testers.

The combination of anomalies is a bigger issue. I know (and even during PROTOS we found these) that there are flaws that require a combination of two or three anomalies, and flaws where two different messages need to be sent in a specific order. But when the tests are optimized in number, this becomes easier as well. We cannot test all three-field combinations, but in real life we do not have to either (there is a small sketch of this kind of pairwise selection a little further down).

I would be interested to hear if anyone has an example vulnerability in mind that is not covered by Codenomicon tools. Please, nothing from proprietary protocols, as I would not be able to disclose whether we cover it or not. ;)

> I think smart researchers, like these guys, move on to fuzzing new
> types of data, be it new protocols, file types, etc.

This is why I think general-purpose fuzzing frameworks like the PROTOS mini-simulation engine (first launched in 1999 but not publicly available) and GPF (by DeMott) are so powerful. Basically, we will never run out of protocols, interface specifications, use cases, and traffic captures...

> It doesn't make a lot of sense to fuzz the HTTP protocol against IIS
> at this point, as very many people have done this with a number of
> tools.

Oh, it definitely does make sense. All products are full of flaws; you just need to build more intelligence into the tests. The fact that companies like Codenomicon never disclose any flaws does not mean that those flaws do not exist.
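To make the combination point a bit more concrete, here is a rough sketch in Python of what I mean by limiting a test set to pairwise anomaly combinations instead of exhaustively anomalizing every field triple. It has nothing to do with our actual tools; the field names and anomaly values are made-up placeholders, purely for illustration.

# Rough sketch: pairwise anomaly combinations instead of exhaustive ones.
# Field names and anomaly values are invented placeholders, not taken
# from any real protocol model or test tool.
from itertools import combinations, product

FIELDS = ["method", "uri", "version", "host", "content_length"]
ANOMALIES = ["", "A" * 65536, "%n%n%n%n", "\x00", "-1", "2147483648"]

def pairwise_tests(fields, anomalies):
    """Yield one test case per pair of fields and pair of anomaly values,
    so every two-anomaly interaction is exercised at least once."""
    for field_a, field_b in combinations(fields, 2):
        for anomaly_a, anomaly_b in product(anomalies, repeat=2):
            yield {field_a: anomaly_a, field_b: anomaly_b}

# For this toy example, exhaustive three-field anomalization would already
# be C(5,3) * 6^3 = 2160 cases, while the pairwise set is
# C(5,2) * 6^2 = 360 and still covers every two-anomaly interaction.
print(sum(1 for _ in pairwise_tests(FIELDS, ANOMALIES)))

In a real test suite each "test case" would of course be a complete message with all the other fields left valid, but the counting argument stays the same.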
> Based on the success of this project, I'm guessing they are the first
> ones to seriously try fuzzing filesystems.

As far as I know, all commercial fuzzers support testing of file systems... Software companies are just not interested in PAYING for security when they can get it for free... ;) So blame the software developers, not the tool vendors...

> After those bugs are shaken out, we'll move on to the next type of
> data.

Oh, you do not need to move forward. Just take a fuzzer from 1999, such as the WAP test tools from PROTOS or from @Stake, and you will discover that everything is still broken. That is the problem with the industry: test it once, and after a few years everything is back to where it was. But just using tools from other people is not interesting, is it? People want to find new stuff to make themselves famous. ;)

> This is reminiscent of when everyone fuzzed network protocols and then
> someone started fuzzing file types.

Again, Codenomicon had file format fuzzers before anyone was aware of that risk. And we had lots of problems developing those tools, as the development environments kept crashing all the time (I am not naming any OS products here). But again, the industry was not ready for our tools... They needed to learn it the hard way. Thanks to all who contributed! ;)

> If I knew what the next new thing to fuzz was, I'd be doing it right
> now :)

I can give some hints, but that would not fix the real problem. I would be extremely interested in hearing why all of you readers do fuzzing. Have you thought about it? What are we trying to improve? I think the real problem is how we could fix software processes so that fuzzing becomes part of them. How could we make the tools such that the industry would also adopt them into their development practices? Where should "fuzzing" be used in the software engineering process? Who is responsible for "fuzzing"? I think these are more important questions than looking for the next avenue of fame for hackers... It is too bad everyone is hunting for bugs rather than focusing on the usage scenarios of the tools.

> Date: Tue, 7 Nov 2006 08:34:43 -0600 (CST)
> From: Gadi Evron <[EMAIL PROTECTED]>

Hello Gadi,

> 2. In my opinion, fuzzing IIS or Apache may be very difficult, but
> still interesting. HTTP fuzzing still has a lot of uses with other
> tools, tools in development, etc.

You are correct. I have actually seen several Apache installations fail under tests with the Codenomicon HTTP test tool. It is not enough to test it in R&D: when you integrate Apache into a device or into a web portal, the sum of the components is more complex than expected, and every compiler option and every modification can introduce (or reveal) new bugs. New flaws are still found in Apache, and many of these could be found with fuzzing. I know most of the recent flaws are in the extensions, and some of these can be fixed in the web applications themselves (and are fixed there, leaving other deployments vulnerable) or in the configuration options, but many of them can still be considered flaws in Apache. Testing is never "done". (Below is a tiny sketch of the kind of anomalized request I mean.)
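Just to illustrate the idea (this is not taken from any product; the target address and the anomaly list are placeholders, and you should only point it at a test server you control), a minimal sketch of HTTP robustness testing looks something like this:

# Minimal sketch of HTTP robustness testing: send a few anomalized
# requests and check whether the server still answers a clean request
# afterwards. The target host/port and the anomalies are placeholders.
import socket

TARGET = ("127.0.0.1", 80)   # point this at a test server you own
BASELINE = b"GET / HTTP/1.1\r\nHost: test\r\nConnection: close\r\n\r\n"
ANOMALIES = [
    b"GET " + b"/" * 70000 + b" HTTP/1.1\r\nHost: test\r\n\r\n",    # oversized URI
    b"GET / HTTP/9.9\r\nHost: test\r\n\r\n",                        # bogus version
    b"GET / HTTP/1.1\r\nContent-Length: -1\r\nHost: test\r\n\r\n",  # negative length
    b"\x00\xff" * 512 + b"\r\n\r\n",                                 # binary garbage
]

def send(data):
    """Send one request; return the response bytes, or None on failure."""
    try:
        with socket.create_connection(TARGET, timeout=5) as s:
            s.sendall(data)
            return s.recv(1024)
    except OSError:
        return None

for i, case in enumerate(ANOMALIES):
    send(case)
    if not send(BASELINE):
        print("server stopped answering after anomaly %d" % i)
        break
else:
    print("server survived all %d anomalies" % len(ANOMALIES))

A real tool would of course generate the anomalies systematically from a protocol model and instrument the target properly, but even this kind of toy loop makes the "test it once, test it again after integration" point obvious.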
Again, we at Codenomicon are looking for ways of helping the open source community get access to our test tools. Please let me know if you have any "fuzzing-aware" research projects in this area, and I will see if there is anything I can do to help. I do not wish to discuss commercial products, so I will leave IIS alone.

Also, if anyone made it this far in my ramblings: I would welcome someone interested in and ready to do some neutral third-party comparison testing between the different fuzzing tools. I know many of you have already tried our tools... Contact me if you are interested.

Good work everyone! And best regards from everyone at Codenomicon!

/Ari

--
-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-
Ari Takanen                     Codenomicon Ltd.
[EMAIL PROTECTED]               Tutkijantie 4E
tel: +358-40 50 67678           FIN-90570 Oulu
http://www.codenomicon.com      Finland
PGP: http://www.codenomicon.com/codenomicon-key.asc
-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-