Achim,

On Fri, Feb 6, 2009 at 5:43 AM, Achim Hoffmann <a...@securenet.de> wrote:
> On Thu, 5 Feb 2009, Andres Riancho wrote:
> !! And checking if the response was different; but... all this
> !! thinking wasn't useless! What I want to do now is to create a new
> !! plugin that tries to find new parameters for a given php/asp/etc
> !! script.
>
> This is in general a good idea and should be part of the crawler.
It will be a separate plugin, not part of the crawler... well... all the
discovery plugins together create "one bad ass crawler", so you're right =)

> !! In some pentests I've performed, you find a script named
> !! "upload.php" using nikto or something, but you don't know what
> !! parameters to pass to it in order to really upload the file, so... you
> !! start trying with file, filename, filecontent, f, upload, etc.
>
> LOL, this is a mess and done by (all?) most scanners, even the high-priced
> commercial ones. It's a good example where a human always beats the program :)

w3af doesn't want to beat humans; also, if I could make w3af as good as 1/3
of the pentesters, I would get beaten by them because they wouldn't have a
job ;)

> !! What I
> !! want to do is to automate all this process, and for every URL that
> !! w3af finds, try a combination of thousands of parameters and check if
> !! the response changes; this can be performed in a fast way like this:
>
> So far, so good, but ...
>
> !! 1) Perform two GETs to the original URL,
> !! http://www.example.com/index.php?id=1 and save the two responses. We
> !! perform this step in order to make sure that the result of the URL
> !! doesn't change randomly, and if it changes we know how much it
> !! changes.
> !!
> !! 2) We should have a list of common variable names, and we should test
> !! them all... but testing them one per request would be painfully slow,
> !! so we could do something like this:
> !!
> !! GET http://www.example.com/index.php?a=<rand_value>&b=<rand_value>....&z=<rand_value>
> !! GET http://www.example.com/index.php?aa=<rand_value>&bb=<rand_value>....&zz=<rand_value>
> !! GET http://www.example.com/index.php?aaa=<rand_value>&bbb=<rand_value>....&zzz=<rand_value>
> !! GET http://www.example.com/index.php?admin=<rand_value>&login=<rand_value>....&request=<rand_value>
>
> .. this is a bad idea, IMHO.
> In general you have to go the painful way and test only one parameter with
> all payloads, then go to the next parameter and do the same.
> The prerequisite is that the initial submit to that form contains valid (in
> the context of the application) values for all form parameters.
> Just think of an URL like:
>
> http://test.tld/app.whatever?id=42&type=text&sid=dont_change

Oh, yes, I just realized that in my "bruteforcing" request I failed to
include the original "id" parameter! That would surely raise false
positives and create bugs. So the bruteforcing will be done like this:

GET http://www.example.com/index.php?id=1&a=<rand_value>&b=<rand_value>....&z=<rand_value>
...
GET http://www.example.com/index.php?id=1&admin=<rand_value>&login=<rand_value>....&request=<rand_value>
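Just to make that concrete, here's a quick sketch of how those batched URLs
could be generated while keeping the original parameters (plain Python, not
w3af code; WORDLIST, CHUNK_SIZE, MAX_QS_LEN and build_probe_urls are made-up
names, and the length limit is only a guess):

    # Sketch: build batched probe URLs that preserve the original query
    # string (e.g. id=1). Wordlist and chunk size are illustrative only.
    import random
    import string
    from urllib.parse import urlencode, urlparse, parse_qsl

    WORDLIST = ["a", "b", "c", "admin", "login", "file", "filename", "upload"]
    CHUNK_SIZE = 20        # candidate parameters tested per request
    MAX_QS_LEN = 2048      # rough query-string length limit to stay under

    def rand_value(length=8):
        """Random token, so a reflected value is easy to spot later."""
        return "".join(random.choice(string.ascii_lowercase) for _ in range(length))

    def build_probe_urls(url, wordlist=WORDLIST, chunk_size=CHUNK_SIZE):
        """Yield URLs that keep the original parameters and add a chunk
        of candidate parameters with random values."""
        parsed = urlparse(url)
        original = parse_qsl(parsed.query)
        base = parsed._replace(query="").geturl()
        for i in range(0, len(wordlist), chunk_size):
            chunk = wordlist[i:i + chunk_size]
            params = original + [(name, rand_value()) for name in chunk]
            qs = urlencode(params)
            if len(qs) <= MAX_QS_LEN:
                yield f"{base}?{qs}"

    for probe in build_probe_urls("http://www.example.com/index.php?id=1"):
        print(probe)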
> !! The performance is incremented a lot, because in one request we
> !! are testing more than 20 variables. There is a length limit in the
> !! query string that we should take care of, but it would still be 20 to
> !! 50 times faster than one variable per request. Well... once we get all
> !! the results, we compare them with the original ones, and if something
> !! changes then we know that one of the variables is used for something.
> !! Then a simple "binary search algorithm" will find which variable was
> !! the lucky one.
>
> For a "simple" fuzzer your approach to test all parameters at once is ok,
> for a reliable pentest this approach is most likely useless.

why?

> If you simply fuzz an application this way (as many parameters as possible)
> the result you get needs to be inspected carefully to identify potential
> problems, and at the same time you have to remember that you probably
> missed testing some important combinations.
> As a pentester I'd not use such a fuzzer 'cause analysing its results would
> be more difficult than doing the tests manually (that's my experience).

Testing combinations is a good idea; it's kind of a long shot to find a
combination of parameters that works, but the w3af user will be really
happy if he finds one.

Your mail made me think about my original algorithm, what do you think
about this?

1- GET the original URL
2- GET the original URL + a set of parameters with random values
3- If 1 != 2:
       for added_param in URL:
           GET the original URL + added_param
           compare last response with 1

This way I get the speed in the initial tests, and the assurance of the
one-parameter-per-request mode.

> !! This method isn't perfect, because variables could be used but
> !! they might not impact on the HTTP body result, but... it's the best we
> !! can do =)
> !!
> !! What do you guys think? Anyone interested in coding it?
>
> Not that bad, but see above.
>
> Achim

--
Andres Riancho
http://w3af.sourceforge.net/
Web Application Attack and Audit Framework
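PS: A quick, untested sketch of the three-step algorithm above, using only
the Python standard library (fetch, similarity and find_extra_params are
made-up names; the similarity ratio is just one possible way to handle
pages that change randomly, as in step 1 of the original proposal):

    # Sketch of: baseline twice -> one batched probe -> per-parameter isolation.
    import difflib
    import urllib.request

    def fetch(url):
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8", errors="replace")

    def similarity(a, b):
        return difflib.SequenceMatcher(None, a, b).ratio()

    def find_extra_params(url, probe_url, candidates):
        """candidates: (name, random_value) pairs already present in
        probe_url. Returns the parameter names that change the response."""
        # Step 1: request the original URL twice to measure how much the
        # page changes on its own (ads, timestamps, session tokens, ...).
        baseline_a, baseline_b = fetch(url), fetch(url)
        noise = similarity(baseline_a, baseline_b)  # 1.0 == perfectly stable

        # Step 2: one request with the whole batch of candidate parameters.
        batched = fetch(probe_url)
        if similarity(baseline_a, batched) >= noise:
            return []  # batch changed nothing beyond the normal noise

        # Step 3: the batch did change something, so isolate the parameter(s)
        # with one request each (the slow-but-reliable mode).
        found = []
        sep = "&" if "?" in url else "?"
        for name, value in candidates:
            response = fetch(f"{url}{sep}{name}={value}")
            if similarity(baseline_a, response) < noise:
                found.append(name)
        return found

The two baseline GETs double as the "how much does it change" check from
step 1, so a site with random content doesn't flood us with false positives.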