Taras,

On Sun, Apr 8, 2012 at 5:30 PM, Taras <ox...@oxdef.info> wrote:
> Hi, all!
>
> I'm thinking about how w3af can be improved in terms of the number of
> HTTP requests per scan. Currently a large number of not-always-necessary
> requests is made in the discovery and audit stages. We already have some
> code to minimize this number, but I also want to suggest a single place
> for discovery plugins to implement limits on request variants and checks
> for already reported requests. It could be implemented as a
> fuzzableRequestList object (a child of Python's list) shared between
> all discovery plugins; for example, it could be a member of
> baseDiscoveryPlugin, and baseDiscoveryPlugin.discover() would still
> return it, as in the current implementation. fuzzableRequestList would
> check for variants while adding new requests.

    I definitely like the idea of reducing the number of HTTP requests
performed by discovery plugins; that's for sure one of the most
annoying things that users come across when starting to use w3af.
Before we dive into your solution, let me try to explain how it should
work (at least in my initial idea... not sure if it is working like
this right now):

* Discovery plugins x and y are enabled
* URL 1 is set as target
* x is fed with URL 1 as starting point
* x is run and finds URLs 2, 3
* y is fed with URL 1 as starting point
    - Using the technique implemented in "y", it finds URLs "2"
and "4"; "y" will return both to the w3afCore
    - The w3afCore should at that point say: "I already know about 2,
thanks, but I won't add duplicates" and "I'll add URL 4 to my list"

    So, as you can see, in that case there shouldn't be any
duplicates. Let's analyze a more complex scenario:
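The de-duplication step the core performs in the scenario above can be sketched roughly as follows. This is only an illustrative toy, not the actual w3afCore API; the class and method names are invented for the example:

```python
# Toy sketch of the de-duplication the w3afCore should do when a
# discovery plugin returns its findings. Names are hypothetical.
class CoreSketch:
    def __init__(self):
        self._known_urls = set()  # URLs already in the core's list

    def add_discovery_results(self, urls):
        """Return only the URLs that were not known before."""
        new_urls = [u for u in urls if u not in self._known_urls]
        self._known_urls.update(new_urls)
        return new_urls

core = CoreSketch()
core.add_discovery_results(["url-1", "url-2", "url-3"])  # plugin "x"
added = core.add_discovery_results(["url-2", "url-4"])   # plugin "y"
# "url-2" is already known, so only "url-4" is added
```

With this in place, no matter how many plugins re-discover "2", it only enters the core's list once.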

* Discovery plugins x and y are enabled
* URL 1 is set as target
* x is fed with URL 1 as starting point
* x is run and finds URLs 2, 3. For some reason we don't care about, it
performs HTTP GET requests to 2 and 3.
* y is fed with URL 1 as starting point
    - Using the technique implemented in "y", it finds URLs "2" and "4"
    - For some reason we don't care about, it also performs HTTP GET
requests to "2" and "4".
    - The HTTP requests are performed using the xUrllib
    - xUrllib has a cache, and because a request to "2" was already
performed, it should take the response out of the cache; no network
traffic is generated
    - The request for URL "4" is sent to the network
    - Plugin "y" returns the "2" and "4" knowledge to the w3afCore
    - The w3afCore should at that point say: "I already know about 2,
thanks, but I won't add duplicates" and "I'll add URL 4 to my list"
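The caching behaviour described for xUrllib in the steps above can be sketched like this. Again, this is a hypothetical illustration of the idea (a cache hit produces no network traffic), not xUrllib's real interface:

```python
# Toy sketch of the xUrllib caching behaviour: a GET for a URL that was
# already requested is answered from the cache, so no network traffic
# is generated. All names here are illustrative.
class CachingHTTPClient:
    def __init__(self, fetch):
        self._fetch = fetch        # the real network function
        self._cache = {}           # url -> response
        self.network_requests = 0  # counter, for illustration only

    def GET(self, url):
        if url in self._cache:
            return self._cache[url]  # cache hit: no traffic
        self.network_requests += 1
        response = self._fetch(url)
        self._cache[url] = response
        return response

client = CachingHTTPClient(lambda url: "response for %s" % url)
client.GET("url-2")  # plugin "x": goes to the network
client.GET("url-2")  # plugin "y": served from the cache
client.GET("url-4")  # new URL: goes to the network
# only two requests reached the network
```

So even when two plugins fetch the same URL for their own reasons, only the first request costs network traffic.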

    If the framework IS working like this, I think that the shared
fuzzable request list wouldn't do much good. If it is not working like
this (and I would love to get an output log to show it), it seems that
we have a lot of work ahead of us.

PS: fuzzableRequestList can't be a child of Python's list; we
shouldn't keep all those items in memory.
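One way around the memory concern would be to keep only a small digest per request for the duplicate/variant check, with the full requests living on disk. A hypothetical sketch (the class name and the md5-based check are assumptions, not w3af code):

```python
# Toy sketch of a memory-light fuzzableRequestList: only one small hash
# per request is held in RAM for the duplicate check; the full request
# objects could be stored on disk. Names are hypothetical.
import hashlib

class FuzzableRequestList:
    def __init__(self):
        self._seen_hashes = set()  # one 16-byte digest per request

    def append(self, request):
        """Add the request only if it was not seen before."""
        digest = hashlib.md5(request.encode("utf-8")).digest()
        if digest in self._seen_hashes:
            return False  # duplicate, reject
        self._seen_hashes.add(digest)
        # here the full request would be written to disk, not kept in RAM
        return True

frl = FuzzableRequestList()
frl.append("GET http://target/page?id=1")  # accepted
frl.append("GET http://target/page?id=1")  # duplicate, rejected
```

A real variant check would need a smarter normalization than a plain hash (e.g. collapsing parameter values), but the memory footprint stays bounded either way.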

Regards,
> --
> Taras
> http://oxdef.info
>
> ------------------------------------------------------------------------------
> For Developers, A Lot Can Happen In A Second.
> Boundary is the first to Know...and Tell You.
> Monitor Your Applications in Ultra-Fine Resolution. Try it FREE!
> http://p.sf.net/sfu/Boundary-d2dvs2
> _______________________________________________
> W3af-develop mailing list
> W3af-develop@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/w3af-develop



-- 
Andrés Riancho
Project Leader at w3af - http://w3af.org/
Web Application Attack and Audit Framework
