Taras,

On Sun, Dec 21, 2008 at 7:43 PM, Taras P. Ivashchenko <[email protected]> wrote:
>
> Hello, list!
>
> As far as I know, there is currently no depth parameter for the discovery of new URLs in webSpider
> - it tries to find all URLs in the target.
> Sometimes this takes a lot of time (especially for big webapps).
> I think it would be useful to have a depth-limiting parameter in webSpider.
>
> For example, we have the target URL http://localhost
> On the first level webSpider finds these URLs:
> http://localhost/foo.php
> http://localhost/foo1.php
> Then (on the second level) webSpider parses http://localhost/foo.php and
> http://localhost/foo1.php and finds:
> http://localhost/foo3.php
> http://localhost/foo4.php
> Then (on the third level) webSpider parses http://localhost/foo3.php and
> http://localhost/foo4.php
> ....
>
> and so on.
> If we had a depth limiter, we could control this depth.
> What do you think about it?
Maybe you could try changing the maxDepth parameter inside the Misc settings of the w3af framework and try again? Let me know if that helps.

Cheers,

> --
> Тарас Иващенко (Taras Ivashchenko), OSCP
> www.securityaudit.ru
> ----
> "Software is like sex: it's better when it's free." - Linus Torvalds

--
Andres Riancho
http://w3af.sourceforge.net/
Web Application Attack and Audit Framework
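
For reference, the level-by-level discovery described above maps naturally onto a breadth-first crawl in which every URL carries the depth at which it was found, and URLs at the limit are simply not expanded further. Below is a minimal, self-contained Python sketch of that idea only - it is not w3af's webSpider code, and the LinkExtractor, extract_links and crawl names are purely illustrative.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(url):
    """Fetch a page and return the absolute URLs it links to."""
    try:
        html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    except Exception:
        return []
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(url, href) for href in parser.links]


def crawl(start_url, max_depth):
    """Breadth-first crawl: level 0 is the start URL, level 1 the pages it
    links to, and so on. Pages at max_depth are reported but not expanded."""
    seen = {start_url}
    queue = deque([(start_url, 0)])
    while queue:
        url, depth = queue.popleft()
        print(depth * "  " + url)
        if depth >= max_depth:
            continue  # this is the depth limiter being discussed
        for link in extract_links(url):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))


if __name__ == "__main__":
    # With max_depth=2, http://localhost and the pages it links to are
    # requested and parsed; URLs found two levels down are listed but
    # never fetched.
    crawl("http://localhost", max_depth=2)

A maxDepth-style setting, such as the one in the Misc settings mentioned above, effectively plays the role of the max_depth argument in a loop like this: in Taras's example, with max_depth=2, foo3.php and foo4.php would still be discovered, but nothing they link to would be requested.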
