Unfortunately there may be a significant issue with the functionality as well. I'm having difficulty crawling a site based on depth (i.e., only crawling up to a certain depth). This is related to https://issues.apache.org/jira/browse/DROIDS-56 and the ongoing discussion there. The gist of it is that, unless I'm missing something, DROIDS-56 removed the existing validator functionality (although the validators themselves are still in the droids codebase) without replacing it with anything else. A suggestion was made to use the filters to achieve the same functionality, but that is not possible without some work. This means that the only way to crawl a site now is to crawl all of it, with no regard to depth, which I see as a major problem. Any thoughts on this? If it's OK, I would start work on a small temporary patch for this and follow up with more work in 0.2, so that the timeline for the release is not affected too much. Thanks.
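To make the suggestion concrete, below is a rough sketch of how depth limiting could be expressed as a filter. Note this is purely illustrative: the `MaxDepthFilter` class, its `discovered`/`accept` methods, and the depth bookkeeping are all hypothetical and do not correspond to the actual Droids filter API; the real work would be wiring something like this into droids' link-extraction path so that each discovered link carries its depth.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical depth-limiting filter. This is NOT the Droids Filter API;
// it only sketches the bookkeeping a depth-aware filter would need.
public class MaxDepthFilter {
    private final int maxDepth;
    // Maps each discovered URI to the shallowest depth at which it was found.
    private final Map<String, Integer> depths = new HashMap<>();

    public MaxDepthFilter(int maxDepth, String seedUri) {
        this.maxDepth = maxDepth;
        depths.put(seedUri, 0); // the seed is at depth 0
    }

    /** Record that childUri was found on parentUri's page (child = parent depth + 1). */
    public void discovered(String parentUri, String childUri) {
        int parentDepth = depths.getOrDefault(parentUri, 0);
        // Keep the shallowest depth if the URI is reachable via several paths.
        depths.merge(childUri, parentDepth + 1, Math::min);
    }

    /** Accept a URI only if its recorded depth is within the limit. */
    public boolean accept(String uri) {
        // Unknown URIs default to depth 0 here; a real integration would
        // instead reject or queue them until their depth is known.
        return depths.getOrDefault(uri, 0) <= maxDepth;
    }
}
```

The point of the sketch is that a filter alone isn't enough: something in the crawl loop has to call `discovered(...)` as links are extracted, which is exactly the "some work" mentioned above.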
On Mon, May 23, 2011 at 12:42 PM, Thorsten Scherler <[email protected]> wrote:
> ...
> > The vote has failed receiving a binding -1 vote. The vote failed due to
> > issues with license headers and the source artifact. All testing showed
> > that the actual code performed just fine.
> >
> > Tonight (US time) I will rollback the release. Over the next couple of
> > days I will move docs out of the source tree, re-run RAT over the
> > remaining files, and generate new release artifacts from a fresh
> > checkout of the code. That should give us a nice and clean release. If I
> > missed anything, please let me know.
>
> Thank you for all your hard work!
>
> salu2
> --
> Thorsten Scherler <thorsten.at.apache.org>
> codeBusters S.L. - web based systems
> <consulting, training and solutions>
> http://www.codebusters.es/
