On 2010-04-07 19:24, Enis Söztutar wrote:

>>> Also, the goal of the crawler-commons project is to provide APIs and
>>> implementations of stuff that is needed for every open source crawler
>>> project, like: robots handling, url filtering and url normalization, URL
>>> state management, perhaps deduplication. We should coordinate our
>>> efforts, and share code freely so that other projects (bixo, heritrix,
>>> droids) may contribute to this shared pool of functionality, much like
>>> Tika does for the common need of parsing complex formats.
>>>
> 
> So, it seems that at some point, we need to bite the bullet, and
> refactor plugins, dropping backwards compatibility.

Right, that was my point - now is the time to break it, with the
cut-over to 2.0, while leaving the 1.1 branch in good shape to serve
well enough in the interim period.
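Just to make the idea of shared functionality concrete: below is a
minimal sketch of the kind of contract such a shared library could
expose, here for URL normalization. The interface and class names are
purely hypothetical illustrations, not the actual crawler-commons API.

// Hypothetical sketch only -- these names are NOT the real
// crawler-commons API, just an illustration of a shared contract
// (URL normalization) that crawlers like Nutch, bixo, heritrix or
// droids could implement against.
import java.net.URI;
import java.net.URISyntaxException;

/** A minimal URL normalizer contract that any crawler could share. */
interface UrlNormalizer {
    String normalize(String url) throws URISyntaxException;
}

/** Naive example: lower-case scheme and host, drop the fragment. */
class BasicUrlNormalizer implements UrlNormalizer {
    @Override
    public String normalize(String url) throws URISyntaxException {
        URI u = new URI(url);
        return new URI(
                u.getScheme() == null ? null : u.getScheme().toLowerCase(),
                u.getUserInfo(),
                u.getHost() == null ? null : u.getHost().toLowerCase(),
                u.getPort(),
                u.getPath(),
                u.getQuery(),
                null /* drop the fragment */
        ).toString();
    }

    public static void main(String[] args) throws Exception {
        UrlNormalizer n = new BasicUrlNormalizer();
        // Prints: http://example.com/page?q=1
        System.out.println(n.normalize("HTTP://Example.COM/page?q=1#frag"));
    }
}

Robots handling, URL filtering and deduplication could follow the same
pattern: small, plugin-free interfaces plus default implementations
that each project wires into its own pipeline.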


-- 
Best regards,
Andrzej Bialecki     <><
 ___. ___ ___ ___ _ _   __________________________________
[__ || __|__/|__||\/|  Information Retrieval, Semantic Web
___|||__||  \|  ||  |  Embedded Unix, System Integration
http://www.sigram.com  Contact: info at sigram dot com
