For what it's worth, after some 5 days of work, and a couple of schema changes 
to boot, LCF now runs with Derby.
Some caveats:

(1)     You can't run more than one LCF process at a time.  That means you can 
run either the daemon or the crawler-ui web application, but not both at the 
same time.
(2)     I haven't tested every query, so some are probably still broken.
(3)     It's slow.  Count yourself fortunate if it runs at 1/5 the rate of 
PostgreSQL for you.
(4)     Transactional integrity hasn't been evaluated.
(5)     Deadlock detection and unique constraint violation detection are 
probably not right, because I'd need to cause these errors to occur before 
being able to key off their exception messages.
(6)     I had to turn off the ability to sort on certain columns in the reports 
- basically, any column that was represented as a large character field.

Nevertheless, this represents an important milestone on the path to writing 
unit tests that have at least some meaning.

If you have an existing LCF Postgresql database, you will need to force an 
upgrade after going to the new trunk code.  To do this, repeat the 
"org.apache.lcf.agents.Install" command, and the 
"org.apache.lcf.agents.Register org.apache.lcf.crawler.system.CrawlerAgent" 
command after deploying the new code.  And please let me know of any errors 
you notice that could be related to the schema change.
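For reference, the upgrade invocations look roughly like the following.  This 
is only a sketch: the classpath placeholder is an assumption, and the exact 
jars and properties-file location depend on how you deployed LCF, so adjust 
accordingly.

```
# Re-run the schema installer against the existing database
# (<lcf-classpath> is a placeholder for your deployment's jars)
java -cp <lcf-classpath> org.apache.lcf.agents.Install

# Then re-register the crawler agent
java -cp <lcf-classpath> org.apache.lcf.agents.Register org.apache.lcf.crawler.system.CrawlerAgent
```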

Thanks,
Karl
