On Tue, 13 May 2008, Tim Kientzle wrote:
> > I think this is a really bad idea.  The problem with the tools is
> > not with the files.  It is that the files need to be parsed on each
> > run, often recursively, and your solution would not help at all.
>
> Parsing one file isn't expensive; parsing several hundred files
> to find one bit of information is expensive.
>
> > The database(s) should just be a cache of the information stored in
> > the files.
>
> Bingo!  As long as the .db version can be easily recreated
> from scratch from the master data stored in the same files
> as always, it doesn't really matter if the BDB is occasionally
> corrupted, as long as it can be rebuilt fairly quickly.

So long as you can tell it is corrupted...
It's also a drag from a user's POV when the tool crashes because the DB is
hosed (I've seen this happen in portupgrade a number of times).
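
For what it's worth, the "cache, not master" approach doesn't have to be
fancy to avoid that. Here is a rough sketch of what I mean, using the
Berkeley DB 1.85 dbopen() API from libc; the cache path, the
"cache-version" marker record and the rebuild step are all made up for
illustration, they're not from the existing tools:

#include <db.h>        /* dbopen(), the Berkeley DB 1.85 API in libc */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

/* Hypothetical cache location, not the real tools' path. */
#define CACHE_PATH "/var/db/pkg/cache.db"

/*
 * Open the cache read-only and run a cheap sanity check (a marker
 * record written when the cache was built).  Return NULL if the file
 * is missing or hosed; the caller then unlinks it and repopulates it
 * by re-parsing the per-port flat files, which stay the master copy.
 */
static DB *
open_cache(void)
{
	DB *db;
	DBT key, val;

	db = dbopen(CACHE_PATH, O_RDONLY, 0644, DB_HASH, NULL);
	if (db == NULL)
		return (NULL);

	memset(&key, 0, sizeof(key));
	key.data = "cache-version";
	key.size = strlen("cache-version");
	if (db->get(db, &key, &val, 0) == 0)
		return (db);		/* cache looks usable */

	(void)db->close(db);		/* opened, but contents are hosed */
	return (NULL);
}

int
main(void)
{
	DB *db;

	if ((db = open_cache()) == NULL) {
		fprintf(stderr, "cache missing or corrupt, rebuilding\n");
		(void)unlink(CACHE_PATH);
		/* ... re-parse the flat files and rewrite the cache ... */
		return (1);
	}
	/* ... answer the query from the cache ... */
	(void)db->close(db);
	return (0);
}

The point is just that an open failure and a failed sanity check get
treated the same way: throw the .db away and re-parse the flat files, so
a corrupt cache is a slow run rather than a crash.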

-- 
Daniel O'Connor software and network engineer
for Genesis Software - http://www.gsoft.com.au
"The nice thing about standards is that there
are so many of them to choose from."
  -- Andrew Tanenbaum
GPG Fingerprint - 5596 B766 97C0 0E94 4347 295E E593 DC20 7B3F CE8C
