That's what I thought. I think I'll take the time to add something to the DIH to 
prevent such things. Maybe a parameter that will cause the import to bail out 
if the number of documents to index is less than X% of the total number of 
documents already in the index.

There would also be a parameter to override this manually.

I think it would be a good safety precaution.
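
Purely as a sketch of what I have in mind (neither parameter exists in DIH 
today, and the handler path is just the usual example one):

  http://localhost:8983/solr/dataimport?command=full-import&sanityCheckPercent=50

That would abort the full-import whenever the rows fetched from the source 
come to less than 50% of the documents already in the index, and something 
like &force=true would be the manual override.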

Daniel Shane

----- Original Message -----
From: "Noble Paul നോബിള്‍ नोब्ळ्" <noble.p...@corp.aol.com>
To: solr-user@lucene.apache.org
Sent: Wednesday, February 17, 2010 12:36:52 AM
Subject: Re: Preventing mass index delete via DataImportHandler full-import

On Wed, Feb 17, 2010 at 8:03 AM, Chris Hostetter
<hossman_luc...@fucit.org> wrote:
>
> : I have a small worry though. When I call the full-import functions, can
> : I configure Solr (via the XML files) to make sure there are rows to
> : index before wiping everything? What worries me is if, for some unknown
> : reason, we have an empty database, then the full-import will just wipe
> : the live index and the search will be broken.
>
> I believe if you set clear=false when doing the full-import, DIH won't
It is clean=false, not clear=false.

Or use command=import instead of command=full-import.
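
For example, assuming DIH is registered at the usual /dataimport path, either 
of these avoids the initial delete-all:

  http://localhost:8983/solr/dataimport?command=full-import&clean=false
  http://localhost:8983/solr/dataimport?command=import
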
> delete the entire index before it starts. It probably makes the
> full-import slower (most of the adds wind up being deletes followed by
> adds), but it should prevent you from having an empty index if something
> goes wrong with your DB.
>
> the big catch is you now have to be responsible for managing deletes
> (using the XmlUpdateRequestHandler) yourself ... this bug looks like its
> goal is to make this easier to deal with (but it's not really clear to
> me what "deletedPkQuery" is ... it doesn't seem to be documented).
>
> https://issues.apache.org/jira/browse/SOLR-1168
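
Managing deletes yourself means posting them to the update handler, e.g. 
<delete><id>1</id></delete> or <delete><query>id:(1 OR 2)</query></delete> 
sent to /update. My understanding of deletedPkQuery (sketched here against a 
hypothetical item_deletes audit table) is that it automates this during 
delta-import: it returns the primary keys of rows removed from the DB since 
the last run, and DIH deletes the matching documents:

  <entity name="item" pk="id"
          query="select * from item"
          deltaQuery="select id from item
                      where last_modified > '${dataimporter.last_index_time}'"
          deltaImportQuery="select * from item
                            where id='${dataimporter.delta.id}'"
          deletedPkQuery="select id from item_deletes
                          where deleted_at > '${dataimporter.last_index_time}'"/>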
>
>
>
> -Hoss
>
>



-- 
-----------------------------------------------------
Noble Paul | Systems Architect| AOL | http://aol.com
