So I am familiar with Solr; I have already built fairly extensive Solr indexes.
What I am trying to do now is build very large indexes and then move them over
to a sharded server environment so that clients can query them.

These indexes will be built once and will never need updates or deletions.

Solr only exposes a REST API for indexing; it does not let you build an index
locally via an API, copy it into a data directory, and start a Solr server
over it.
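
In other words, the only supported way to get documents in seems to be pushing
them over HTTP to a running server, e.g. with SolrJ, which just wraps that HTTP
API. A minimal sketch, assuming a 3.x-era SolrJ; the URL and field names are
only placeholders:

  import org.apache.solr.client.solrj.SolrServer;
  import org.apache.solr.client.solrj.impl.HttpSolrServer;
  import org.apache.solr.common.SolrInputDocument;

  public class PushToSolr {
      public static void main(String[] args) throws Exception {
          // Every document goes over the wire to the /update handler of a live server.
          SolrServer server = new HttpSolrServer("http://localhost:8983/solr/collection1");

          SolrInputDocument doc = new SolrInputDocument();
          doc.addField("id", "doc-1");               // must match the schema's uniqueKey
          doc.addField("text", "example document");  // must match a declared schema field
          server.add(doc);
          server.commit();
      }
  }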

Lucene has a very basic indexing API, and I am surprised Solr does not support
driving it directly against a Solr schema.
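
For comparison, this is roughly all it takes to build an index with Lucene
itself. Just a sketch, assuming a 3.x-era Lucene; the path, analyzer and field
names are only examples:

  import java.io.File;
  import org.apache.lucene.analysis.standard.StandardAnalyzer;
  import org.apache.lucene.document.Document;
  import org.apache.lucene.document.Field;
  import org.apache.lucene.index.IndexWriter;
  import org.apache.lucene.index.IndexWriterConfig;
  import org.apache.lucene.store.FSDirectory;
  import org.apache.lucene.util.Version;

  public class BuildIndex {
      public static void main(String[] args) throws Exception {
          // Write the index straight to a local directory.
          FSDirectory dir = FSDirectory.open(new File("/data/myindex"));
          IndexWriterConfig cfg = new IndexWriterConfig(
                  Version.LUCENE_36, new StandardAnalyzer(Version.LUCENE_36));
          IndexWriter writer = new IndexWriter(dir, cfg);

          Document doc = new Document();
          doc.add(new Field("id", "doc-1", Field.Store.YES, Field.Index.NOT_ANALYZED));
          doc.add(new Field("text", "example document", Field.Store.YES, Field.Index.ANALYZED));
          writer.addDocument(doc);

          writer.close();  // the finished index directory can then be copied anywhere
      }
  }

The missing piece is being able to point a Solr core at a directory built this
way and have it line up with the schema.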

It looks like I will need to try something like Katta. In my opinion this is a
fundamental design decision where Solr REALLY dropped the ball for very
large-scale indexing. At least I know now...



