Hello All,

Scenario:

My data model consists of approx. 450 fields with different types of data.
We want to index every field, which will result in a single SOLR document
with *450 fields*. The total number of records in the data set is *755K*. We
will be using features such as faceting and sorting on approx. 50 fields.
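
To make this concrete, a rough excerpt of what the schema would look like is
below (field names and types here are placeholders, not the real ones):

  <!-- hypothetical excerpt from schema.xml; names/types are made up -->
  <field name="customer_account_primary_contact_email_address"
         type="string" indexed="true" stored="true" />
  <field name="order_total_amount" type="tdouble" indexed="true" stored="true" />
  <!-- ...roughly 450 such declarations, about 50 of them used for
       faceting and sorting... -->
  <field name="product_category_code" type="string" indexed="true" stored="true" />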

We are planning to use SOLR 4.1. Following is the hardware configuration of
the web server on which we plan to install SOLR:

CPU: 2 x Dual Core (4 cores) | RAM: 12GB | Storage: 212 GB

Questions:

1) What is the best approach when dealing with documents that have a large
number of fields? What are the drawbacks of having a single document with a
very large number of fields? Does SOLR support documents with as many fields
as in my case?
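
For example, we are not sure whether to declare every field explicitly or to
lean on dynamic field rules instead; a rough sketch of the two options we
are weighing (field names and suffixes are made up):

  <!-- Option A: declare each of the 450 fields explicitly -->
  <field name="customer_account_status_code" type="string"
         indexed="true" stored="true" />

  <!-- Option B: a few catch-all dynamicField rules keyed on a suffix -->
  <dynamicField name="*_s"  type="string"  indexed="true" stored="true" />
  <dynamicField name="*_td" type="tdouble" indexed="true" stored="true" />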

2) Will there be any performance issues if I define all 450 fields for
indexing? And what about faceting on approx. 50 fields, given documents with
this many fields and such a large number of records?
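
To give an idea of the load, a typical request from the application would
look roughly like the following (host, core name and field names are
placeholders):

  http://localhost:8983/solr/collection1/select?q=*:*
      &facet=true
      &facet.field=customer_region_code
      &facet.field=product_category_code
      ...(facet.field repeated for roughly 50 fields)...
      &sort=order_created_date+desc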

3) The field names in the data set are quite lengthy, around 60 characters
each. Will it be a problem to define fields with such long names in the
schema file? Is there a best practice to follow for naming conventions? Will
long field names cause problems during querying?
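
For instance, with a single filter and facet on one such field the request
URL already gets quite long (the field name below is made up, but it is
about the length we actually have):

  http://localhost:8983/solr/collection1/select?q=*:*
      &fq=customer_primary_billing_address_country_subdivision_code:CA
      &facet=true
      &facet.field=customer_primary_billing_address_country_subdivision_code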

Thanks!


