Thanks Binoy for sharing your experience, that's a good suggestion. Thank you so 
much. 

Ciao,
Vincenzo

--
mobile: 3498513251
skype: free.dev

> On 3 Jan 2016, at 14:02, Binoy Dalal <binoydala...@gmail.com> wrote:
> 
> AFAIK your text field can be as big as your resources allow.
> The flip side is that the bigger the field, the longer it will take to
> search and, more importantly, to return. If you plan on storing and returning
> your entire 4-5 MB field, it is going to take you a lot of time. I mean A
> LOT! For one of my instances, where all the text fields were around 100 KB
> in size across around a million similarly sized docs, it took a good 10 seconds
> to search and return the fields. So you should plan accordingly. If you
> absolutely have to have such large fields, then you should think about
> highlighting the results and returning just the snippets instead of
> the entire document.
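> 
> For illustration, a rough SolrJ sketch of that snippet-only approach
> (untested; assumes SolrJ 6+, and the field names "big_text" and "title"
> and the collection URL are just placeholders for your own setup):
> 
>     import org.apache.solr.client.solrj.SolrQuery;
>     import org.apache.solr.client.solrj.impl.HttpSolrClient;
>     import org.apache.solr.client.solrj.response.QueryResponse;
> 
>     public class SnippetSearch {
>         public static void main(String[] args) throws Exception {
>             // Return only the small stored fields plus highlighted snippets,
>             // never the full multi-MB text value.
>             HttpSolrClient client = new HttpSolrClient.Builder(
>                     "http://localhost:8983/solr/mycollection").build();
>             SolrQuery query = new SolrQuery("your search terms");
>             query.setFields("id", "title");      // leave the big field out of fl
>             query.setHighlight(true);
>             query.addHighlightField("big_text"); // field must be stored to highlight
>             query.setHighlightSnippets(3);       // up to 3 snippets per doc
>             query.setHighlightFragsize(200);     // ~200 chars per snippet
>             QueryResponse response = client.query(query);
>             System.out.println(response.getHighlighting()); // docId -> field -> snippets
>             client.close();
>         }
>     }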
> 
> I suggest that instead of making changes to your running production system,
> you set up a separate instance where you can play around and finalize your
> schema design. Once that is done, simply index all your content and point
> your app to the new solr instance and then take down the old one. This way
> you'll be sure to not mess things up in the prod system and will have
> minimal downtime.
> 
>> On Sun, Jan 3, 2016 at 3:27 PM Vincenzo D'Amore <v.dam...@gmail.com> wrote:
>> 
>> Thanks for the answers. I'll definitely change the schema, adding new
>> fields to improve the quality of the results.
>> 
>> In the meantime I cannot throw everything away, because this is a
>> production project. What I'm worried about is the size of this text field
>> (about 4-5 MB for each document). How big can a text field be? And what is
>> the performance slowdown if it is so big?
>> 
>> My idea is to add new fields with more or less relevance, depending on
>> the content, and then reduce the size of the big text field over the next
>> few days.
>> 
>> Ciao,
>> Vincenzo
>> 
>> --
>> mobile: 3498513251
>> skype: free.dev
>> 
>>> On 2 Jan 2016, at 15:20, Binoy Dalal <binoydala...@gmail.com> wrote:
>>> 
>>> Indexing the various columns of your database table as separate fields into
>>> Solr will allow for more flexibility in terms of filter queries, faceting
>>> and sorting your results. If you're looking to implement such features then
>>> you should definitely be indexing every table column as a separate Solr
>>> field.
>>> Additionally, this will also allow for greater flexibility for which fields
>>> you want to query and/or display to the user.
>>> 
>>>> On Sat, Jan 2, 2016 at 7:13 PM Upayavira <u...@odoko.co.uk> wrote:
>>>> 
>>>> Ask yourself what you want out of the index, how you want to query it,
>>>> then the way to structure your index will become more clear.
>>>> 
>>>> What sort of queries do you need to execute?
>>>> 
>>>> Upayavira
>>>> 
>>>>> On Sat, Jan 2, 2016, at 01:30 PM, Vincenzo D'Amore wrote:
>>>>> Hi All,
>>>>> 
>>>>> Recently I have started to work on a new project. I found a collection
>>>>> with only a few documents (~8000), but each document has a single big
>>>>> text field of about 4-5 MB.
>>>>> 
>>>>> As far as I understand, the text field is created by copying all the text
>>>>> fields in an MSSQL table. So each row is transformed into a document and
>>>>> all the row's fields are copied into one big Solr text field.
>>>>> 
>>>>> Now it seems obvious to me that the row fields should be copied into
>>>>> different Solr fields, but I was curious to know the forum's opinion
>>>>> about this situation and the best approach to re-design the collection.
>>>>> 
>>>>> Bests,
>>>>> Vincenzo
>>>>> 
>>>>> --
>>>>> Vincenzo D'Amore
>>>>> email: v.dam...@gmail.com
>>>>> skype: free.dev
>>>>> mobile: +39 349 8513251
>>>>> 
>>> --
>>> Regards,
>>> Binoy Dalal
> -- 
> Regards,
> Binoy Dalal
