On Mon, Feb 8, 2016 at 3:04 AM, sara hajili <hajili.s...@gmail.com> wrote:
> Sorry, I made a mistake: I have about 1000 K docs, i.e. about 1,000,000 docs.
>
> On Mon, Feb 8, 2016 at 1:35 AM, Emir Arnautovic <emir.arnauto...@sematext.com> wrote:
>
>> Hi Sara,
>> Not sure if I am reading this right, but I read it as: you have a 1000-doc index and you have issues? Can you tell us a bit more about your setup: number of servers, hardware, index size, number of shards, the queries you run, and whether you index at the same time...
>>
>> It seems to me that you are running Solr on a server with limited RAM and probably a small heap. Swapping will certainly slow things down, and GC is the most likely cause of the high CPU usage.
>>
>> You can use http://sematext.com/spm to collect Solr and host metrics and see where the issue is.
>>
>> Thanks,
>> Emir
>>
>> --
>> Monitoring * Alerting * Anomaly Detection * Centralized Log Management
>> Solr & Elasticsearch Support * http://sematext.com/
>>
>> On 08.02.2016 10:27, sara hajili wrote:
>>
>>> Hi all,
>>> I have a problem with my Solr performance and hardware usage (RAM, CPU, ...).
>>> I have a lot of documents indexed in Solr, about 1000 docs, where each doc has about 8 fields on average and each field about 60 characters.
>>> I set my fields to stored="false", except for one field. // I read that this helps performance.
>>> I used copy fields and dynamic fields where necessary. // I read that this helps performance.
>>> My problem is that when I run a lot of queries, Solr uses more and more CPU and RAM; once RAM is filled, it uses a lot of swap space and then the hard disk, but doesn't create a system file! Solr fills the disk until I am forced to restart the server to free disk space.
>>> Why does Solr behave this way, and how can I keep it from using so much CPU?
>>> Is any special configuration needed?
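The stored="false" tuning Sara describes would look roughly like this in schema.xml. The field names below are hypothetical, not from the thread: only the uniqueKey and one content field keep stored="true" so their values can be returned in search results, while the remaining fields are searchable (indexed) but not stored.

```xml
<!-- Hypothetical schema.xml sketch: only "id" and "content" are stored;
     the other fields are indexed for search but not retrievable -->
<field name="id"      type="string"       indexed="true" stored="true" required="true"/>
<field name="content" type="text_general" indexed="true" stored="true"/>
<field name="title"   type="text_general" indexed="true" stored="false"/>
<field name="author"  type="text_general" indexed="true" stored="false"/>

<!-- A dynamic field plus a copyField, as mentioned in the thread -->
<dynamicField name="*_txt" type="text_general" indexed="true" stored="false"/>
<copyField source="title" dest="catchall_txt"/>
```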
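For scale, the corrected numbers in the thread allow a rough back-of-envelope estimate of the raw field data in the index. This is a sketch only: it assumes roughly one byte per character and ignores Lucene's index structures, term dictionaries, and per-document overhead.

```python
# Back-of-envelope estimate of raw text volume, using the numbers
# from the thread (assumptions, not measurements).
docs = 1_000_000        # corrected doc count ("1000 K")
fields_per_doc = 8      # average fields per document
chars_per_field = 60    # average characters per field

raw_bytes = docs * fields_per_doc * chars_per_field  # ~1 byte/char for ASCII
raw_mb = raw_bytes / (1024 * 1024)
print(f"~{raw_mb:.0f} MB of raw field data")  # → ~458 MB of raw field data
```

A few hundred MB of raw text is modest, which supports Emir's reading: the bottleneck is more likely a heap that is too small for the query load (causing constant GC) and an OS that starts swapping, rather than the data volume itself.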