Hi all!
I am developing an online dictionary application using Solr, but I wonder:
how many concurrent requests can Solr process?
Thanks for your answer! But how can I test this? Do you know any tool that
can help me do that?
Noble Paul നോബിള് नोब्ळ्-2 wrote:
It is very difficult to say. It depends on the cache hit ratio. If
everything is served out of cache you may go up to around 1000 req/sec.
On Fri,
Try solr.FastLRUCache instead of solr.LRUCache; it's the new cache
implementation for Solr 1.4.
And maybe set <reopenReaders>true</reopenReaders> in the main index section, or
decrease mergeFactor.
see http://wiki.apache.org/lucene-java/ImproveSearchingSpeed
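A sketch of how those changes might look in solrconfig.xml (the cache sizes here are placeholders, not values from the thread):

```xml
<!-- solrconfig.xml: switch a cache over to the new FastLRUCache -->
<filterCache class="solr.FastLRUCache"
             size="512"
             initialSize="512"
             autowarmCount="128"/>

<!-- in the mainIndex section: reopen the existing reader on commit
     instead of opening a whole new one -->
<reopenReaders>true</reopenReaders>
```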
Tomasz Kępski wrote:
Hi,
I'm using SOLR(1.4) to
On 17 Dec 2009 at 13:48, Shalin Shekhar Mangar wrote:
2009/12/17 Steinar Asbjørnsen steinar...@gmail.com
On 17 Dec 2009 at 12:42, Shalin Shekhar Mangar wrote:
For specific cases like this, you can add the word to a file and specify it
in the schema, for example:
filter
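A sketch of what such a filter entry might look like in schema.xml, assuming Solr 1.4's Snowball stemmer (the analyzer context and language attribute are assumptions, not from the thread):

```xml
<!-- schema.xml, inside the relevant fieldType's analyzer chain:
     words listed in protwords.txt are protected from stemming -->
<filter class="solr.SnowballPorterFilterFactory"
        language="English"
        protected="protwords.txt"/>
```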
Thanks for hanging in there and helping me
When wildcard queries aren't analysed, it makes sense for this one. But I'm
still stuck at the problem mentioned above with AndererName09. I'm not
using any wildcards; the query string and the index value clearly show - a
lot of - matches, but still I
2009/12/18 Steinar Asbjørnsen steinar...@gmail.com
What I've done so far is to add both restaurant and restaurering to
protwords.txt.
I've also re-fed a single document (with the keyword restaurering) to
check that it no longer appears in a search result for restaurant.
Do I have to re-feed
Hey guys,
I'm getting some strange behavior here, and I'm wondering if I'm doing
something wrong.
I've got an unoptimized index, and I'm trying to run the following command:
http://server:8983/solr/update?optimize=true&maxSegments=10&waitFlush=false
Tried it first directly in the browser, it
My goal is to hide the configuration from the client application, so when I
distribute the index, the application does not know about this.
I added a new search handler and it works fine. Thanks to everybody.
Jacob Elder-4 wrote:
If the goal is to save time when using the admin interface, you can just
add this to
You can't test it until you have a working SOLR instance in
your specific problem space.
But assuming you have a SOLR setup, there are a plethora of
tools; just google "SOLR load testing". JMeter has been mentioned,
as well as others.
You can also write your own load tester that just spawns a
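A minimal sketch of such a home-grown load tester; the function name, the thread-pool approach, and the example Solr URL are mine, not from the thread:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_load_test(fetch, n_requests, concurrency):
    """Fire n_requests calls to fetch() from a pool of concurrent
    threads and report failures and overall throughput."""
    failures = 0
    start = time.time()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        # fetch() should return True on success, False on failure
        for ok in pool.map(lambda _: fetch(), range(n_requests)):
            if not ok:
                failures += 1
    elapsed = time.time() - start
    return {"requests": n_requests, "failures": failures,
            "elapsed": elapsed, "req_per_sec": n_requests / elapsed}

# Usage against a Solr instance (URL is an assumption -- adjust to yours):
# import urllib.request
# def fetch():
#     with urllib.request.urlopen(
#             "http://localhost:8983/solr/select?q=*:*") as resp:
#         return resp.status == 200
# print(run_load_test(fetch, 1000, 300))
```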
Hmmm. What do you get when you use ?debugQuery=true? Have you
gone in through the SOLR admin page and tried queries that way? What
do you see?
Puzzles me too
Erick
On Fri, Dec 18, 2009 at 6:52 AM, QBasti sebastian.f...@gmail.com wrote:
Thanks for hanging in there and helping me
when
Hi Aleksander,
Aleksander Stensby wrote:
So I tried with curl:
curl http://server:8983/solr/update --data-binary '<optimize/>' -H
'Content-type:text/xml; charset=utf-8'
No difference here either... Am I doing anything wrong? Do I need to issue a
commit after the optimize?
Did you restart the
You can add click counts to your index as an additional field and boost
results based on that value.
http://wiki.apache.org/solr/SolrRelevancyFAQ#How_can_I_change_the_score_of_a_document_based_on_the_.2Avalue.2A_of_a_field_.28say.2C_.22popularity.22.29
You can keep some kind of buffer for clicks
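One way to sketch that kind of boost is via dismax's bf (boost function) parameter; the clicks field name and the log(sum(clicks,1)) formula here are illustrative assumptions, not from the thread:

```python
from urllib.parse import urlencode

# Boost scores by the stored click count; sum(clicks,1) avoids
# log(0) for documents that have never been clicked.
params = {
    "q": "pizza",
    "defType": "dismax",
    "qf": "title description",
    "bf": "log(sum(clicks,1))",
}
query_string = urlencode(params)
url = "http://localhost:8983/solr/select?" + query_string
print(url)
```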
hi all,
I am desperately in need of some webpages that use Solr at the backend and
display the results.
Please, can you send some PHP code, resembling Google, that uses Solr?
I'm new to Solr.
Hello,
There are many sites that are using Solr.
YoAuto.com - an auto parts shopping comparison engine - is using Solr on the
backend. It's not a PHP site.
Paul
On Fri, Dec 18, 2009 at 9:34 AM, Naga raja jollyn...@gmail.com wrote:
hi all,
I am desperately in need of some webpages that use Solr
http://our.gop.com/app/render/go.aspx?xsl=search.xslt&searchTerm=*:*&searchFilter=SearchObjectCategory:USER_PROFILE
It returns well-formed XML or JSON; simply utilize it as a REST-like web
service. Most sites you visit that utilize it will not expose that
consumption logic in client visible code.
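A minimal sketch of consuming Solr as a REST-like service, here in Python rather than PHP (the URL and field names are assumptions -- adapt them to your instance):

```python
import json
from urllib.request import urlopen

def solr_docs(raw_json):
    """Pull the document list out of a Solr JSON response body."""
    return json.loads(raw_json)["response"]["docs"]

# Usage (wt=json asks Solr for its JSON response writer):
# with urlopen("http://localhost:8983/solr/select?q=*:*&wt=json") as resp:
#     for doc in solr_docs(resp.read()):
#         print(doc.get("id"), doc.get("title"))
```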
I used ab (Apache Bench) to fire 1000 requests, with a maximum of
300 requests running concurrently (ab -n 1000 -c 300), and then I received
the following output:
Concurrency Level: 300
Time taken for tests: 6.797 seconds
Complete requests: 1000
Failed requests: 0
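From those numbers the throughput works out to roughly 147 requests per second:

```python
# Figures taken from the ab output above
complete_requests = 1000
time_taken = 6.797  # seconds

req_per_sec = complete_requests / time_taken
print(round(req_per_sec, 1))  # → 147.1
```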
ab is not the best tool to test website performance;
try LoadRunner or Apache HttpClient instead, as they act more like a browser.
On Sat, Dec 19, 2009 at 11:20 AM, Olala hthie...@gmail.com wrote:
I used ab (Apache Bench) to fire 1000 requests, with a maximum
of
300 requests