Have you tried watching the threads in a monitoring program like VisualVM?
We have found that at a certain point Solr starts locking up in
synchronized calls, including logging.
--
Jeff Newburn
Software Engineer, Zappos.com
jnewb...@zappos.com - 702-943-7562
From: Raf Gemmail
We have a Solr plugin that would be much easier to write if commons-lang were
available. Why does Solr not include this library? Are there any drawbacks to
pulling in commons-lang for StringUtils?
--
programmatic
way we could possibly understand this scenario and return correct results.
--
the system to run out of heap.
--
From: Mark Miller markrmil...@gmail.com
Reply-To: solr-user@lucene.apache.org
Date: Tue, 06 Oct 2009 17:21:47 -0400
To: solr-user@lucene.apache.org
Subject: Re: Solr Trunk Heap
So could that potentially explain our increased RAM use during indexing? Or is
this a rare edge case?
--
From: Mark Miller markrmil...@gmail.com
Date: Tue, 06 Oct 2009 15:30:50 -0400
Ok, I have eliminated all warming queries and am still hitting the heap
space error. Any ideas at this point what could be wrong? It seems like a
huge memory increase to go from indexing without issues to failing even
with warming off.
--
to reindex.
--
From: Yonik Seeley yo...@lucidimagination.com
Date: Mon, 5 Oct 2009 13:32:32 -0400
Subject: Re: Solr Trunk Heap Space Issues
no idea where the LRUCache is getting its information or what is even in there.
--
From: Yonik Seeley yo...@lucidimagination.com
Date: Mon, 5 Oct 2009 13:32:32 -0400
Ah yes, we do have some warming queries, which would look like searches. Did
that side change enough to push up the memory limits where we would run out
like this? Also, would FastLRUCache make a difference?
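For reference, swapping in FastLRUCache is a per-cache setting in solrconfig.xml; a minimal sketch (the cache sizes here are assumptions, not the poster's config):

```xml
<!-- sketch: one cache entry in solrconfig.xml; sizes are illustrative -->
<filterCache
    class="solr.FastLRUCache"
    size="512"
    initialSize="512"
    autowarmCount="0"/>
```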
--
From
: 0.00
cumulative_inserts : 0
cumulative_evictions : 0
--
From: Yonik Seeley yo...@lucidimagination.com
Date: Fri, 2 Oct 2009 10:04:27 -0400
I reran the test to try to ensure that other cores on the instance didn't
have searches against them. This time I get NPE errors just trying to get
into the stats after the system hits its limit.
--
From: Jeff
Oct 1, 2009 8:40:12 AM org.apache.solr.common.SolrException log
SEVERE: java.lang.OutOfMemoryError: GC overhead limit exceeded
--
Added the parameter, and it didn't seem to dump when it hit the GC limit
error. Any other thoughts?
--
From: Bill Au bill.w...@gmail.com
Date: Thu, 1 Oct 2009 12:16:53 -0400
my May version did this
without any problems whatsoever.
--
From: Mark Miller markrmil...@gmail.com
Date: Thu, 01 Oct 2009 17:57:28 -0400
identical core takes only a fraction of a second. Is there something with
that core that could be bogging down the whole instance?
--
From: Jeffery Newburn jnewb...@zappos.com
Date
Ah that makes more sense. It does seem that the coord would be a good
option especially in cases like this.
--
From: Yonik Seeley yo...@lucidimagination.com
Date: Fri, 11 Sep
(productNameSearch:blue)=1)
8.033478 = idf(docFreq=120, numDocs=136731)
0.625 = fieldNorm(field=productNameSearch, doc=8142)
/str
--
We also use 1.4, which has been hit with load tests of up to
2000 queries/sec. The biggest thing is to make sure you are using the slaves
for that kind of load. Other than that, 1.4 is pretty impressive.
--
From: Otis
--
From: Erik Hatcher e...@ehatchersolutions.com
Date: Wed, 22 Jul 2009 00:36:30 -0400
Subject: Re: Random Slowness
On Jul 21, 2009, at 6:52 PM, Jeff
Ed,
How do I go about enabling GC logging for Solr?
--
From: Ed Summers e...@pobox.com
Date: Wed, 22 Jul 2009 10:39:03 -0400
Subject: Re
+Kids} hits=17 status=0 QTime=3789
--
) you
would search (Phone: *-111-). Keep in mind this will work syntactically,
but a leading wildcard essentially forces a scan of the term list, so you
will see a performance dip.
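As an aside, Solr 1.4 added a ReversedWildcardFilterFactory that can make leading-wildcard queries cheaper; a sketch, where the field type name and tokenizer choice are assumptions:

```xml
<!-- sketch: a schema.xml type that also indexes each token reversed,
     so leading wildcards like *-111- can match without a full term scan -->
<fieldType name="text_rev" class="solr.TextField">
  <analyzer type="index">
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.ReversedWildcardFilterFactory" withOriginal="true"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
  </analyzer>
</fieldType>
```

The trade-off is a larger index, since each term is stored twice (original and reversed).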
--
From: Erik Hatcher e
Does anyone have links to or books of recommended reading on search in general?
I would like to see some literature on larger search concepts and ideas.
--
is hit. We have it set to both optimize and commit.
<str name="replicateAfter">commit</str>
<str name="replicateAfter">optimize</str>
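For context, those replicateAfter entries sit inside the master section of the ReplicationHandler configuration in solrconfig.xml; a minimal sketch:

```xml
<!-- sketch: Solr 1.4 replication master config -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="master">
    <!-- take and publish an index snapshot after either event -->
    <str name="replicateAfter">commit</str>
    <str name="replicateAfter">optimize</str>
  </lst>
</requestHandler>
```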
--
From: Gurjot Singh gurjot...@gmail.com
to use.
--
From: solenweg daniel_ly...@hotmail.com
Date: Thu, 9 Jul 2009 03:51:04 -0700 (PDT)
Subject: Search results depending on search word
We are using a trunk build from approximately the same time with little to
no issues including the new replication.
--
From: Shalin Shekhar Mangar shalinman...@gmail.com
Date
="buildOnCommit">true</str>
</lst>
Second, we added the component to the dismax handler:
<arr name="last-components">
  <str>spellcheck</str>
</arr>
This seems to work for us. Hope it helps.
--
From: Yao Ge yao
We are using Solr 1.4 on trunk as of 5/7/2009. What patch did you want us
to apply?
--
From: Noble Paul നോബിള് नोब्ळ् noble.p...@corp.aol.com
Date: Mon, 25 May 2009 17:16
am not sure why the system sees the jar in the main lib directory and then
chooses to ignore it.
--
From: Mark Miller markrmil...@gmail.com
Date: Tue, 26 May
That is exactly what we were missing. As soon as we added the sharedLib
attribute in solr.xml, it started working. Thank you very much for all the
help on this one. It is greatly appreciated.
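For anyone hitting the same thing, the sharedLib attribute goes on the root element of solr.xml; a sketch, where the path and core names are assumptions:

```xml
<!-- sketch: multicore solr.xml with a shared plugin lib directory -->
<solr persistent="true" sharedLib="lib">
  <!-- jars in SOLR_HOME/lib are now visible to every core's classloader -->
  <cores adminPath="/admin/cores">
    <core name="core0" instanceDir="core0"/>
  </cores>
</solr>
```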
--
From: Mark Miller markrmil
AM org.apache.catalina.startup.Catalina start
INFO: Server startup in 2434 ms
--
From: Grant Ingersoll gsing...@apache.org
Date: Thu, 21 May 2009 16:02:16 -0400
/FacetCubeComponent.jar' to Solr classloader.
However, as soon as it tries the component, it cannot find the class.
--
From: Noble Paul നോബിള് नोब्ळ् noble.p...@corp.aol.com
Date: Thu
One additional note: we are on 1.4 trunk as of 5/7/2009. I'm just not sure
why it won't load, since it obviously works fine when placed directly in the
WEB-INF directory.
--
From: Mark Miller markrmil...@gmail.com
)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:294)
... 36 more
--
(ClassLoader.java:319)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:294)
... 27 more
--
="org.apache.solr.handler.component.FacetCubeComponent"/>
<arr name="last-components">
  <str>spellcheck</str>
  <str>facetcube</str>
</arr>
--
the
latest 1?
--
From: Noble Paul നോബിള് नोब्ळ् noble.p...@gmail.com
Date: Wed, 6 May 2009 10:05:49 +0530
Subject: Re: no subject aka
On May 6, 2009, at 3:25 PM, Jeff Newburn wrote:
We are trying to implement a SearchComponent plugin. I have been looking at
QueryElevationComponent, trying to weed through what needs to be done. My
basic desire is to get the results back and manipulate them either by
altering the actual
Excellent! Thank you, I am going to start testing that.
--
From: Noble Paul നോബിള് नोब्ळ् noble.p...@corp.aol.com
Date: Thu, 7 May 2009 20:26:02 +0530
to the client?
--
We see the exact same thing. Additionally, that URL returns 404 on a
multicore setup and gives an error when I add the core name:
<response>
  <lst name="responseHeader">
    <int name="status">0</int>
    <int name="QTime">0</int>
  </lst>
  <str name="status">no indexversion specified</str>
</response>
--
="name">elevate.xml</str>
<long name="lastmodified">123621333</long>
<long name="checksum">790732532</long>
<long name="size">1274</long>
</lst>
<lst>
  <str name="name">synonyms.txt</str>
  <long name="lastmodified">1237990595000</long>
  <long name="checksum">816919275</long>
  <long name="size">68713</long>
</lst>
</arr>
</response>
--
What are the current issues holding this back? Seems to be working with
some minor bug fixes.
--
From: Otis Gospodnetic otis_gospodne...@yahoo.com
Date: Sun, 19 Apr 2009 20:30
,expandedGender</str>
<int name="mlt.mindf">1</int>
<int name="mlt.mintf">1</int>
</lst>
<arr name="last-components">
  <str>collapse</str>
  <str>spellcheck</str>
</arr>
</requestHandler>
--
We are currently trying to do the same thing. With the patch unaltered we
can use fq as long as collapsing is turned on. If we just send a normal
document level query with an fq parameter it blows up.
Additionally, it does not appear that the collapse.facet option works at
all.
--
The fastest way I know of to get the schema is the Luke request handler:
http://localhost/solr/admin/luke
It returns XML and has tons of info you probably aren't interested in.
However, it does contain information like fields and types.
--
it will have
all widths available for all documents that match on size 7, even though most
don't come in a wide width. We are looking for strategies to filter facets
based on other facets in separate queries.
--
I apologize for the delay. The replication stalling out doesn't happen
daily. I will paste the thread dump below to try to help. This is on a
server that is currently locked on replication for a few hours. Any more
information please let me know. There are no errors in the logs either so
very
I have noticed that I can't seem to make sense of the histogram. For every
field the x-axis shows powers of 2, which make no sense for things like brand
name. Am I looking at it wrong, or is it having issues?
--
Mar 23 00:22:55 PDT 2009
Files Downloaded: 12 / 163
Downloaded: 4.12 MB / 1.41 GB [0.0%]
Downloading File: _5no.tis, Downloaded: 0 bytes / 629.57 KB [0.0%]
Time Elapsed: 26371s, Estimated Time Remaining: 9216278s, Speed: 163 bytes/s
--
I did an ant clean and then dist and it is still showing. I attached the
servlet class and java files. The svn up says: At revision 749397.
--
From: Akshay akshay.u...@gmail.com
be appreciated.
--
From: Jeff Newburn jnewb...@zappos.com
Date: Mon, 02 Mar 2009 10:20:36 -0800
Subject: Re: Trunk Replication Page Issue
I
(JspServletWrapper.java:374)
... 24 more
--
In your example there is no space between +wow and -kill, so my guess is that
Solr is interpreting it as wow-kill, all one word. Then, depending on the
field type, the tokenizer is probably splitting wow and kill into two words
along the -.
--
From: sunnyfr johanna...@gmail.com
Unfortunately, the stopword filter acts funny (depending on who you ask) in
dismax. The short version is that the stopword filter has to be on all
fields being queried for minimum match to work. We have the same issue
with one of our brands. We require all-word matching, so The North Face
I am getting the following on the stats page.
XML Parsing Error: not well-formed
Location: http://solr2.zappos.net:8080/solr/zeta-main/admin/stats.jsp
Line Number 1327, Column 48:stat name=item_attrFacet_Size__Shape
---^
Not sure what the deal
in stats.jsp, you could
turn off showing these items by setting showItems to 0 in
solrconfig.xml. You must have it set to something greater than 0,
right? Is that a recent change on your system?
Erik
On Feb 5, 2009, at 1:56 PM, Erik Hatcher wrote:
On Feb 5, 2009, at 1:44 PM, Jeff
in lots more places for peace of mind's sake, but
ultimately I think we want to deprecate stats.jsp in favor of a
request handler outputting the stats.
Erik
On Feb 5, 2009, at 2:40 PM, Jeff Newburn wrote:
Unfortunately, I don't have any reference to showItems at all in the
solrconfig.xml
I know this question was asked a while back, but is there a timeframe for
when Solr 1.4 will be put into the stable release category?
-Jeff
We are moving from single core to multicore. We have a few servers that we
want to migrate one at a time to ensure that each one functions. This
process is proving difficult as there is no default core to allow the
application to talk to the solr servers uniformly (ie without a core name
during
URL to non-multi-core
URL for all solr instances.
- upgrade app to use multi-core URL
- upgrade solr instances to multi-core when convenient and remove
rewrite filter
-Bryan
On Jan 28, 2009, at Jan 28, 7:17 AM, Jeff Newburn wrote:
We are moving from single core to multicore
Are there any error log messages?
The difference between a string and a text field is that a string is stored
basically with no modification (it is the solr.StrField). The text type is
actually defined in the fieldType section and usually contains a tokenizer
and some filters (usually stemming,
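That distinction can be sketched in schema.xml (illustrative field names, not the poster's schema):

```xml
<!-- string: the value is indexed verbatim as a single token -->
<field name="brand_exact" type="string" indexed="true" stored="true"/>
<!-- text: analyzed at index and query time (tokenized, lowercased, etc.) -->
<field name="brand_search" type="text" indexed="true" stored="true"/>
```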
on the admin page reporting the error, but is that what you want?
Jeff Newburn wrote:
Are there any error log messages?
The difference between a string and text is that string is basically
stored
with no modification (it is the solr.StrField). The text type is actually
defined
)
Jeff Newburn wrote:
The first 10-15 lines of the jargon might help. Additionally, the full
exceptions will be in the web server logs (i.e. the Tomcat or Jetty logs).
On 1/23/09 10:40 AM, Johnny X jonathanwel...@gmail.com wrote:
Ah, gotcha.
Where do I go to find the log messages
The best way to find out what was wrong with the request is going to be the
web server logs. It should throw an exception that usually complains about
fields missing or incorrect.
As to committing, Solr has an autoCommit option that will fire after a
designated number of changes have been
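A sketch of that autoCommit option in solrconfig.xml, where the thresholds are assumptions:

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxDocs>10000</maxDocs>  <!-- commit after this many pending docs -->
    <maxTime>60000</maxTime>  <!-- or after this many milliseconds -->
  </autoCommit>
</updateHandler>
```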
We are seeing something very similar. Ours is intermittent and usually
happens a great deal on random days. Often it seems to occur during large
index updates on the master.
On 1/22/09 8:58 AM, Shalin Shekhar Mangar shalinman...@gmail.com wrote:
On Thu, Jan 22, 2009 at 10:18 PM, Jaco
My apologies. No, we are using a Linux/Tomcat setup.
On 1/22/09 9:15 AM, Shalin Shekhar Mangar shalinman...@gmail.com wrote:
On Thu, Jan 22, 2009 at 10:37 PM, Jeff Newburn jnewb...@zappos.com wrote:
We are seeing something very similar. Ours is intermittent and usually
happens a great deal
नोब्ळ् noble.p...@gmail.com
wrote:
Jeff ,
Do you see both the empty index. dirs as well as the extra files
in the index?
--Noble
On Thu, Jan 22, 2009 at 10:37 PM, Jeff Newburn jnewb...@zappos.com wrote:
We are seeing something very similar. Ours is intermittent and usually
happens
, Jan 23, 2009 at 12:00 AM, Jeff Newburn jnewb...@zappos.com wrote:
We have both. A majority of them are just empty but others have almost a
full index worth of files. I have also noticed that during a lengthy index
update the system will throw errors about how it cannot move one of the
index
Can someone please make sense of why the following occurs in our system?
The first item barely matches but scores higher than the second one that
matches all over the place. The second one is a MUCH better match but has a
worse score. These are in the same query results. All I can see are the
I have set up an ngram filter and have run into a problem. Our index is
basically composed of products as the unique id. Each product also has a
brand name assigned to it. There are much fewer unique brand names than
products in the index. I tried to set up an ngram based on the brand name
but
.
On Thu, Dec 11, 2008 at 8:33 PM, Jeff Newburn jnewb...@zappos.com wrote:
Thank you for the quick response. I will keep an eye on that to see how it
progresses.
On 12/10/08 8:03 PM, Noble Paul നോബിള് नोब्ळ् noble.p...@gmail.com wrote:
This is a known issue and I was planning
at 5:30 AM, Jeff Newburn [EMAIL PROTECTED] wrote:
I am curious as to whether there is a solution to be able to replicate
solrconfig.xml with the 1.4 replication. The obvious problem is that the
master would replicate the solrconfig turning all slaves into masters with
its config. I have also
I have discovered some weirdness with our Minimum Match functionality.
Essentially it comes up with absolutely no results on certain queries.
Basically, searches with 2 words, 1 being "the", don't return a
result. From what we can gather the minimum match criteria is making it
such that if
I am curious as to whether there is a solution to be able to replicate
solrconfig.xml with the 1.4 replication. The obvious problem is that the
master would replicate the solrconfig turning all slaves into masters with
its config. I have also tried on a whim to configure the master and slave
on
Unfortunately, as it stands the interestingTerms and the debugQuery do not
explain why solr chose the matches it did for moreLikeThis. There is
currently a task in jira to try to add the information to debugQuery.
The ticket can be found here: https://issues.apache.org/jira/browse/SOLR-860
I am trying to use the API for the Solr cores. RELOAD works great, but when
I try to UNLOAD I get a massive IOException. It seems to unload the core but
doesn't remove it from the configuration file. The solr.xml file is fully
readable and writable but it still errors. Any ideas?
Solr.xml
Ok, just FYI: Solr replaces the file instead of editing it in place. This
means the web server needs directory permissions to delete and create the
solr.xml file. Once I fixed that, it no longer gave IOException errors.
On 11/20/08 8:29 AM, Jeff Newburn [EMAIL PROTECTED] wrote:
I am trying
I am trying to figure out how the synonym filter processes multi-word
inputs. I have checked the analyzer in the GUI, with some confusing results.
The indexed field has "The North Face" as a value. The synonym file has
morthface, morth face, noethface, noeth face, norhtface, norht face,
nortface,
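For reference, a multi-word synonym entry in synonyms.txt generally takes this shape (illustrative, not the poster's actual file):

```text
# map spelling variants to one canonical form at analysis time
northface, north face => the north face
```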
I am trying to get the onlyMorePopular variable to function correctly. I
have tried adding both spellchecker.onlyMorePopular as well as
sp.query.onlyMorePopular yet neither of these seem to change the spelling
suggestion response. I am not sure if I simply do not understand what it is
intended
, Jeff Newburn wrote:
Thanks Otis! I understand what the parameters do to help modify, for the
most part. What I am trying to understand is how the related items are
related. Certain search results return thousands of documents that are
related, but since I have 6 fields involved
Lucas,
Did you upgrade to version 1.4 of Solr? The replication is newly
implemented and was not available until very recently.
-Jeff
On 11/11/08 10:38 AM, banished phantom [EMAIL PROTECTED] wrote:
Hello everyone! I'm new to the Solr list. I've been using Solr 1.2 for a
while and also the
On Nov 10, 2008, at 7:00 PM, Jeff Newburn wrote:
I am still relatively new to Solr. I have gotten the
SpellCheckerRequestHandler working the way I would like. Now I am diving
into the search component version of the spell checker. I was hoping
someone could help explain 1. What
Hi All,
Anybody know how to get some useful debugging information for the
MoreLikeThis search component in Solr 1.4? I am trying to make the results
more relevant to each product. Unfortunately, I cannot seem to find a way
to get information that would be useful to find out what is matching. I
are the knobs that will let you control quality of MLT results.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
From: Jeff Newburn [EMAIL PROTECTED]
To: solr-user@lucene.apache.org solr-user@lucene.apache.org
Sent: Tuesday, November 11