We had a similar issue using acts_as_solr. We already had lighttpd
running on some servers so we just proxied all requests for /solr/CORE/
update to the master and /solr/CORE/select to a load balanced IP for
our slaves.
Doug
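A minimal lighttpd sketch of that split, assuming mod_proxy is loaded; the hostnames, ports, and URL patterns here are made up, not the actual setup:

```conf
# Hypothetical lighttpd.conf fragment: route updates to the master,
# selects to a load-balanced VIP for the slaves.
$HTTP["url"] =~ "^/solr/[^/]+/update" {
  proxy.server = ( "" => ( ( "host" => "10.0.0.10", "port" => 8983 ) ) )
}
$HTTP["url"] =~ "^/solr/[^/]+/select" {
  proxy.server = ( "" => ( ( "host" => "10.0.0.20", "port" => 8983 ) ) )
}
```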
On Jun 19, 2009, at 11:42 AM, Mark A. Matienzo wrote:
I'm tryin
http://issues.apache.org/jira/browse/SOLR-405 ?
It's quite old, and I'm not sure it's exactly what you want, but I think it might be
the JIRA ticket that Otis mentioned. Using a filter query was what we
really needed. I'm also not really sure why you need a dismax query
at all. You're not querying for
Hah. Sorry, I'm really out of it today.
The MoreLikeThisComponent doesn't seem to work for filtering using fq,
but the MoreLikeThisHandler does.
Problem solved, we'll just use the handler instead of a component.
Doug
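Switching from the component to the handler is just a solrconfig.xml registration; a rough sketch (the handler path and mlt.fl fields are assumptions, not the actual config):

```xml
<!-- Hypothetical solrconfig.xml entry: the standalone MoreLikeThisHandler
     honors fq, unlike the MLT search component. -->
<requestHandler name="/mlt" class="solr.MoreLikeThisHandler">
  <lst name="defaults">
    <str name="mlt.fl">title,body</str>
    <str name="mlt.mintf">1</str>
  </lst>
</requestHandler>
```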
On Mar 4, 2009, at 11:02 AM, Doug Steigerwald wrote:
t=true
The popularity of the doc found is 6, and trying to use 'fq=popularity:
6' brings back similarities with a popularity other than 6.
Doug
On Mar 4, 2009, at 10:39 AM, Doug Steigerwald wrote:
Hm. I checked out a clean Solr 1.3.0 and indexed the example docs
and set up a simple
he good old 'fq' not work with MLT? It should...
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
----- Original Message -----
From: Doug Steigerwald
To: solr-user@lucene.apache.org
Sent: Wednesday, March 4, 2009 9:20:40 AM
Subject: MoreLikeThis filtering
Is it possible to filter similarities found by the MLT component/
handler?
Something like mlt.fq=
ve to make mods
(I'm not seeing anything jump out at me in the Solr 1.3.0 or Lucene
2.4.0 code)?
/solr/dsteiger/mlt?q=story_id:188665+AND+site_id:86&mlt.fq=site_id:86
(We have all of our other defaults set up in the handler config.)
Thanks.
---
Doug Steigerwald
Software Developer
Have you tried just checking out (or exporting) the source from SVN
and applying the patch? Works fine for me that way.
$ svn co http://svn.apache.org/repos/asf/lucene/solr/tags/release-1.3.0 solr-1.3.0
$ cd solr-1.3.0 ; patch -p0 < ~/Downloads/collapsing-patch-to-1.3.0-ivan_2.patch
Doug
Try using the -d option with the snappuller so you can specify the
path to the directory holding the index data on the local machine.
Doug
On Dec 10, 2008, at 10:20 AM, Kashyap, Raghu wrote:
Bill,
Yes I do have scripts.conf for each core. However, all the options
needed for snappuller are specified
The first output is from the query component. You might just need to
make the collapse component first and remove the query component
completely.
We perform geographic searching with localsolr first (if we need to),
and then try to collapse those results (if collapse=true). If we
don't
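A hedged sketch of that ordering in solrconfig.xml: declare the components list explicitly, with the collapse component where QueryComponent would normally sit (the handler and component names here are assumptions):

```xml
<!-- Hypothetical wiring based on the SOLR-236 patch. -->
<searchComponent name="collapse"
                 class="org.apache.solr.handler.component.CollapseComponent"/>

<requestHandler name="/collapsed-search" class="solr.SearchHandler">
  <arr name="components">
    <str>collapse</str>   <!-- runs the query itself; replaces "query" -->
    <str>facet</str>
    <str>highlight</str>
    <str>debug</str>
  </arr>
</requestHandler>
```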
We actually have this same exact issue on 5 of our cores. We're just
going to wipe the index and reindex soon, but it isn't actually
causing any problems for us. We can update the index just fine,
there's just no merging going on.
Ours happened when I reloaded all of our cores for a schem
Right before I sent the message. Did a 'svn up src/; ant clean; ant
dist' and it failed. Seems to work fine now.
On Aug 14, 2008, at 2:38 PM, Ryan McKinley wrote:
have you updated recently?
isEnabled() was removed last night...
On Aug 14, 2008, at 2:30 PM, Doug Steigerwald wrote:
PM, Grant Ingersoll wrote:
I believe I just fixed this on SOLR-606 (thanks to Stefan's patch).
Give it a try and let us know.
-Grant
On Aug 13, 2008, at 2:25 PM, Doug Steigerwald wrote:
I've noticed a few things with the new spellcheck component that
seem a little strange.
Here
2008, at 2:52 PM, Doug Steigerwald wrote:
OK. Last question for a while (hopefully), but something else with
multicore seems to be wrong.
$ java -jar start.jar
...
INFO: [core0] Opening new SolrCore at solr/core0/, dataDir=./solr/data/
...
INFO: [core1] Opening new SolrCore at
:22 PM, Doug Steigerwald wrote:
Yeah, that's the problem. Not having the core in the URL you're
posting to shouldn't update any core, but it does.
Doug
On Aug 13, 2008, at 2:10 PM, Alok K. Dhir wrote:
you need to add the core to your call -- post to
http://localhost:8983/so
OK. Last question for a while (hopefully), but something else with
multicore seems to be wrong.
$ java -jar start.jar
...
INFO: [core0] Opening new SolrCore at solr/core0/, dataDir=./solr/data/
...
INFO: [core1] Opening new SolrCore at solr/core1/, dataDir=./solr/data/
...
I've noticed a few things with the new spellcheck component that seem
a little strange.
Here's my document:
5
wii blackberry blackjack creative labs zen ipod video nano
Some sample queries:
http://localhost:8983/solr/core1/spellCheckCompRH?q=blackberri+wi&spellcheck=true&spellcheck.
008, at 1:58 PM, Doug Steigerwald wrote:
I've got two cores (core{0|1}) both using the provided example
schema (example/solr/conf/schema.xml).
Posting to http://localhost:8983/solr/update added the example docs
to the last core loaded (core1). Shouldn't this give you a 400?
I've got two cores (core{0|1}) both using the provided example schema
(example/solr/conf/schema.xml).
Posting to http://localhost:8983/solr/update added the example docs to
the last core loaded (core1). Shouldn't this give you a 400?
Doug
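The fix discussed above is simply to address each core explicitly; a small sketch of the per-core URL scheme (host and core names are hypothetical):

```python
# Each core gets its own update URL; posting to the bare /solr/update
# path is what caused the confusion above.
def update_url(core, host="localhost:8983"):
    return "http://%s/solr/%s/update" % (host, core)

print(update_url("core0"))  # -> http://localhost:8983/solr/core0/update
print(update_url("core1"))  # -> http://localhost:8983/solr/core1/update
```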
Just checked out Solr trunk from SVN and ran 'ant dist && ant
example'. Running the example throws out errors because there is no
WordGramFilterFactory class.
We don't need it here, but is that something waiting to be committed?
Doug
--Snippet from schema--
positionIncrementGap="100"
l.java:640)
at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1286)
at java.lang.Thread.run(Thread.java:619)
P.P.S.
I'll send thread dump in separate Email
Quoting Doug Steigerwald <[EMAIL PROTECTED]>:
It happened again last night. I cro
roblem
yet,
let's see...
Strange: Tomcat simply hangs instead of exit(...)
There are some posts related to OutOfMemoryError in solr-user list.
==
http://www.linkedin.com/in/liferay
Quoting Doug Steigerwald <[EMAIL PROTECTED]>:
Since we pushed Solr out to production a
Since we pushed Solr out to production a few weeks ago, we've seen a
few issues with Solr not responding to requests (searches or admin
pages). There doesn't seem to be any reason for it from what we can
tell. We haven't seen it in QA or development.
We're running Solr with basically the
We're experiencing some high load on our Solr master server. It
currently has 30 cores and processes over 3 million updates per day.
During most of the day the load on the master is low (0.5 to 2), but
sometimes we get spikes in excess of 12 for hours at a time.
The only reason I can figu
1262
What version of Lucene is in the Solr you are running? You might want
to try either one of the latest Solr nightly builds, or at least
upgrading your Lucene version in Solr if it's not the latest patch
release.
-Yonik
On Wed, Jul 2, 2008 at 9:03 AM, Doug Steigerwald
<[EMAIL PROTE
What exactly does this error mean and how can we fix it? As far as I
can tell, all of our 30+ cores seem to be updating and autocommitting
fine. By fine I mean our autocommit hook is firing for all cores
which leads me to believe that the commit is happening, but segments
can't be merged.
e a look at the spellcheck component I have, let me know and I'll pass it
along. I may just have to stop using it and go back to a separate request for our spellchecking.
Thanks.
Doug
Doug Steigerwald wrote:
The user that runs our apps is configured to allow 65536 open files in
limits.conf. Sh
The user that runs our apps is configured to allow 65536 open files in limits.conf. Shouldn't even
come close to that number. Solr is the only app we have running on these machines as our app user.
We hit the same type of issue when we had our mergeFactor set to 40 for all of our indexes. We
We just started hitting a FileNotFoundException for no apparent reason for
both our regular
index and our spellchecker index, and only a few minutes after we restarted Solr. I did some
searching and didn't find much that helped.
We started to do some load testing, and after about 10 minut
Is there any way to get the logs to stderr/stdout to be in 24-hour time?
Thanks.
Doug
We're on r614955.
On Wednesday 19 March 2008 11:33:36 am muddassir hasan wrote:
> Hi Doug,
>
> Please let me know on which solr revision you applied patch.
>
> Thanks.
> M. Hasan
>
> Doug Steigerwald <[EMAIL PROTECTED]> wrote: The latest
> one won't a
The latest one won't apply to the trunk because it's too old. It hasn't been updated to match
changes made to Solr since mid-February. One of the things I know has to change is that in
CollapseComponent->prepare/process, the parameters need to change to just accept a ResponseBuilder.
Other th
Came in this morning to find some alerts that the admin interface has basically
died. Everything
was fine until about 4am. No updates or queries going on at that time (this is a QA machine).
Anyone know why it might die like this?
Solr 1.3 trunk build from Jan 23rd, 4GB heap size, 4x3.2GHz X
',
'q'=>'*:*',
'facet'=>'true',
'highlight'=>'true'}},
'response'=>{'numFound'=>0,'start'=>0,'docs'=>[]
},
'facet_counts'=>{
'
'Comedy'=>11,
'Suspense/Thriller'=>11,
'SciFi/Fantasy'=>5,
'Animation'=>4,
'Documentary'=>4,
'Family'=>3,
'Horror'=>3,
'Musical'=>
hecked in your fix.
This was a recent bug... writing of SolrDocument was recently added
and is not touched by normal code paths, except for distributed
search.
-Yonik
On Wed, Mar 5, 2008 at 9:29 AM, Doug Steigerwald
<[EMAIL PROTECTED]> wrote:
We're using localsolr and the RubyRespon
Sweet. Thanks.
Doug
Yonik Seeley wrote:
Thanks Doug, I just checked in your fix.
This was a recent bug... writing of SolrDocument was recently added
and is not touched by normal code paths, except for distributed
search.
-Yonik
On Wed, Mar 5, 2008 at 9:29 AM, Doug Steigerwald
<[EM
patch:
'class'=>['showtime'],
'genre'=>['Drama',
'Suspsense/Triller'],
Has anyone come across an issue like this? Is this fixed in a newer build of Solr? It looks like
we'd still need this patch even in a build of the solr trunk from yesterday, but maybe not.
--
Doug Steigerwald
Software Developer
McClatchy Interactive
[EMAIL PROTECTED]
919.861.1287
YAML
libraries out there either.
We're not actually using it, since it was just a small proof of concept type of project, but is this
anything people might be interested in?
--
Doug Steigerwald
Posted our patches if anyone wants to take a look:
https://issues.apache.org/jira/browse/SOLR-433
Small change to core.RunExecutableListener and all the changes to the shell
scripts.
All these scripts seem to run fine on RHEL-3 and RHEL-5.1 servers.
doug
Doug Steigerwald wrote:
Sure. I
could use too, and so may have some cycles to work on it. I hate to
replicate the work if you already have something that is more or less
working. A half baked patch is better than no patch.
-Grant
On Feb 15, 2008, at 12:45 PM, Doug Steigerwald wrote:
That unfortunately got pushed aside
Sure. I'll try to post it today or tomorrow.
Doug Steigerwald
Software Developer
McClatchy Interactive
[EMAIL PROTECTED]
919.861.1287
Otis Gospodnetic wrote:
Hey Doug,
You have multicore/spellcheck replication going already? We have been working
on the replication for multicore. S
ch we have working quite well in QA
right now).
Doug Steigerwald
Software Developer
McClatchy Interactive
[EMAIL PROTECTED]
919.861.1287
oleg_gnatovskiy wrote:
dsteiger wrote:
I've got a couple search components for automatic spell correction that
I've been working on.
I've c
We don't always want to use the dismax handler in our setup.
Doug
Yonik Seeley wrote:
On Jan 21, 2008 9:06 PM, Doug Steigerwald
<[EMAIL PROTECTED]> wrote:
We've found a way to work around it. In our search components, we're doing
something like:
defT
er wrote:
On Jan 21, 2008 10:23 AM, Doug Steigerwald
<[EMAIL PROTECTED]> wrote:
Is there any support for DisMax (or any search request handlers) in search
components, or is that
something that still needs to be done? It seems like it isn't supported at the
moment.
I was curious
Is there any support for DisMax (or any search request handlers) in search components, or is that
something that still needs to be done? It seems like it isn't supported at the moment.
We want to be able to use a field collapsing component
(https://issues.apache.org/jira/browse/SOLR-236), but
I've got a couple search components for automatic spell correction that I've
been working on.
I've converted most of the SpellCheckerRequestHandler to a search component (hopefully will throw a
patch out soon for this). Then another search component that will do auto correction for a query if
at org.apache.lucene.index.IndexWriter.addDocument(IndexWriter.java:1004)
at org.apache.lucene.index.IndexWriter.addDocument(IndexWriter.java:983)
Any ideas?
doug
Doug Steigerwald wrote:
It's in the index. Can see it with a query: q=word:blackjack
And in luke: −
29
The ac
t the spellchecker 100% (not looking at its source now). I'd peek
at the index with Luke (Luke I trust :)) and see if that term is really there
first.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
----- Original Message -----
From: Doug Steigerwald <[EMAIL PROTECT
Having another weird spell checker index issue. Starting off from a clean index and spell check
index, I'll index everything in example/exampledocs. On the first rebuild of the spellchecker index
using the query below says the word 'blackjack' exists in the spellchecker index. Great, no proble
Lately I've been having issues with the spellchecker failing to properly rebuild my spell index. I
used to be able to delete the spell directory and reload the core and build the index fine if it
ever crapped out, but now I can't even build it.
java.io.FileNotFoundException: /home/dsteiger/sol
nto a QueryComponent to work.
I don't think anyone has tackled that yet...
ryan
Doug Steigerwald wrote:
Modifying the patch to apply. StandardRequestHandler and
DisMaxRequestHandler were changed a lot in mid-November and I've been
having a hard time figuring out where the changes should be re
it to work once it is
applied?
-Grant
On Jan 3, 2008, at 8:52 AM, Doug Steigerwald wrote:
Being able to collapse multiple documents into one result with Solr is
a big deal for us here. Has anyone been able to get field collapsing
(http://issues.apache.org/jira/browse/SOLR-236) to patch to a r
Being able to collapse multiple documents into one result with Solr is a big deal for us here. Has
anyone been able to get field collapsing (http://issues.apache.org/jira/browse/SOLR-236) to patch to
a recent checkout of Solr? I've been unsuccessful so far in trying to modify the latest patch t
Is it going to be possible (soon) to register new Solr cores on the fly? I know the LOAD action is
yet to be implemented, but will that let you create new cores that are not listed in the
multicore.xml? We're occasionally going to have to create new cores and would like to not have to
stop/start
Has anyone done any work on this?
https://issues.apache.org/jira/browse/SOLR-433
Thanks.
Doug
Ryan McKinley wrote:
OG: Yes, I think that makes sense - distribute everything for a given
core, not just its index. And the spellchecker could then also have
its data dir (and only index/ undern
Thanks. We're probably not going to be sending huge batches of documents very often, so I'll just
try a persistent connection and hopefully performance won't be an issue. With our document size, I
was posting around 300+ docs/s, so anything reasonably close to that will be good. Historically
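A rough sketch of the persistent-connection approach, assuming a hypothetical core name and a localhost Solr; the point is just that one HTTP connection is reused across all batches instead of reconnecting per post:

```python
import http.client

def post_batches(docs, host="localhost", port=8983, core="core0", batch=100):
    """Post docs in <add> batches over a single keep-alive connection."""
    conn = http.client.HTTPConnection(host, port)
    for i in range(0, len(docs), batch):
        body = "<add>%s</add>" % "".join(docs[i:i + batch])
        conn.request("POST", "/solr/%s/update" % core, body=body,
                     headers={"Content-Type": "text/xml; charset=utf-8"})
        conn.getresponse().read()  # drain so the connection can be reused
    conn.close()
```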
We often have data that isn't generated by us going into our search. Sometimes there's a field that
shouldn't be multiValued, but the data comes in with multiple fields of the same name in a single
document.
Is there any way to continue processing other documents in a file even if one document
Not sure if this got through earlier, pine messed up...
Has anyone implemented any sort of geographic searching for Solr? I've
found Local Lucene
(http://www.nsshutdown.com/projects/lucene/whitepaper/locallucene.htm) by
Patrick O'Leary and there is another project in his CVS called Local Solr
(h