As some people have mentioned here on this mailing list, the Solr 1.3
distribution scripts (snappuller / snapshooter etc.) do not work on Windows.
Some have indicated that it might be possible to use Cygwin, but I have
doubts. So unfortunately, Windows users suffer with regard to replication.
Hello,
for this you can simply use the nifty date functions supplied by Solr
(given that you have indexed your fields with the Solr date field type).
If I understand you correctly, you can achieve what you want with the
following union query:
displayStartDate:[* TO NOW] AND displayEndDate:[NOW TO *]
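A day-rounded variant of this range query may also be worth considering (an illustration, not from the original message; NOW/DAY is standard Solr date math, and rounding keeps the filter cache effective because the value doesn't change every millisecond):

```
fq=displayStartDate:[* TO NOW/DAY+1DAY] AND displayEndDate:[NOW/DAY TO *]
```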
Hi,
I'm trying to highlight based on a (multivalued) field (prefix2) that has
(among other things) an EdgeNGramFilterFactory defined.
Highlighting doesn't increment the start position of the highlighted
portion, so in other words the highlighted portion is always the beginning
of the field.
- It should be possible to specify dataDir directly for a core in solr.xml
(over and above specifying it as a variable). It should also be possible to
pass the dataDir as a request parameter while creating a core through the
REST API.
- A simple scenario which requires this feature is when the
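A sketch of what the proposal might look like; the attribute and request parameter names are illustrative, not a committed API:

```
<!-- solr.xml: dataDir given directly on the core -->
<core name="core0" instanceDir="core0" dataDir="/var/data/solr/core0"/>

<!-- or passed while creating a core via CoreAdmin:
     http://localhost:8983/solr/admin/cores?action=CREATE&name=core1&instanceDir=core1&dataDir=/var/data/solr/core1 -->
```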
Yes! You can search for phrases with wildcards.
There is no direct support for it, but you can achieve it like the
following...
User input: Solr we
Query should be: (name:Solr AND (name:we* OR name:we)) OR name:"Solr we"
The query builder parses the original input and builds one that
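A minimal sketch of such a query builder (a hypothetical helper, not Solr API): split the input on whitespace, require every token, let the last token match as a prefix or an exact term, and OR in the whole phrase. Escaping of special query characters is left out for brevity.

```java
public class WildcardPhraseQueryBuilder {
    // Builds e.g. (name:Solr AND (name:we* OR name:we)) OR name:"Solr we"
    public static String build(String field, String input) {
        String[] tokens = input.trim().split("\\s+");
        StringBuilder sb = new StringBuilder("(");
        for (int i = 0; i < tokens.length; i++) {
            if (i > 0) sb.append(" AND ");
            if (i == tokens.length - 1) {
                // Last token: prefix match OR exact term.
                sb.append('(').append(field).append(':').append(tokens[i]).append("* OR ")
                  .append(field).append(':').append(tokens[i]).append(')');
            } else {
                sb.append(field).append(':').append(tokens[i]);
            }
        }
        sb.append(") OR ").append(field).append(":\"").append(input.trim()).append('"');
        return sb.toString();
    }
}
```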
Anything sent with a delete query will be deleted. It doesn't give you the
details of the deleted records.
For example, if you send a command like
<delete><id>20070424150841</id></delete> it will delete the record with id
20070424150841, but it will not give you the record details or tell you if it was already deleted.
We need to
Hi,
I am new to Apache Solr.
I need to create a custom request handler class. So I created a new one
and changed the solrconfig.xml file as:
<requestHandler name="/select" class="solr.my.MyCustomHandler">
  <lst name="defaults">
    <str name="echoParams">explicit</str>
    <str
Thanks for the info. Just FYI, I've decided to retrofit the 1.3
DataImportHandler with the JDBC driver params functionality to get us
around the OOM error problem with as few changes as possible.
kevin
On 11 Jun 2009, at 14:42, Shalin Shekhar Mangar wrote:
On Thu, Jun 11, 2009 at 6:42
you can just drop in the new JdbcDataSource.java into the 1.3 release
(and build it) and it should be just fine.
On Fri, Jun 12, 2009 at 5:55 PM, Kevin Lloyd kll...@lulu.com wrote:
Thanks for the info. Just FYI, I've decided to retrofit the 1.3
DataImportHandler with the JDBC driver params
Is there any error on the console?
On Fri, Jun 12, 2009 at 4:26 PM, Noor noo...@opentechindia.com wrote:
Hi,
I am new to Apache Solr.
I need to create a custom request handler class. So I created a new one and
changed the solrconfig.xml file as:
<requestHandler name="/select"
Yes,
a NullPointerException, on the line:
SolrCore coreToRequest = coreContainer.getCore(core2);
Noble Paul wrote:
Is there any error on the console?
On Fri, Jun 12, 2009 at 4:26 PM, Noor noo...@opentechindia.com wrote:
Hi,
I am new to Apache Solr.
I need to create a custom
Hi,
Is it possible to identify the docId of the document where matching occurred
for a specific Term or QueryTerm?
For example: I have a document with some fields, and my query possesses a
Query for each field. I need to know the docIds where QueryTermX finds a
value. I know that I can verify if
Hi,
Sorry for being late to the party, let me try to clear some doubts about
Carrot2.
Do you know under what circumstances or application should we cluster the
whole corpus of documents vs just the search results?
I think it depends on what you're trying to achieve. If you'd like to give
the
Michael Ludwig wrote:
Martin Davidsson wrote:
I've tried to read up on how to decide, when writing a query, what
criteria goes in the q parameter and what goes in the fq parameter,
to achieve optimal performance. Is there [...] some kind of rule of
thumb to help me decide how to split
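For illustration (the field names are invented), the usual rule of thumb puts the relevance-scoring user input in q and the repeated, cacheable restrictions in fq:

```
q=ipod nano&fq=category:electronics&fq=inStock:true
```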
Thank you very much. I will try using the Solr nightly build.
Thanks
Mahesh R
Aleksander M. Stensby wrote:
As some people have mentioned here on this mailing list, the Solr 1.3
distribution scripts (snappuller / snapshooter etc.) do not work on Windows.
Some have indicated that it might be
I solved this NullPointerException by making the following changes.
In Java code:
public void handleRequestBody(SolrQueryRequest request,
    SolrQueryResponse response) throws Exception {
  SolrCore coreToRequest = request.getCore(); // was: coreContainer.getCore(core2);
  ...
}
and in solrconfig.xml:
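The message breaks off here; judging from the handler config quoted elsewhere in the thread, the matching solrconfig.xml entry was presumably along these lines:

```xml
<requestHandler name="/select" class="solr.my.MyCustomHandler">
  <lst name="defaults">
    <str name="echoParams">explicit</str>
  </lst>
</requestHandler>
```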
On Thu, Jun 11, 2009 at 10:46 AM, Jacob Elder jel...@locamoda.com wrote:
Is there any way to get the number of deleted records from a delete request?
Nope. I avoided adding it initially because I thought it might get
difficult to calculate that data in the future.
That's now come true - Lucene
On Fri, Jun 12, 2009 at 7:09 PM, Michael Ludwig m...@as-guides.com wrote:
I've summarized what I've learnt about filter queries on this page:
http://wiki.apache.org/solr/FilterQueryGuidance
Wow! This is great! Thanks for taking the time to write this up Michael.
I've added a section on
On Fri, Jun 12, 2009 at 8:07 PM, noor noo...@opentechindia.com wrote:
<requestHandler name="/select" class="solr.my.MyCustomHandler">
  <lst name="defaults">
    <str name="echoParams">explicit</str>
    <str name="q">tandem</str>
    <str name="debugQuery">true</str>
  </lst>
</requestHandler>
Now, my webapp runs fine by,
Hello,
I need to retrieve the stats of my index (using StatsComponent). It's not a
problem when my query is empty, but the stats are updated according to the
current search... and I need the stats of the whole index every time.
I'm currently doing two requests (one with an empty keyword to get the stats,
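A sketch of that two-request approach (stats=true and stats.field are the standard StatsComponent parameters; the field name is made up):

```
/select?q=*:*&rows=0&stats=true&stats.field=price   -- whole-index stats, no docs returned
/select?q=user+query&rows=10                        -- the actual user search
```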
Britske,
I'd have to dig, but there are a couple of JIRA issues in Lucene's JIRA (the
actual ngram code is part of Lucene) that have to do with ngram positions. I
have a feeling that may be the problem.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original
I have a solr index which is going to grow 3x in the near future. I'm
considering using distributed search and was contemplating what would
be the best approach to splitting the index. Since most of the
searches performed on the index are sorted by date descending, I'm
considering splitting the
Hi all!
I'm using Solr 1.3 and currently testing reindexing...
In my client app, I am sending 17494 requests to add documents... In 3
different scenarios:
a) not using threads
b) using 1 thread
c) using 2 threads
In scenario a), everything seems to work fine... In my client log, I
see
If I want to run the stable 1.3 release and the nightly build under the same
Tomcat instance, should that be configured as multiple solr applications, or
is there a different configuration to follow?
On Fri, Jun 12, 2009 at 10:28 PM, Garafola Timothy timgaraf...@gmail.com wrote:
So let's say I have 2 shards. The first shard has docs with creation
dates of this year. The Second shard contains documents from the
previous year. I run a solr query requesting 10 rows sorted by date
and get
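A sketch of what such a distributed request might look like (host names invented; shards is the standard distributed-search parameter, and each shard is queried and the merged result sorted by the frontend):

```
http://frontend:8983/solr/select?q=*:*&sort=date+desc&rows=10&shards=shard2009:8983/solr,shard2008:8983/solr
```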
On Fri, Jun 12, 2009 at 11:40 PM, Alexander Wallace a...@rwmotloc.com wrote:
Hi all!
I'm using Solr 1.3 and currently testing reindexing...
In my client app, I am sending 17494 requests to add documents... In 3
different scenarios:
a) not using threads
b) using 1 thread
c) using 2
I'm trying out the replication features on 1.4 (trunk) with multiple
indices using a setup based on the example multicore config.
The first time I tried it, (replicating through the admin web
interface), it worked fine. I was a little surprised that telling one
core to replicate caused both to
Phil Hagelberg p...@hagelb.org writes:
My only guess as to what's going wrong here is that deleting the
coreN/data directory is not a good way to reset a core back to its
initial condition. Maybe there's a bit of state somewhere that's making
the slave think that it's already up-to-date with
Um, yes this works.
On Fri, Jun 12, 2009 at 11:12 AM, Jeff Rodenburg jeff.rodenb...@gmail.com wrote:
If I want to run the stable 1.3 release and the nightly build under the
same Tomcat instance, should that be configured as multiple solr
applications, or is there a different configuration to
-Original Message-
From: Fergus McMenemie [mailto:fer...@twig.me.uk]
Sent: Friday, June 12, 2009 3:41 PM
To: solr-user@lucene.apache.org
Subject: Re: fq vs. q
On Fri, Jun 12, 2009 at 7:09 PM, Michael Ludwig m...@as-guides.com
wrote:
I've summarized what I've learnt about
Right after I sent the email, I went on and checked for uniqueness of
documents...
In theory they were all supposed to be unique... But I've realized that
the platform I'm using to reindex is delaying sending the requests;
this in combination with my reindexers reusing document fields (instead
Hi,
Has anyone successfully used localsolr and collapse together in Solr
1.4? I am getting two result sets, one from localsolr and the other from
collapse. I need a merged result set.
Any pointers?
I am installing Solr 1.3.0, and currently have been trying to use Tomcat 5.5.
This hasn't been working so far for me, and I have been told (unofficially)
that my installation would go more smoothly if I were to use Tomcat 6. Does
anyone have experience with Solr 1.3 and Tomcat 5.5?
On Sat, Jun 13, 2009 at 1:25 AM, Phil Hagelberg p...@hagelb.org wrote:
OK, so I inserted some more documents into the master, and now
replication works. I get the feeling it may be due to this line in the
master's solrconfig.xml:
<str name="replicateAfter">commit</str>
Now this is confusing
Shalin Shekhar Mangar shalinman...@gmail.com writes:
You are right. In Solr/Lucene, a commit exposes updates to searchers. So you
need to call commit on the master for the slave to pick up the changes.
Replicating changes from the master and then not exposing new documents to
searchers does
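For reference, the master-side replication config being discussed looks roughly like this in a 1.4 solrconfig.xml (the confFiles value is illustrative):

```xml
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="master">
    <str name="replicateAfter">commit</str>
    <str name="confFiles">schema.xml,stopwords.txt</str>
  </lst>
</requestHandler>
```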
On Sat, Jun 13, 2009 at 2:25 AM, Mukerjee, Neiloy (Neil)
neil.muker...@alcatel-lucent.com wrote:
I am installing Solr 1.3.0, and currently have been trying to use Tomcat
5.5. This hasn't been working so far for me, and I have been told
(unofficially) that my installation would go more
Thanks, I'll check it out.
Otis Gospodnetic wrote:
Britske,
I'd have to dig, but there are a couple of JIRA issues in Lucene's JIRA
(the actual ngram code is part of Lucene) that have to do with ngram
positions. I have a feeling that may be the problem.
Otis
--
Sematext --
On Sat, Jun 13, 2009 at 1:36 AM, Ensdorf Ken ensd...@zoominfo.com wrote:
I ran into this very issue recently as we are using a freshness filter
for our data that can be 6/12/18 months etc. I discovered that even though
we were only indexing with day-level granularity, we were specifying the
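The message is cut off, but this is likely the classic date-math pitfall: filtering with raw NOW gives millisecond precision and defeats the filter cache, while rounding to the indexed granularity lets the cached filter be reused. A hypothetical 6-month freshness filter:

```
fq=lastUpdated:[NOW/DAY-6MONTHS TO NOW/DAY+1DAY]
```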
On Fri, Jun 12, 2009 at 4:55 PM, Mukerjee, Neiloy
(Neil) neil.muker...@alcatel-lucent.com wrote:
I am installing Solr 1.3.0, and currently have been trying to use Tomcat 5.5.
This hasn't been working so far for me, and I have been told (unofficially)
that my installation would go more smoothly
That was exactly my issue... I changed my code to not reuse
document/fields and it is all good now!
Thanks for your support!
Shalin Shekhar Mangar wrote:
On Fri, Jun 12, 2009 at 11:40 PM, Alexander Wallace a...@rwmotloc.com wrote:
Hi all!
I'm using Solr 1.3 and currently testing
Hello,
I am storing items in an index. Each item has a comma-separated list
of related items. Is it possible to bring back an item and all of its
related items in one query? If so, how, and how would you distinguish
between which one is the main item and which are the related ones?
Any help is
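One common approach (a sketch, not from the thread): index the related ids in a multivalued field and OR two clauses; client-side, the hit whose id equals the requested id is the main item, the rest are related. Field and value names are invented:

```
schema:  <field name="related" type="string" indexed="true" stored="true" multiValued="true"/>
query:   q=id:item42 OR related:item42
```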
On Sat, Jun 13, 2009 at 2:44 AM, Phil Hagelberg p...@hagelb.org wrote:
Shalin Shekhar Mangar shalinman...@gmail.com writes:
You are right. In Solr/Lucene, a commit exposes updates to searchers. So you
need to call commit on the master for the slave to pick up the changes.
Replicating changes
Shalin Shekhar Mangar wrote:
On Fri, Jun 12, 2009 at 8:07 PM, noor noo...@opentechindia.com wrote:
<requestHandler name="/select" class="solr.my.MyCustomHandler">
  <lst name="defaults">
    <str name="echoParams">explicit</str>
    <str name="q">tandem</str>
    <str name="debugQuery">true</str>
  </lst>
</requestHandler>
Now, my