Hello, all:
My configuration works nicely with Solr 4.4. I am encountering a configuration
error when I try to upgrade from 4.4 to 4.6. All I did was the following:
a) Replace the 4.4 solr.war file with the 4.6 solr.war in the tomcat/lib
folder. I am using version 6.0.36 of tomcat.
b) I
--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com
On 9 May 2013 at 01:54, William Pierce evalsi...@hotmail.com wrote:
The reason I placed the solr.war in tomcat/lib was -- I guess -- because
that's the way I had always done it since the 1.3 days. Our tomcat instance(s)
run nothing
Hi,
I have gotten solr 4.3 up and running on tomcat7/windows7. I have added the
two dataimport handler jars (found in the dist folder of my solr 4.3 download)
to the tomcat/lib folder (where I also placed the solr.war).
Then I added the following line to my solrconfig.xml:
requestHandler
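The line above is truncated; presumably it is the DataImportHandler registration. A typical sketch of that registration looks like the following (the filename inside config is an assumption -- use whatever your data-config file is actually named):

```xml
<requestHandler name="/dataimport"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <!-- path is resolved relative to the core's conf directory -->
    <str name="config">db-data-config.xml</str>
  </lst>
</requestHandler>
```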
. Have you tried putting those jars
somewhere else and using lib directive in solrconfig.xml instead to
point to them?
Regards,
Alex.
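A minimal sketch of the approach Alex suggests, assuming the DIH jars have been copied somewhere outside tomcat/lib (the dir path and regex here are placeholders to adapt to your layout):

```xml
<!-- in solrconfig.xml: load the DataImportHandler jars without touching tomcat/lib -->
<lib dir="../../dist/" regex="solr-dataimporthandler-.*\.jar" />
```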
On Wed, May 8, 2013 at 2:07 PM, William Pierce evalsi...@hotmail.com
wrote:
I have gotten solr 4.3 up and running on tomcat7/windows7. I have added
the two
-home/lib or instanceDir/lib?
--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com
On 8 May 2013 at 21:15, William Pierce evalsi...@hotmail.com wrote:
Thanks, Alex. I have tried placing the jars in a folder under
solrhome/lib or under the instanceDir/lib with appropriate
Two suggestions: a) I noticed that your DIH spec in the solrconfig.xml seems
to refer to db-data-config.xml, but you said that your file was
db-config.xml. You may want to check this to make sure that your file
names are correct. b) What does your log say when you ran the import
process?
Our setup on ec2 is as follows:
a) mysql master on ebs volume.
b) solr master on its own ebs volume
c) solr slaves do not use ebs -- but rather use the ephemeral instance
stores. There is a small period of time where the solr slave has to re-sync
the data from the solr master.
Cheers,
Bill
I have used solr extensively for our sites (and for the clients I work
with). I think it is great! If you do an item-by-item feature list
comparison, I think you will find that solr stacks up quite well. And the
price, of course, cannot be beat!
However, there are a few intangibles that
, Dec 8, 2009 at 3:42 AM, William Pierce evalsi...@hotmail.com
wrote:
Just to make doubly sure, per tck's suggestion, I went in and
explicitly
added in the port in the masterurl so that it now reads:
http://localhost:8080/postingsmaster/replication
Still getting the same exception...
I am
Folks:
I am seeing this exception in my logs that is causing my replication to fail.
I start with a clean slate (empty data directory). I index the data on the
postingsmaster using the dataimport handler and it succeeds. When the
replication slave attempts to replicate it encounters this
PM
To: solr-user@lucene.apache.org
Subject: Re: Exception encountered during replication on slave...Any clues?
are you missing the port number in the master's url ?
-tck
On Mon, Dec 7, 2009 at 4:44 PM, William Pierce
evalsi...@hotmail.comwrote:
Folks:
I am seeing this exception in my logs
.
Thanks,
- Bill
--
From: William Pierce evalsi...@hotmail.com
Sent: Monday, December 07, 2009 2:03 PM
To: solr-user@lucene.apache.org
Subject: Re: Exception encountered during replication on slave...Any clues?
tck,
thanks for your quick response. I am
Have you gone through the solr tomcat wiki?
http://wiki.apache.org/solr/SolrTomcat
I found this very helpful when I did our solr installation on tomcat.
- Bill
--
From: Jill Han jill@alverno.edu
Sent: Friday, December 04, 2009 8:54 AM
To:
Folks:
In my db I currently have fields that represent bitmasks. Thus, for example,
a value of the mask of 48 might represent an undergraduate (value = 16) and
graduate (value = 32). Currently, the corresponding field in solr is a
multi-valued string field called EdLevel which will have
?
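One way to bridge the bitmask column and the multi-valued EdLevel field is to expand the mask while building each Solr document. A hypothetical sketch in Python (the level names and the mapping are illustrative assumptions, not from the original post):

```python
# Hypothetical mapping from bit values to EdLevel names (assumed, not from the post).
ED_LEVELS = {16: "Undergraduate", 32: "Graduate"}

def expand_bitmask(mask):
    """Return the list of EdLevel values encoded in the mask, lowest bit first."""
    return [name for bit, name in sorted(ED_LEVELS.items()) if mask & bit]

# e.g. a db value of 48 expands to both levels for the multi-valued field
```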
remove the <lst name="slave"> section from your solrconfig. It should be
fine
On Tue, Dec 1, 2009 at 6:59 AM, William Pierce evalsi...@hotmail.com
wrote:
Hi, Joe:
I tried with the fetchIndex all lower-cased, and still the same result.
What do you specify for masterUrl in the solrconfig.xml
Folks:
Reading the wiki, I saw the following statement:
Force a fetchindex on slave from master command :
http://slave_host:port/solr/replication?command=fetchindex
It is possible to pass on extra attribute 'masterUrl' or other attributes
like 'compression' (or any other parameter which
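Per the wiki text quoted above, the extra attribute is appended as an ordinary query parameter; for example (hostnames and ports are placeholders):

```
http://slave_host:port/solr/replication?command=fetchindex&masterUrl=http://master_host:port/solr/replication
```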
Folks:
I do not want to hardcode the masterUrl in the solrconfig.xml of my slave. If
the masterUrl tag is missing from the config file, I am getting an exception in
solr saying that the masterUrl is required. So I set it to some dummy value,
comment out the poll interval element, and issue
Folks:
Sorry for this repost! It looks like this email went out twice
Thanks,
- Bill
--
From: William Pierce evalsi...@hotmail.com
Sent: Monday, November 30, 2009 1:47 PM
To: solr-user@lucene.apache.org
Subject: How to avoid hardcoding masterUrl in slave solrconfig.xml?
Folks:
I do not want to hardcode the masterUrl in the solrconfig.xml
Folks:
For those of your experienced linux-solr hands, I am seeking recommendations
for which file system you think would work best with solr. We are currently
running with Ubuntu 9.04 on an amazon ec2 instance. The default file system I
think is ext3.
I am of course seeking, of course,
Folks:
I am encountering an internal exception running solr on an Ubuntu 9.04 box,
running tomcat 6. I have deposited the solr nightly bits (as of October 7)
into the folder: /usr/share/tomcat6/lib
The exception from the log says:
Nov 9, 2009 8:26:13 PM
Sorry...folks...I saw that there were two copies sent out...Been having
some email snafus at my end...so I apologize in advance for the duplicate
email
- Bill
--
From: William Pierce evalsi...@hotmail.com
Sent: Monday, November 09, 2009 12:49
)
at
org.apache.catalina.core.ApplicationFilterConfig.init(ApplicationFilterConfig.java:108)
at
org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3800)
Cheers,
- Bill
--
From: William Pierce evalsi...@hotmail.com
Sent: Monday, November 09, 2009 12:49 PM
To: solr-user
I'd recommend two ways: The way I do it in my app is that I have written a
MySql function to transform the column as part of the select statement. In
this approach, your select query would look like so:
select col1, col2, col3, spPrettyPrintCategory(category) as X, col4,
col5 from table
Folks:
My db contains approx 6M records -- on average each is approx 1K bytes. When
I use the DIH, I reliably get an OOM exception. The machine has 4 GB ram,
my tomcat is set to use max heap of 2G.
The option of increasing memory is not tenable because as the number of
documents grows I
-notes.html,
and looking at the Java code, it appears that DIH uses PreparedStatement in
the JdbcDataSource.
I set the batchSize parameter to -1 and it solved my problem.
Regards.
Gilbert.
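For reference, the fix Gilbert describes is a one-attribute change on the DIH data source; with batchSize="-1" the JdbcDataSource asks the MySQL driver to stream rows instead of buffering the whole result set in memory. A sketch (driver class, URL, and credentials are placeholders):

```xml
<!-- in the DIH data-config: batchSize="-1" enables MySQL result streaming -->
<dataSource type="JdbcDataSource"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost/mydb"
            user="user" password="pass"
            batchSize="-1" />
```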
William Pierce wrote:
Folks:
My db contains approx 6M records -- on average each is approx 1K bytes.
When I
Folks:
If I issue two optimize / requests with no intervening changes to the index,
will the second optimize request be smart enough to not do anything?
Thanks,
Bill
Congratulations on this...What dotnet library did you use? We are also
using solr in our windows2003/C# environment but currently simply use HTTP
to query and the Dataimport handler to update the indices...
- Bill
--
From: Robert Petersen
in the entity? If you set
logLevel="severe" it should definitely be printed
2009/10/17 Noble Paul നോബിള് नोब्ळ् noble.p...@corp.aol.com:
It is strange that LogTransformer did not log the data.
On Fri, Oct 16, 2009 at 5:54 PM, William Pierce evalsi...@hotmail.com
wrote:
Folks:
Continuing my saga
,LogTransformer
logTemplate=${post}
query= select Id, a, b, c, IndexingStatus from prod_table
where (IndexingStatus = 1 or IndexingStatus = 4)
this should print out the entire row after the transformations
On Fri, Oct 16, 2009 at 3:04 AM, William Pierce evalsi...@hotmail.com
wrote
16, 2009 1:16 PM
To: solr-user@lucene.apache.org
Subject: Re: Using DIH's special commands...Help needed
On Fri, Oct 16, 2009 at 5:54 PM, William Pierce
evalsi...@hotmail.comwrote:
Folks:
Continuing my saga with DIH and use of its special commands. I have
verified that the script
think this is a valid use case and it might be a good idea
to
support it in some way.
I will post this thread on the dev-mailing list to seek opinion.
Cheers
Avlesh
On Wed, Oct 14, 2009 at 11:39 PM, William Pierce evalsi...@hotmail.com
wrote:
Thanks, Avlesh. Yes, I did take a look
Folks:
I see in the DIH wiki that there are special commands which according to the
wiki
Special commands can be given to DIH by adding certain variables to the row
returned by any of the components.
In my use case, my db contains rows that are marked PendingDelete. How do
I use the
--
From: Shalin Shekhar Mangar shalinman...@gmail.com
Sent: Thursday, October 15, 2009 10:03 AM
To: solr-user@lucene.apache.org
Subject: Re: Using DIH's special commands...Help needed
On Thu, Oct 15, 2009 at 6:25 PM, William Pierce
evalsi
15, 2009 11:03 AM
To: solr-user@lucene.apache.org
Subject: Re: Using DIH's special commands...Help needed
On Thu, Oct 15, 2009 at 10:42 PM, William Pierce
evalsi...@hotmail.comwrote:
Thanks, Shalin. I am sorry if I phrased it incorrectly. Yes, I want
to
know how to delete documents
Folks:
I am pretty happy with DIH -- it seems to work very well for my situation.
Thanks!!!
The one issue I see has to do with the fact that I need to keep polling
url/dataimport to check if the data import completed successfully. I need
to know when/if the import is completed
handler...Is this possible?
Had a look at EventListeners in DIH?
http://wiki.apache.org/solr/DataImportHandler#EventListeners
Cheers
Avlesh
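The EventListeners hook Avlesh points to is declared on the document element of the DIH data-config; a sketch, with a hypothetical listener class name (your class has to implement DIH's EventListener interface and be on the classpath):

```xml
<!-- in data-config.xml: fire a callback when the import finishes -->
<document onImportEnd="com.example.ImportDoneListener">
  <!-- entities ... -->
</document>
```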
On Wed, Oct 14, 2009 at 11:21 PM, William Pierce
evalsi...@hotmail.comwrote:
Folks:
I am pretty happy with DIH -- it seems to work very well for my
Folks:
During query time, I want to dynamically compute a document score as
follows:
a) Take the SOLR score for the document -- call it S.
b) Lookup the business logic score for this document. Call it L.
c) Compute a new score T = func(S, L)
d) Return the documents sorted by T.
I
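One stock way to approximate this without custom code is a boost function on the dismax handler, assuming the business-logic score L can be surfaced as a per-document field (here called biz_score, an assumed name, e.g. loaded via an ExternalFileField); the query term is also a placeholder:

```
q=laptop&defType=dismax&bf=biz_score&fl=*,score
```

Note that dismax's bf is additive (roughly T = S + L); an arbitrary func(S, L) would need the function-query machinery or a custom component.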
Oops...My bad! I didn't realize that by changing the subject line I was
still part of the thread whose subject I changed!
Sorry folks! Thanks, Hoss for pointing this out!
- Bill
--
From: Chris Hostetter hossman_luc...@fucit.org
Sent: Tuesday,
, and these numbers are for those
machines.
Good luck!
Lance Norskog
On Sat, Oct 10, 2009 at 5:57 PM, William Pierce evalsi...@hotmail.com
wrote:
Oh and one more thing...For historical reasons our apps run using msft
technologies, so using SolrJ would be next to impossible at the present
time
Folks:
I have a corpus of approx 6 M documents, each of approx 4K bytes.
Currently, the way indexing is set up, I read documents from a database and
issue Solr post requests in batches (batches are sized so that the
maxPostSize of Tomcat, which is set to 2MB, is respected). This means that
in
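For reference, the 2MB ceiling mentioned above is Tomcat's maxPostSize connector attribute, settable in server.xml if larger batches ever become preferable (the port and value here are illustrative):

```xml
<!-- server.xml: raise the POST size limit to 10 MB -->
<Connector port="8080" protocol="HTTP/1.1" maxPostSize="10485760" />
```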
Oh and one more thing...For historical reasons our apps run using msft
technologies, so using SolrJ would be next to impossible at the present
time
Thanks in advance for your help!
-- Bill
--
From: William Pierce evalsi...@hotmail.com
Sent
Folks:
Are there good rules of thumb for when to optimize? We have a large index
consisting of approx 7M documents and we currently have it set to optimize
once a day. But sometimes there are very few changes that have been
committed during a day and it seems like a waste to optimize (esp.
Folks:
In our app we index approx 50 M documents every so often. One of the fields
in each document is called CompScore, which is a score that our back-end
computes for each document. The computation of this score is heavy-weight
and is done only approximately once every few days. When
That is fantastic! Will the Java replication support be included in this
release?
Thanks,
- Bill
--
From: Ryan McKinley ryan...@gmail.com
Sent: Wednesday, January 07, 2009 11:42 AM
To: solr-user@lucene.apache.org
Subject: Re: Plans for 1.3.1?
Thanks, Ryan!
It is great that Solr replication (SOLR-561) is included in this release.
One thing I want to confirm (if Noble, Shalin et al) can help:
I had encountered an issue a while back (in late October I believe) with
using SOLR-561. I was getting an error (AlreadyClosedException)
Hi, Mark:
Thanks for the update...Looking forward to 1.4!
Cheers,
- Bill
--
From: Mark Miller markrmil...@gmail.com
Sent: Wednesday, January 07, 2009 4:48 PM
To: solr-user@lucene.apache.org
Subject: Re: Plans for 1.3.1?
William Pierce wrote
??? ?? [EMAIL PROTECTED]
Sent: Saturday, November 15, 2008 11:40 PM
To: solr-user@lucene.apache.org
Subject: Re: Fatal exception in solr 1.3+ replication
Is this issue visible consistently? I mean, are you able to
reproduce this easily?
On Fri, Nov 14, 2008 at 11:15 PM, William Pierce
unexpectedly...I'll try to take a further look over the
weekend.
- Mark
William Pierce wrote:
Folks:
I am using the nightly build of 1.3 as of Oct 23 so as to use the
replication handler. I am running on windows 2003 server with tomcat
6.0.14. Everything was running fine until I noticed
, if/when a fix is discovered, you will probably be able to apply
just the fix to the revision you're working with.
- Mark
William Pierce wrote:
Mark,
Thanks for your response --- I do appreciate all you volunteers working
to provide such a nice system!
Anyway, I will try the trunk bits
the refcount on the Directory hit 0? I can't find or duplicate it
yet...
Trunk may actually still hide the issue (possibly), but something really
funky seems to have gone on and I can't find it yet. Do you have any
custom code interacting with solr?
- Mark
William Pierce wrote:
Mark,
Thanks
William Pierce wrote:
Trunk may actually still hide the issue (possibly), but something really
funky seems to have gone on and I can't find it yet. Do you have any
custom code interacting with solr?
None whatsoever...I am using out-of-the-box solr 1.3 (build of 10/23). I
am using my C# app to http
Folks:
I am using the nightly build of 1.3 as of Oct 23 so as to use the replication
handler. I am running on windows 2003 server with tomcat 6.0.14. Everything
was running fine until I noticed that certain updated records were not showing
up on the slave. Further investigation showed me
I am using tomcat 6.0.14 without any problems on windows 2003 R2 server. I
am also using the 1.3 patch (using the nightly build of 10/23) for
master-slave replication... That's been working great!
-- Bill
--
From: Otis Gospodnetic [EMAIL
feature
http://wiki.apache.org/solr/SolrReplication
On Thu, Oct 23, 2008 at 4:32 AM, William Pierce [EMAIL PROTECTED]
wrote:
Otis,
Yes, I had forgotten that Windows will not permit me to overwrite files
currently in use. So my copy scripts are failing. Windows will not
even
allow a rename
it? Are you
deleting index files from the index dir on Q that are no longer in the
index dir on U?
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: William Pierce [EMAIL PROTECTED]
To: solr-user@lucene.apache.org
Sent: Wednesday, October 22, 2008
Folks:
I have an odd situation that I am hoping someone can shed light on.
I have a solr app running under tomcat 6.0.14 (on a windows xp sp3
machine).
The app is declared in the tomcat config file as follows:
In file merchant.xml for the merchant app:
<Context
Folks:
We are building a search capability into our web site and plan to use Solr. While
we have the initial prototype version up and running on Solr 1.2, we are now
turning our attention to sizing/scalability.
Our app in brief: We get merchant sku files (in either xml/csv) which we
process
and *your* type of queries is by benchmarking.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: William Pierce [EMAIL PROTECTED]
To: solr-user@lucene.apache.org
Sent: Thursday, May 15, 2008 12:23:03 PM
Subject: Some advice on scalability
Folks
Hi,
I am having problems with Solr 1.2 running on Tomcat version 6.0.16 (I also tried
6.0.14 but the same problems exist). Here is the situation: I have an ASP.net
application where I am trying to add and commit a single document to an
index. After I add the document and issue the <commit/> I can
?commit=true in the URL of the add command.
-Yonik
On Tue, May 13, 2008 at 9:31 AM, Alexander Ramos Jardim
[EMAIL PROTECTED] wrote:
Maybe a delay in commit? How much time elapsed between commits?
2008/5/13 William Pierce [EMAIL PROTECTED]:
Hi,
I am having problems with Solr 1.2 running
a separate <commit/> _request_ after your
add, or putting a <commit/> into the same request. Solr only supports
one command (add or commit, but not both) per request.
Erik
On May 13, 2008, at 10:36 AM, William Pierce wrote:
Thanks for the comments
The reason I am just adding one