You should set up heartbeat and have the virtual IP configured for the active instance.
So in haresources you can set something like this:
node1 IPaddr::10.2.0.11 drbddisk::r0
Filesystem::/dev/drbd0::/cluster/Solr::ext3::defaults,noatime httpd
Are you running active/active cluster or active/passive?
Francis
I use it in our env (Prod); it seems to have been working fine for years. It only cleans
up the snapshots, not the index.
I added it to the cron, which runs once a day to clean up.
-francis
-Original Message-
From: Feak, Todd [mailto:todd.f...@smss.sony.com]
Sent: Monday, October 05, 2009 2:34 PM
You can also increase the JVM heap size if you have enough physical memory; for
example, if you have 4GB physical, give the JVM a 2GB or 2.5GB heap.
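For example, a hedged sketch (the "JAVA_OPTIONS" variable name is a guess at what a typical WebLogic/Jetty start script uses, so check your own startup script; the heap flags themselves are standard HotSpot options):

```shell
# On a 4GB box, give the JVM a fixed 2GB heap; setting -Xms equal
# to -Xmx avoids heap-resizing pauses under load.
JAVA_OPTIONS="${JAVA_OPTIONS} -Xms2g -Xmx2g"
export JAVA_OPTIONS
```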
Francis
-Original Message-
From: didier deshommes [mailto:dfdes...@gmail.com]
Sent: Thursday, September 24, 2009 3:32 PM
To:
Reducing the size of queryResultCache in solrconfig seems to fix the issue as
well.
<!-- Maximum number of documents to cache for any entry in the
     queryResultCache. -->
<queryResultMaxDocsCached>200</queryResultMaxDocsCached>
(down from 500)
-Original Message-
From: Constantijn Visinescu [mailto:baeli...@gmail.com]
Sent: Wednesday, September 09, 2009 11:35 PM
To: solr-user@lucene.apache.org
Subject: Re: OutOfMemory error on solr 1.3
Just wondering, how much memory are you giving your JVM ?
On Thu, Sep 10, 2009 at 7:46 AM, Francis Yakin fya...@liquid.com wrote:
I am having OutOfMemory errors.
Our slave servers are having issues with the following error after we upgraded
to Solr 1.3.
Any suggestions?
Thanks
Francis
INFO: [] webapp=/solr path=/select/
params={q=(type:artist+AND+alphaArtistSort:forever+in+terror)} hits=1
status=0 QTime=1
SEVERE: java.lang.OutOfMemoryError:
I have the same situation now.
If I don't want to use an http connection, I need to use EmbeddedSolrServer;
is that correct?
We have master/slave solr; the applications use the slaves for search. The master
only takes the new index from the database, and the slaves pull the new index
at the same time... and why share
I/O between SOLR and DB?
Diversify, lower risks; having SOLR and DB on the same box is extremely
unsafe...
-Fuad
-Original Message-
From: Francis Yakin [mailto:fya...@liquid.com]
Sent: August-26-09 2:25 PM
To: 'solr-user@lucene.apache.org'
Subject: RE: SolrJ
to implement triggers
written in Java causing a SOLR update on each row update (transactional); but
I haven't heard of anyone using stored procs in Java; too risky and slow, with
specific dependencies...
-Original Message-
From: Francis Yakin [mailto:fya...@liquid.com]
Sent: August-26-09 4:18 PM
...
-Original Message-
From: Francis Yakin [mailto:fya...@liquid.com]
Sent: August-26-09 4:18 PM
To: 'solr-user@lucene.apache.org'
Subject: RE: SolrJ and Solr web simultaneously?
We already opened port 80 from solr to the DB so that's not the issue, but
httpd (port 80) is very flaky if there is a firewall.
As of right now, when I install and configure Solr, I get an example
dir (like /opt/apache-solr-1.3.0/example).
How can I change that to something else, since "example" to me is not a real name?
Thanks
Francis
Anyone have any input on this? I'd really appreciate it.
Thanks
Francis
-Original Message-
From: Francis Yakin [mailto:fya...@liquid.com]
Sent: Wednesday, August 12, 2009 3:39 PM
To: 'solr-user@lucene.apache.org'
Subject: Example dir
As of right now when I installed and configure
Has anyone had experience setting up Solr security?
http://wiki.apache.org/solr/SolrSecurity
I would like to implement HTTP Authentication or Path-Based
Authentication.
So, in webdefault.xml I set the following:
<security-constraint>
<web-resource-collection>
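For reference, a minimal sketch of a complete constraint following the standard servlet web.xml schema (the role name, realm name, and the /update/* pattern are assumptions to adapt to your setup):

```xml
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Solr update handler</web-resource-name>
    <url-pattern>/update/*</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <!-- hypothetical role; it must exist in your container's realm -->
    <role-name>solr-admin</role-name>
  </auth-constraint>
</security-constraint>
<login-config>
  <auth-method>BASIC</auth-method>
  <realm-name>Solr Realm</realm-name>
</login-config>
```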
Does anyone know the differences between these two?
From the schema.xml
We have:
<fieldType name="text" class="solr.TextField" positionIncrementGap="100">
<analyzer type="index">
<tokenizer class="solr.WhitespaceTokenizerFactory"/>
<filter class="solr.SynonymFilterFactory"
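If the question is about the two analyzer types: type="index" applies when documents are indexed, type="query" applies when queries are parsed, and a single <analyzer> with no type is used for both. A hedged sketch showing the pair side by side (the filter choices here are illustrative, not taken from your schema):

```xml
<fieldType name="text" class="solr.TextField" positionIncrementGap="100">
  <!-- applied when documents are indexed -->
  <analyzer type="index">
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.SynonymFilterFactory" synonyms="synonyms.txt"
            ignoreCase="true" expand="true"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
  <!-- applied when queries are parsed; note: no synonym expansion here -->
  <analyzer type="query">
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
</fieldType>
```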
I just upgraded our solr to 1.3.0
After I deployed the solr apps, I noticed there are:
segments_2 and segments.gen, and there are 3 folders: spellchecker1, spellchecker2
and spellcheckerFile.
What are these for? When I delete them I need to bounce the apps again, and it will
generate new ones.
Yes, the xml files are in complete add format.
This is my code:
#!/usr/bin/perl
if (($#ARGV + 1) == 0) {
    print "Usage: perl prod.pl dir\n\n";
    exit(1);
}
## -- CHANGE accordingly
$timeout = 300;
$topdir = "/opt/Test/xml-file/";
#$topdir = "/opt/Test/";
$dir =
I also commit too many times, I guess: since we have 1000 folders, each loop
executes a load and a commit.
So that's 1000 loops with 1000 commits. I think it would help if I only committed
once after the 1000 loops complete.
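A dry-run sketch of that idea; it only prints the commands it would run, and the host/port and Content-type header are copied from the curl examples earlier in the thread:

```shell
# Print one update POST per file, then a SINGLE commit at the end,
# instead of committing inside every loop iteration.
SOLR=http://localhost:7001/solr/update
for f in 0039000.xml 0039001.xml; do   # stand-ins for the 1000-folder tree
  echo "curl $SOLR --data-binary @$f -H 'Content-type:text/plain; charset=utf-8'"
done
echo "curl $SOLR --data-binary '<commit/>' -H 'Content-type:text/plain; charset=utf-8'"
```

Drop the echo (or pipe the output through sh) to actually run it.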
Any inputs?
Thanks
Francis
-Original Message-
From: Francis
: Using curl comparing with using WebService::Solr
On Fri, Jul 10, 2009 at 11:50 AM, Francis Yakin fya...@liquid.com wrote:
I also commit too many times, I guess: since we have 1000 folders, each loop
executes a load and a commit.
So that's 1000 loops with 1000 commits. I think it would help if I
I have about 1000 folders; each folder consists of 2581 xml files. The total number
of xml files is ~2.6 million.
I developed a perl script; inside, it executes this cmd:
curl http://localhost:7001/solr/update --data-binary @0039000.xml -H
'Content-type:text/plain; charset=utf-8'
It took me
Subject: Re: Updating Solr index from XML files
If Perl is your choice:
http://search.cpan.org/~bricas/WebService-Solr-0.07/lib/WebService/Solr.pm
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: Francis Yakin fya...@liquid.com
To: solr-user
:01 AM
To: Francis Yakin
Cc: solr-user@lucene.apache.org
Subject: Re: Is there any other way to load the index beside using http
connection?
On Mon, 6 Jul 2009 09:56:03 -0700
Francis Yakin fya...@liquid.com wrote:
Norberto,
Thanks. I think my question is:
why not generate your SQL
With
curl
'http://localhost:8983/solr/update/csv?stream.file=/opt/apache-1.2.0/example/exampledocs/test.csv&stream.contentType=text/plain;charset=utf-8'
No errors now.
But how can I verify that the update happened?
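One hedged way to check (host/port assumed from the earlier examples): run a match-all query with rows=0 before and after the load and compare numFound, which reports the total document count. This sketch only prints the command:

```shell
# numFound in the response shows how many documents are in the index.
CHECK='http://localhost:8983/solr/select?q=*:*&rows=0'
echo "curl '$CHECK'"
```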
Thanks
Francis
-Original Message-
From: Francis Yakin [mailto:fya
Yeah, it works now.
How can I verify that the new CSV file got uploaded?
Thanks
Francis
-Original Message-
From: ysee...@gmail.com [mailto:ysee...@gmail.com] On Behalf Of Yonik Seeley
Sent: Tuesday, July 07, 2009 10:49 AM
To: solr-user@lucene.apache.org
Cc: Norberto Meijome
Subject: Re:
and update this to
the solr data/indexes.
What commands do I have to use, for example the xml file named /opt/test.xml ?
Thanks
Francis
-Original Message-
From: Norberto Meijome [mailto:numard...@gmail.com]
Sent: Sunday, July 05, 2009 3:57 AM
To: Francis Yakin
Cc: solr-user
I have the following curl cmds to update and commit to Solr (I have 10
xml files just for testing):
curl http://solr00:7001/solr/update --data-binary @xml_Artist-100170.txt -H
'Content-type:text/plain; charset=utf-8'
curl http://solr00:7001/solr/update --data-binary
/test.csv&stream.contentType=text/plain;charset=utf-8
-bash: stream.contentType=text/plain: No such file or directory
undefined field cat
What did I do wrong?
Francis
-Original Message-
From: Norberto Meijome [mailto:numard...@gmail.com]
Sent: Monday, July 06, 2009 11:01 AM
To: Francis Yakin
Cc: solr-user
Nitin
Francis Yakin wrote:
Ok, I have a CSV file (call it test.csv) from the database.
When I tried to upload this file to solr using this cmd, I got a
"stream.contentType=text/plain: No such file or directory" error:
curl
http://localhost:8983/solr/update/csv?stream.file=/opt/apache-1.2.0
Has anyone had experience creating a datasource for DIH to an Oracle database?
Also, on the Solr side we are running weblogic and deploy the application
using weblogic.
I know in weblogic we can create a datasource that connects to an Oracle
database; has anyone had experience with this?
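For the DIH side, a minimal sketch of a data-config.xml with an Oracle JDBC datasource (the host, SID, credentials, and the artist table/columns are all hypothetical placeholders; the Oracle ojdbc driver jar must be on Solr's classpath):

```xml
<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="oracle.jdbc.driver.OracleDriver"
              url="jdbc:oracle:thin:@dbhost:1521:ORCL"
              user="solr_reader"
              password="changeme"/>
  <document>
    <!-- hypothetical entity pulling rows into Solr documents -->
    <entity name="artist" query="SELECT id, name FROM artists">
      <field column="ID" name="id"/>
      <field column="NAME" name="name"/>
    </entity>
  </document>
</dataConfig>
```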
that you are posting via
HTTP (vs embedded solr or DIH) aren't going to go away. But it's the simpler
approach without changing too much of your current setup.
-Original Message-
From: Norberto Meijome [mailto:numard...@gmail.com]
Sent: Sunday, July 05, 2009 3:57 AM
To: Francis Yakin
Cc: solr
http://$server:$port/$path/dataimport?command=full-import&clean=true&optimize=true
Hope this helps out some.
Cheers
//Marcus
On Sun, Jul 5, 2009 at 7:28 PM, Francis Yakin fya...@liquid.com wrote:
Norberto,
Yes, DIH is one of the options we're thinking of using, but it requires 1.3.0 and
above, and currently we
, July 02, 2009 3:01 AM
To: solr-user@lucene.apache.org
Cc: Francis Yakin
Subject: Re: Is there any other way to load the index beside using http
connection?
On Wed, 1 Jul 2009 15:07:12 -0700
Francis Yakin fya...@liquid.com wrote:
We have several thousands of xml files in database that we load
- you need to make sure the
versions of Lucene and Solr are compatible (use same jars), you use
the same Analyzers, and you create the appropriate 'schema' that Solr
understands.
-glen
2009/7/2 Francis Yakin fya...@liquid.com:
Glen,
The database we use is Oracle; I am not the database administrator.
We have several thousand xml files in the database that we load into the solr
master.
The database uses an http connection to transfer those files to the solr master.
Solr then translates the xml files into its index.
We are experiencing issues with close/open connections in the firewall and very
very
documents as csv data/file.
Finally, you can use EmbeddedSolrServer.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: Francis Yakin fya...@liquid.com
To: solr-user@lucene.apache.org solr-user@lucene.apache.org
Sent: Wednesday, July 1, 2009 6:07
]. It is
faster than Solr, sometimes as much as an order of magnitude faster.
Disclosure: I am the author of LuSql
-Glen
http://zzzoot.blogspot.com/
[1]http://lab.cisti-icist.nrc-cnrc.gc.ca/cistilabswiki/index.php/LuSql
2009/7/1 Francis Yakin fya...@liquid.com:
We have several thousand xml files in the database that we load into the solr
master.
The database uses an http connection to transfer those files to the solr master.
Solr then translates the xml files into its index.
We are experiencing issues with close/open
EmbeddedSolrServer.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: Francis Yakin fya...@liquid.com
To: solr-user@lucene.apache.org solr-user@lucene.apache.org
Sent: Wednesday, July 1, 2009 6:07:12 PM
Subject: Is there any other way to load
Thanks Noble!
This is only for version 1.3.0? We are running 1.2.0 currently.
Francis
-Original Message-
From: noble.p...@gmail.com [mailto:noble.p...@gmail.com] On Behalf Of Noble
Paul
Sent: Wednesday, July 01, 2009 9:43 PM
To: solr-user@lucene.apache.org
Subject: Re:
in QA vs. PROD environments that isn't Solr-specific.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: Francis Yakin fya...@liquid.com
To: solr-user@lucene.apache.org solr-user@lucene.apache.org
Sent: Saturday, June 20, 2009 2:18:07 AM
network speed, different
hardware...)
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: Francis Yakin fya...@liquid.com
To: solr-user@lucene.apache.org solr-user@lucene.apache.org
Sent: Friday, June 19, 2009 10:39:48 PM
Subject: RE: Slowness
We are experiencing slowness while reloading/resubmitting the index from the
database to the master.
We have two environments:
QA and Prod.
The slowness is happened only in Production but not in QA.
It only takes one hour to reload 2.5 mil indexes, compared to the 5-6 hours it
takes to load the same size of index
-Original Message-
From: Chris Hostetter [mailto:hossman_luc...@fucit.org]
Sent: Friday, June 19, 2009 5:49 PM
To: 'solr-user@lucene.apache.org'
Subject: RE: Java OutOfmemory error during autowarming
: Date: Mon, 1 Jun 2009 11:34:08 -0700
: From: Francis Yakin
: Subject: RE: Java OutOfmemory
to the DB the same on both machines
* are both the PROD and QA DB servers the same and are both DB instances the
same
...
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: Francis Yakin fya...@liquid.com
To: solr-user@lucene.apache.org solr
Can I transport the index from Solr 1.2 to Solr 1.3 without
resubmitting/reloading it again from the database?
Francis
We are experiencing OutOfMemory error frequently on our slaves, this is the
error:
SEVERE: Error during auto-warming of
key:org.apache.solr.search.queryresult...@a8c6f867:java.lang.OutOfMemoryError:
allocLargeObjectOrArray - Object size: 5120080, Num elements: 1280015
...@r.email.ne.jp]
Sent: Wednesday, June 17, 2009 8:28 PM
To: solr-user@lucene.apache.org
Subject: Re: OutOfMemory error on solrslaves
Francis Yakin wrote:
We are experiencing OutOfMemory error frequently on our slaves, this is the
error:
SEVERE: Error during auto-warming
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
From: Francis Yakin fya...@liquid.com
To: solr-user@lucene.apache.org solr-user@lucene.apache.org
Sent: Wednesday, June 10, 2009 1:17:25 AM
Subject: Upgrading 1.2.0 to 1.3.0 solr
I am in process to upgrade our solr 1.2.0 to solr 1.3.0
/${ant.project.name}.war/
Koji
Francis Yakin wrote:
We are planning to upgrade solr 1.2.0 to 1.3.0
Under 1.3.0, which war file do I need to use and deploy for my
application?
We are using weblogic.
There are two war files under
/opt/apache-solr-1.3.0/dist/apache-solr-1.3.0.war and under
/opt
Francis Yakin fya...@liquid.com
I know, but the FieldCache is not in the solrconfig.xml
-Original Message-
From: Yonik Seeley [mailto:ysee...@gmail.com]
Sent: Friday, May 29, 2009 10:47 AM
To: solr-user@lucene.apache.org
Subject: Re: Java OutOfmemory error during autowarming
On Fri
We are planning to upgrade solr 1.2.0 to 1.3.0
Under 1.3.0, which war file do I need to use and deploy for my application?
We are using weblogic.
There are two war files under
/opt/apache-solr-1.3.0/dist/apache-solr-1.3.0.war and under
/opt/apache-solr-1.3.0/example/webapps/solr.war.
consumption.
-Yonik
http://www.lucidimagination.com
On Fri, May 29, 2009 at 1:02 AM, Francis Yakin fya...@liquid.com wrote:
During auto-warming of solr search on QueryResultKey, our production
solr slaves throw an OutOfMemory error and the application needs to be bounced.
Here is the error logs
I know, but the FieldCache is not in the solrconfig.xml
-Original Message-
From: Yonik Seeley [mailto:ysee...@gmail.com]
Sent: Friday, May 29, 2009 10:47 AM
To: solr-user@lucene.apache.org
Subject: Re: Java OutOfmemory error during autowarming
On Fri, May 29, 2009 at 1:44 PM, Francis
I just upgraded solr from 1.2.0 to 1.3.0.
We have an existing data/index from 1.2.0 that I will be using with 1.3.0, and I
use the default solrconfig.xml that comes with 1.3.0.
For some reason when I use the solrconfig.xml from 1.2.0 it works and I can see
the index and data, but when I used
Subject: Re: Solrconfig.xml
Is there an error in the logs?
On May 6, 2009, at 2:12 PM, Francis Yakin wrote:
I just upgraded from 1.2.0 to 1.3.0 of solr.
We have an existing data/index that I will be using from 1.2.0 to
1.3.0, and I use the default solrconfig.xml that comes with 1.3.0.
For some
I am having frequent OutOfMemory errors on our slave servers.
SEVERE: Error during auto-warming of
key:org.apache.solr.search.queryresult...@aca6b9cb:java.lang.OutOfMemoryError:
allocLargeObjectOrArray - Object size: 34279632, Num elements: 8569904
SEVERE: Error during auto-warming of
What's the best way to upgrade solr from 1.2.0 to 1.3.0?
We have the current index that our users search running on Solr version 1.2.0.
We would like to upgrade it to 1.3.0.
We have a Master/Slaves env.
What's the best way to upgrade it without affecting search? Do we need to
do it on
big are your caches? Please paste the relevant part of the config.
Which of your fields do you sort by? Paste definitions of those fields from
schema.xml, too.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: Francis Yakin fya...@liquid.com