Hi,
I have set up SolrCloud with Solr 4.4. It has two Tomcat servers with 2 Solr
instances (one in each Tomcat).
I start ZooKeeper and run the commands for linking the configuration
files with ZooKeeper.
After that, when I start Tomcat, I get the below exception:
Exception in Overseer main queue
So, if the match spans pages 4 and 5, what do you want returned? Page 4,
page 5, or both?
Regards,
Alex
On 28 Aug 2013 06:55, Атанас Атанасов atanaso...@gmail.com wrote:
Hello,
My name is Atanas Atanasov. I have been using Solr 1.4/3.5/4.3 for a year and a
half and I'm really satisfied with what
Dear all,
My co-worker uses UIMA with Solr 4.4.0.
But UIMA with Solr is too slow.
I read the source code under solr/contrib/uima/.
The Solr UIMA integration code calls AEProvider.getAE() repeatedly.
It may be creating a new AnalysisEngine instance on every request and every processText() call.
But Lucene code
Hi Jun,
I agree the AE (instead of the AEProvider) should be cached on the
UpdateRequestProcessor.
In previous revisions [1] it was cached directly by the BasicAEProvider, so
there was no need for that in the UIMAUpdateRequestProcessor, but since that
has changed, I agree it should be done there.
p.s.
see https://issues.apache.org/jira/browse/SOLR-5201
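The caching being discussed is essentially build-once, reuse-everywhere. A minimal sketch of the pattern, using a hypothetical ExpensiveEngine as a stand-in for the UIMA AnalysisEngine (none of these class names are real Solr/UIMA API):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical stand-in for an expensive-to-build UIMA AnalysisEngine.
class ExpensiveEngine {
    static final AtomicInteger instances = new AtomicInteger();
    ExpensiveEngine() { instances.incrementAndGet(); } // costly setup would happen here
    String process(String text) { return text.trim(); }
}

// Processor that builds the engine lazily, once, and reuses it on every
// request -- instead of constructing a new engine per processText() call.
class CachingProcessor {
    private volatile ExpensiveEngine engine;

    private ExpensiveEngine engine() {
        ExpensiveEngine e = engine;
        if (e == null) {
            synchronized (this) {
                if (engine == null) engine = new ExpensiveEngine();
                e = engine;
            }
        }
        return e;
    }

    String handle(String text) { return engine().process(text); }
}
```

Handling any number of requests then constructs the engine exactly once, which is the saving SOLR-5201 is after.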
2013/8/29 Tommaso Teofili tommaso.teof...@gmail.com
Hi Jun,
I agree the AE (instead of the AEProvider) should be cached on the
UpdateRequestProcessor.
In previous revisions [1] it was cached directly by the BasicAEProvider so
there
Hi Tommaso,
Thanks!!
Jun Ohtani
On 2013/08/29, at 17:56, Tommaso Teofili tommaso.teof...@gmail.com wrote:
p.s.
see https://issues.apache.org/jira/browse/SOLR-5201
2013/8/29 Tommaso Teofili tommaso.teof...@gmail.com
Hi Jun,
I agree the AE (instead of the AEProvider)
Hi,
I am a beginner and I am trying to use EmbeddedSolrServer with the ComplexPhrase
plugin ...
I created a bin/lib folder in solrhome and copied ComplexPhrase.jar inside.
I modified solrconfig.xml and added
<lib dir="../bin/lib" regex=".*\.jar" />
and
<queryParser name="complexphrase"
Hi,
Check that the configuration files uploaded into ZooKeeper are valid and that
there are no errors in them.
I think that due to this error, the Solr core will not be created.
Thanks,
Sathish
--
View this message in context:
Hi,
I have a couple of use cases that need to be implemented.
1. Our application is 24x7, so we require the search feature to be available
24x7.
How do I add a new field to a live Solr node without bringing down the Solr
instance?
2. How do I add an additional shard to an existing collection and redistribute
Hello!
You can upload new schema.xml (along with other configuration files)
and reload the collection using collections API
(http://wiki.apache.org/solr/SolrCloud#Managing_collections_via_the_Collections_API).
However, you have to remember that in order for the new field to be usable,
documents need
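Concretely, the upload-and-reload might look like this, assuming the zkcli script shipped in Solr's cloud-scripts directory and a collection/config named collection1/myconf (both names are assumptions):

```
# re-upload the edited configuration files to ZooKeeper
./zkcli.sh -zkhost localhost:2181 -cmd upconfig -confdir ./conf -confname myconf

# reload the collection so the new schema.xml takes effect
curl "http://localhost:8983/solr/admin/collections?action=RELOAD&name=collection1"
```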
Hi there! :)
I have a question about the replication process. After a slave downloads the
new index files, an automatic commit occurs, which makes the new index current
on the slave. My problem is that we have 4 slaves and 6 collections, and we
have to wait for all of the new collections until we release the
Are you sure the ComplexPhrase parser supports your version of solr?
On Thu, Aug 29, 2013 at 12:09 PM, elfu el...@yahoo.com wrote:
Hi,
I am a beginner and I am trying to use EmbeddedSolrServer with the ComplexPhrase
plugin ...
I created a bin/lib folder in solrhome and copied ComplexPhrase.jar inside
Yeah, reality gets in the way of simple solutions a lot.
And making it even more fun you'd really want to only
bring up one node for each shard in the broken DC and
let that one be fully synched. Then bring up the replicas
in a controlled fashion so you didn't saturate the local
network with
Hello,
I'm trying to index documents with the DataImportHandler and SolrCloud at the
same time (huge collection, need parallel indexing).
First I had a DIH configuration which works with standalone Solr
(indexing for two months, every week).
I've transformed my configuration to cloudify
Currently I have sample data like this:
<doc>
  <int name="name">Nice Dress</int>
  <arr name="keyword">
    <str>best cocktail dress</str>
    <str>platform complete pumps</str>
    <str>platform pumps</str>
    <str>slip dress</str>
  </arr>
</doc>
I used a multivalued "keyword" field.
case 1:
defType: edismax
qf: keyword
q: cocktail
See:
https://wiki.apache.org/solr/Unsubscribing%20from%20mailing%20lists
-- Jack Krupansky
-Original Message-
From: veena rani
Sent: Thursday, August 29, 2013 12:18 AM
To: solr-user@lucene.apache.org
Subject: Re: why does a node switch state ?
Kindly stop me from solr mail chain.
Assuming you want both pages to match you need the text to be present on
both pages. Do you actually return/store text of the page in Solr? If so,
you can have that 'page' field store-only and have another field which is
index-only and into which you put all your matching logic. So, that
Yeah, you see this when the core could not be created. Check the logs to see if
you can find something more useful.
I ran into this again the other day - it's something we should fix. You see the
same thing in the UI when a core cannot be created and it gives you no hint
about the problem and
On Aug 28, 2013, at 8:59 AM, Erick Erickson erickerick...@gmail.com wrote:
When a replica discovers that
it's too far out of date, it does an old-style replication. IOW, the
tlog doesn't contain the entire delta. Eventually, the old-style
replications catch up to close enough and _then_ the
Hello,
I've been trying to get Solr to run with DataImportHandler. I've found
various issues and fixed them, but I'm still getting an error message,
and I can't find anything else to fix. Could someone please take a look
at my setup to see if I've done something wrong? When I go to the Solr
init failure usually means you had a bad configuration parameter. You need
to look for the last caused by in the stack trace and that should tell you
what the parameter problem was.
-- Jack Krupansky
-Original Message-
From: Brian Robinson
Sent: Thursday, August 29, 2013 10:43 AM
Thanks! It looks like I have several of those, I'll check those out.
Brian Robinson
Head of Development
Social Surge Media, Inc.
773.701.3194
On 8/29/2013 10:02 AM, Jack Krupansky wrote:
init failure usually means you had a bad configuration parameter.
You need to look for the last caused by in
Hi, this is my problem:
I have a local installation of Apache Tomcat/Solr. I would like to move it
to a remote server. On the remote server I have installed a new instance of
Apache Tomcat, I have created a new SOLR_HOME folder, and I have copied my
local solr_home content into it. I have copied solr.war in
On 8/29/2013 9:45 AM, Carmine Paternoster wrote:
Hi, this is my problem:
I have a local installation of apache-tomcat/solr. I would like to move it
to remote server. In remote server i have installed new instance of apache
tomcat, i have create new SOLR_HOME folder and i have copied my local
Thank you Shawn, but the logging is correctly configured, because the INFO
log messages are printed, aren't they? Any other suggestions?
On 29 Aug 2013 18:01, Shawn Heisey s...@elyograg.org wrote:
On 8/29/2013 9:45 AM, Carmine Paternoster wrote:
Hi, this is my problem:
I have a local
Hi, I can't find anywhere good documentation of what syntax is allowed in Solr
4.4 regular expression searches. I can get regexes to work, but the same search
with a predefined character class (like \s) or a word boundary matcher (\b)
returns nothing. I am searching an untokenized field and
Here is a really different approach.
Make the two data centers one Solr Cloud cluster and use a third data center
(or EC2 region) for one additional Zookeeper node. When you lose a DC,
Zookeeper still functions.
There would be more traffic between datacenters.
wunder
On Aug 29, 2013, at 4:11
You probably just need to escape the backslashes with a backslash -
otherwise the query parser will treat your backslashes as escapes and remove
them. This is not unlike placing a regex in a Java string literal.
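A tiny illustration of the two escaping layers (the helper class and method names are made up, not a Solr API):

```java
class RegexEscaping {
    // Solr's query parser consumes one level of backslash escaping before
    // the regex engine ever sees the pattern, so every backslash in the
    // regex must be doubled in the q parameter.
    static String forQueryParser(String regex) {
        return regex.replace("\\", "\\\\");
    }
}
```

So the regex a\sb would be sent as a\\sb, and written as "a\\\\sb" in a Java string literal.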
-- Jack Krupansky
-Original Message-
From: Hugh Cayless
Sent: Thursday,
Hi, I can't find anywhere good documentation of what syntax is allowed in Solr
4.4 regular expression searches. I can get regexes to work, but the same search
with a predefined character class (like \s) or a word boundary matcher (\b)
returns nothing. I am searching an untokenized field and
Now, if I write the phrase "The car" in the search query, I get these results:
Black car
Test a car
The car
A fast car
But I need to get "The car" on top.
How do I get this?
Thanks!
On 8/29/2013 10:09 AM, Carmine Paternoster wrote:
Thank you Shawn, but the logging is correctly configured, because the INFO
message logging is stamped, or not? Any other suggest?
I don't know what that INFO message means. Can you use a paste website
(http://apaste.info being one example) and
Someone really needs to test this with EC2 availability zones. I haven't
had the time, but I know other clustered NoSQL solutions like HBase and
Cassandra can deal with it.
Michael Della Bitta
Applications Developer
o: +1 646 532 3062 | c: +1 917 477 7906
appinions inc.
“The Science of
Tried that, I'm afraid. No joy. I'm trying to step a search through with a
debugger attached to see if I can tell why it's not acting right…
On Aug 29, 2013, at 12:23 , Jack Krupansky j...@basetechnology.com wrote:
You probably just need to escape the backslashes with a backslash - otherwise
I want Tika to only index the content in <div id="content">...</div> for the
field "text". Unfortunately it's indexing the whole page. Can't XPath do this?
data-config.xml:
<dataConfig>
  <dataSource type="BinFileDataSource" name="data"/>
  <dataSource type="BinURLDataSource" name="dataUrl"/>
Since we are on the topic. I noticed that the wiki's Tomcat set up is
horribly outdated and pretty much useless for the current Solr version.
On Thu, Aug 29, 2013 at 9:35 AM, Shawn Heisey s...@elyograg.org wrote:
On 8/29/2013 10:09 AM, Carmine Paternoster wrote:
Thank you Shawn, but the
To answer your original question, the full Java regex syntax is supported.
A space in a regex query would need to be escaped with backslash, otherwise
it ends a term.
What is a sample field value and the actual query you tried?
What are you using as a client? The client language could
: Hi, I can't find anywhere good documentation of what syntax is allowed
: in Solr 4.4 regular expression searches. I can get regexes to work, but
the docs on solr's query parser syntax should be pointing you here...
: You can upload new schema.xml (along with other configuration files)
: and reload the collection using collections API
:
(http://wiki.apache.org/solr/SolrCloud#Managing_collections_via_the_Collections_API).
: However you have remember that in order for the new field to be usable
: documents
Is it more ideal to run SolrCloud instances within Tomcat containers or
should they just be run via start.jar without a container?
On Wed, Aug 28, 2013 at 12:39 PM, Shawn Heisey s...@elyograg.org wrote:
On 8/28/2013 1:36 PM, Jared Griffith wrote:
We are using Java here. Are you saying that
Walter, yes we did consider this (and might be having a 3rd DC for other
reasons anyway), but 3 DCs also offers the possibility of running with 2
down and 1 up which ZK still can't handle :)
There is also a second advantage to keeping our clouds separate, they are
independent, which means if we
OK, I'm running into a roadblock again. The last "caused by" error in my
stack trace is
Caused by: java.lang.ClassCastException: class
org.apache.solr.handler.dataimport.DataImportHandler
Searching the web, I see that this can be caused by having the DIH jar
files loaded by more than one
On 8/29/2013 11:15 AM, Jared Griffith wrote:
Is it more ideal to run SolrCloud instances within Tomcat containers or
should they just be run via start.jar without a container?
The start.jar included in Solr *is* a container. Specifically, it's a
stripped down installation of Jetty. For Solr
bumping this one, any suggestions?
I am sure this is solrcloud 101 but I couldn't find documentation anywhere.
Thanks,
-Utkarsh
On Wed, Aug 28, 2013 at 2:37 PM, Utkarsh Sengar utkarsh2...@gmail.comwrote:
I have a 3 node solrcloud cluster with 3 shards for each collection/core.
At times when
The context for this is that I'm migrating an application from Solr 3.5 to 4.4.
We had regex search working (in kind of a hacky way), but since 4.x has regex
search support built in, I'm trying to switch to that. Some things work the way
I'd expect, some clearly don't. So my question was, in
Is it more ideal to run the Jetty containers as opposed to running Tomcat
with the Solr war?
On Thu, Aug 29, 2013 at 10:36 AM, Shawn Heisey s...@elyograg.org wrote:
On 8/29/2013 11:15 AM, Jared Griffith wrote:
Is it more ideal to run SolrCloud instances within Tomcat containers or
should
First, are you sure you have a functioning SolrCloud setup? It
looks from the error like you haven't pushed the config files up
to ZK. Take a look at:
http://wiki.apache.org/solr/SolrCloud#Command_Line_Util
You should be able to do a downconfig on the Solr configuration
files you uploaded to
Hi,
We've investigated a memory dump, which was taken after some frequent OOM
incidents.
The main issue we found was many millions of LazyField instances,
taking ~2GB of memory, even though queries request only about 10 small
fields.
We've found that LazyDocument creates a LazyField object
Thanks for the full details -- being able to see exactly how the queries
are received and parsed is important for ruling out simple things like
client-side escaping (or lack of it) and server-side metacharacter handling
in the query parser.
: Some things work the way I'd expect, some clearly
On 8/29/2013 12:08 PM, Jared Griffith wrote:
Is it more ideal to run the Jetty containers as opposed to running Tomcat
with the Solr war?
If I answer yes to that question, it's not really the whole story.
Just like the vi vs. emacs battle, it can become almost a religious
debate. Having
: The main issue we found was a lot of millions of LazyField instances,
: taking ~2GB of memory, even though queries request about 10 small fields
: only.
which version of Solr are you using? there was a really bad bug with
lazyFieldLoading fixed in Solr 4.2.1 (SOLR-4589)
: We've found that
So it all depends on your implementation and server restrictions. I'm just
going to set it up with Tomcat to get it running correctly but I might
just go with the native jetty server down the road when this is for real.
On Thu, Aug 29, 2013 at 12:06 PM, Shawn Heisey s...@elyograg.org wrote:
Thanks Hoss.
1. We currently use Solr 4.3.0.
2. I understand this architecture of LazyFields, but I did not understand
why multiple LazyFields should be created for a multivalued field. You
can't load a part of them. If you request the field, you will get ALL of
its values, so 100 (or more)
Hello,
We're running solr 4.2.0 and recently converted to SolrCloud. We've got
16 cores, each with 1 shard. 3 zookeeper instances, 4 replicas of each
core. We're suddenly having trouble with very slow tomcat restarts
(15-45 minutes) and even when we can get a few replicas up, we aren't
On 8/29/2013 2:16 PM, Cat Bieber wrote:
We're running solr 4.2.0 and recently converted to SolrCloud. We've got
16 cores, each with 1 shard. 3 zookeeper instances, 4 replicas of each
core. We're suddenly having trouble with very slow tomcat restarts
(15-45 minutes) and even when we can get a few
Well if it is down, it means there is an error on that particular
core/instance of Solr, you would need to check the logs on that instance to
see what the underlying problem is, there is no one root cause.
How to recover: fix the underlying problem and restart that Solr instance?
:)
With the
I just ditched my Tomcat set up because I was having issues getting the
nodes to work correctly, so I just went with the native jetty server.
Everything just works.
On Thu, Aug 29, 2013 at 12:16 PM, Jared Griffith jgriff...@picsauditing.com
wrote:
So it all depends on your implementation and
We are getting the following error intermittently (twice in a two-week
interval). The load on the server seems to be usual. I see in the log that
just before the failure (4-5 mins) QTime was very high; normally those
queries are processed within 300 ms, but before the failure they took more
than 100 seconds. So
: 2. I understand this architecture of LazyFields, but i did not understand
: why multiple LazyFields should be created for the multivalued field. You
: can't load a part of them. If you request the field, you will get ALL of
: its values. so 100 (or more) placeholders are not necessary in this
You can see the supported syntax here:
http://lucene.apache.org/core/4_4_0/core/org/apache/lucene/util/automaton/RegExp.html.
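Worth noting: the RegExp syntax linked above is Lucene's own and, as of 4.4, has no predefined classes like \s or \b (a backslash there just escapes the next character to its literal self). A portable workaround for \s is a character class containing the actual whitespace characters. A quick check against Java's regex engine (this exercises java.util.regex, not Lucene itself):

```java
import java.util.regex.Pattern;

class WhitespaceCheck {
    // \s is understood by java.util.regex but not by Lucene's RegExp.
    static boolean javaOnly(String text) {
        return Pattern.matches("foo\\sbar", text);
    }
    // A class of literal whitespace characters (real space/tab/CR/LF chars,
    // produced here by Java string escapes) means the same thing to both engines.
    static boolean portable(String text) {
        return Pattern.matches("foo[ \t\r\n]bar", text);
    }
}
```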
On Aug 29, 2013, at 11:57 AM, Hugh Cayless, Ph.D. hugh.cayl...@duke.edu
wrote:
Hi, I can't find anywhere good documentation of what syntax is allowed in
Solr 4.4
Thanks Shawn. We'll give an upgrade a try and see if that helps.
-Cat
On 08/29/2013 04:32 PM, Shawn Heisey wrote:
On 8/29/2013 2:16 PM, Cat Bieber wrote:
We're running solr 4.2.0 and recently converted to SolrCloud. We've got
16 cores, each with 1 shard. 3 zookeeper instances, 4 replicas
OK, so I have set up 4 solr instances that are using remote Zookeeper (3)
servers to manage them. Do we send documents via the zookeeper instance(s)
or just via the solr instances?
On Thu, Aug 29, 2013 at 1:43 PM, Jared Griffith
jgriff...@picsauditing.comwrote:
I just ditched my Tomcat set up
On 8/29/2013 4:39 PM, Jared Griffith wrote:
OK, so I have set up 4 solr instances that are using remote Zookeeper (3)
servers to manage them. Do we send documents via the zookeeper instance(s)
or just via the solr instances?
If you are indexing directly via something like curl or another
OK, so to get our initial documents in, we would use the curl / java upload
calls as documented in the wiki. Then once we get it all plugged into our
application, we would use the SolrJ client and plug in zookeeper
information there, so that the application could then update and retrieve
data
On 8/29/2013 4:53 PM, Jared Griffith wrote:
OK, so to get our initial documents in, we would use the curl / java upload
calls as documented in the wiki. Then once we get it all plugged into our
application, we would use the SolrJ client and plug in zookeeper
information there, so that the
I suspect that stopwords are being removed, in which case
there's no way to do what you want; you'd need to change
your analysis chain to keep stopwords, then probably boost
phrases pretty high.
Best
Erick
On Thu, Aug 29, 2013 at 12:00 PM, Vladimir viv81s...@gmail.com wrote:
Now, if I
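With edismax, Erick's suggestion above could translate into parameters like these, once stopwords are kept at index and query time (the field name `name` is an assumption):

```
defType=edismax
q=The car
qf=name
pf=name^10
```

pf (phrase fields) boosts documents where the whole query matches as a phrase in `name`, which should push "The car" above the other matches.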
Hmmm, I'd be glad to give you edit rights if you'd like to
update with your current experiences, just create yourself
a login and ping the list with your Wiki logon name and we'll
be happy to add you to the list.
Best
Erick
On Thu, Aug 29, 2013 at 12:54 PM, Jared Griffith
Hmm, ya learn something new every day, thanks for the correction.
On Thu, Aug 29, 2013 at 10:23 AM, Mark Miller markrmil...@gmail.com wrote:
On Aug 28, 2013, at 8:59 AM, Erick Erickson erickerick...@gmail.com
wrote:
When a replica discovers that
it's too far out of date, it does an
The other possibility is that you have old and new jars
configured in your classpath. So to answer where the files
you need to remove are, well, somewhere in your classpath
which isn't very informative, but is the best I can do.
Best
Erick
On Thu, Aug 29, 2013 at 1:30 PM, Brian Robinson
Cool, thanks.
On Thu, Aug 29, 2013 at 4:19 PM, Shawn Heisey s...@elyograg.org wrote:
On 8/29/2013 4:53 PM, Jared Griffith wrote:
OK, so to get our initial documents in, we would use the curl / java
upload
calls as documented in the wiki. Then once we get it all plugged into our
Shawn's link gives you a fairly concise version, there's a
longer treatment (if you can stay awake) here:
http://searchhub.org/2013/08/23/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/
FWIW,
Erick
On Thu, Aug 29, 2013 at 6:21 PM, Cat Bieber cbie...@techtarget.com wrote:
I know how to query a keyword with
http://localhost:8983/solr/collection1/select?q=*solr*&wt=json&indent=true
How do I add/update records? Thanks
Kevin
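For the add/update side, Solr 4.x accepts JSON posted to the update handler; a sketch against the stock collection1 (the field names and commit=true are just for the example):

```
curl "http://localhost:8983/solr/collection1/update?commit=true" \
  -H "Content-Type: application/json" \
  -d '[{"id":"1","title":"my first doc"}]'
```

Posting a document with an existing id replaces (i.e. updates) it.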
The original document:
[backcolor=#ff][color=#33][font=Tahoma] the document content
[/font][/color][/backcolor]
I want to strip the BB code from this document when Solr indexes it,
leaving just:
the document content.
--
vincent
On 30 August 2013 07:02, Kevin vivid.tju.d...@gmail.com wrote:
I know how to query a keyword with
http://localhost:8983/solr/collection1/select?q=*solr*&wt=json&indent=true
How do I add/update records? Thanks
This is a very basic question, and you could probably gain from
starting with