Hi Chris,
Amazing analysis!
I actually did not investigate the log, because I was first trying to get
more information from the user.
"We are running full-import and delta-import crons.
Full index: once a day.
Delta index: every 10 minutes.
Last night my index was automatically deleted (numDocs=0).
Thanks for replying.
Please find the data-config."
On Thu, Jun 11, 2015 at 6:06 AM, Chris Hostetter
wrote:
>
> : The guy was using delta import anyway, so maybe the problem is
> : different and not related to the clean.
>
> that's not what the logs say.
>
> Here's what i see...
>
> Log beg
: The guy was using delta import anyway, so maybe the problem is
: different and not related to the clean.
that's not what the logs say.
Here's what I see...
Log begins with server startup @ "Jun 10, 2015 11:14:56 AM"
The DeletionPolicy for the "shopclue_prod" core is initialized at "Jun 10,
Just taking a look at the code:
"
if (requestParams.containsKey("clean")) {
  clean = StrUtils.parseBool((String) requestParams.get("clean"), true);
} else if (DataImporter.DELTA_IMPORT_CMD.equals(command) ||
    DataImporter.IMPORT_CMD.equals(command)) {
  clean = false;
} else {
  // full-import (and any other command) falls through to clean = !debug
  clean = debug ? false : true;
}
"
I was only speaking about full import regarding the default of
clean=true. However, looking at the source code, it doesn't seem to
differentiate between a full and a delta import in relation to the
default of clean=true, which would be pretty crappy. I'd need to try it,
though.
Upayavira
On
Wow, Upaya, I didn't know that clean defaulted to true for the delta import
as well!
I did know it was the default for the full import, but I agree with you that
having a default of true for delta import is very dangerous!
But assuming the user was using the delta import so far, if cleaning every
time,
Let me answer inline, to get more info:
2015-06-10 10:59 GMT+01:00 Midas A :
> Hi Alessandro,
>
> Please find the answers inline and help me out to figure out this problem.
>
> 1) Solr version : *4.2.1*
> 2) Solr architecture :* Master -slave/ Replication with requestHandler*
>
>
Where happene
Note the clean= parameter to the DIH. It defaults to true. It will wipe
your index before it runs. Perhaps it succeeded at wiping, but failed to
connect to your database. Hence an empty index?
clean=true is, IMO, a very dangerous default option.
Upayavira
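One way to stop relying on that default is to pin clean in the /dataimport
handler's defaults in solrconfig.xml, so that cron-driven imports never wipe
the index unless clean=true is passed explicitly. A minimal sketch (the
handler name and config file name are the usual ones, not necessarily this
user's, and it assumes handler defaults reach DIH as request parameters):

  <requestHandler name="/dataimport"
                  class="org.apache.solr.handler.dataimport.DataImportHandler">
    <lst name="defaults">
      <str name="config">data-config.xml</str>
      <!-- make the safe behaviour the default: only delete when asked to -->
      <str name="clean">false</str>
    </lst>
  </requestHandler>

An explicit clean=true or clean=false on the request URL still overrides this
default.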
On Wed, Jun 10, 2015, at 10:59 AM, Midas A
Hi Alessandro,
Please find the answers inline and help me figure out this problem.
1) Solr version: *4.2.1*
2) Solr architecture: *Master-slave / Replication with requestHandler*
3) Kind of data source indexed: *MySQL*
4) What happened to the datasource? Any change in there?: *No c
Let me try to help you. First of all, I would like to encourage people to
post more information about their scenario than "This is my log, index
deleted, help me" :)
This kind of info can be really useful:
1) Solr version
2) Solr architecture (SolrCloud? SolrCloud configuration? Manual
Shard
We are indexing around 5 docs per 10 min.
On Thu, Jun 4, 2015 at 11:02 PM, Midas A wrote:
> Shwan,
>
> Please find the log . give me some sense what is happening
>
> On Thu, Jun 4, 2015 at 10:56 PM, Shawn Heisey wrote:
>
>> On 6/4/2015 11:12 AM, Midas A wrote:
>> > sorry Shawn ,
>> >
>> >
Shawn,
Please find the log. Give me some sense of what is happening.
On Thu, Jun 4, 2015 at 10:56 PM, Shawn Heisey wrote:
> On 6/4/2015 11:12 AM, Midas A wrote:
> > sorry Shawn ,
> >
> > a) Total docs solr is handling is 3 million .
> > b) index size is only 5 GB
>
> If your total index size is on
On 6/4/2015 11:12 AM, Midas A wrote:
> sorry Shawn ,
>
> a) Total docs solr is handling is 3 million .
> b) index size is only 5 GB
If your total index size is only 5GB, then there should be no need for a
30GB heap. For that much index, I'd start with 4GB, and implement GC
tuning.
A high iowait
Sorry Shawn,
a) Total docs Solr is handling: 3 million.
b) Index size is only 5 GB.
On Thu, Jun 4, 2015 at 9:35 PM, Shawn Heisey wrote:
> On 6/4/2015 7:38 AM, Midas A wrote:
> > On Thu, Jun 4, 2015 at 6:48 PM, Shawn Heisey
> wrote:
> >
> >> On 6/4/2015 5:15 AM, Midas A wrote:
> >>> I have
On 6/4/2015 7:38 AM, Midas A wrote:
> On Thu, Jun 4, 2015 at 6:48 PM, Shawn Heisey wrote:
>
>> On 6/4/2015 5:15 AM, Midas A wrote:
>>> I have an indexing issue. While indexing, IOwait is high on the Solr server,
>>> and so is the load.
>> My first suspect here is that you don't have enough RAM for your in
Hi Shawn,
Please find comments inline.
On Thu, Jun 4, 2015 at 6:48 PM, Shawn Heisey wrote:
> On 6/4/2015 5:15 AM, Midas A wrote:
> > I have an indexing issue. While indexing, IOwait is high on the Solr server,
> > and so is the load.
>
> My first suspect here is that you don't have enough RAM for your
On 6/4/2015 5:15 AM, Midas A wrote:
> I have an indexing issue. While indexing, IOwait is high on the Solr server,
> and so is the load.
My first suspect here is that you don't have enough RAM for your index size.
* How many total docs is Solr handling (all cores)?
* What is the total size on disk of all
Hi Alessandro,
On Thu, Jun 4, 2015 at 5:19 PM, Alessandro Benedetti <
benedetti.ale...@gmail.com> wrote:
> Honestly your auto-commit configuration seems not alarming at all!
> Can you give me more details regarding :
>
> Load expected : currently it is 7- 15 should be below 1
> *[Abhishek] : s
Honestly, your auto-commit configuration does not seem alarming at all!
Can you give me more details regarding:
"Load expected: currently it is 7-15, should be below 1"
What does this mean? Without a unit of measure I find it hard to understand
plain numbers :)
I was expecting the number of documents per
Thanks Alessandro,
Please find the info inline.
Which version of Solr are you using: 4.2.1
- Architecture: Master-slave
Load expected: currently it is 7-15, should be below 1
Indexing approach: using DIH
When does your problem happen: we run delta import every 10 mins, full
index onc
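For context, a DIH setup like this (full import once a day, delta import
every 10 minutes) usually pairs a query with a deltaQuery/deltaImportQuery in
data-config.xml. A sketch with made-up table and column names, not the
poster's actual configuration:

  <dataConfig>
    <dataSource type="JdbcDataSource" driver="com.mysql.jdbc.Driver"
                url="jdbc:mysql://localhost:3306/shop" user="solr" password="..."/>
    <document>
      <entity name="product" pk="id"
              query="SELECT id, name, price FROM products"
              deltaQuery="SELECT id FROM products
                          WHERE last_modified &gt; '${dataimporter.last_index_time}'"
              deltaImportQuery="SELECT id, name, price FROM products
                                WHERE id = '${dataimporter.delta.id}'"/>
    </document>
  </dataConfig>

command=full-import runs query; command=delta-import runs deltaQuery to find
the changed ids and deltaImportQuery to re-fetch just those rows.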
Thanks for replying. Below is the commit frequency:
6 false
60
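For reference, a commit-frequency block of this kind lives in solrconfig.xml
and typically has the following shape (the values here are illustrative, not
the ones above):

  <autoCommit>
    <maxTime>60000</maxTime>             <!-- hard commit every 60s -->
    <openSearcher>false</openSearcher>   <!-- don't open a new searcher on every hard commit -->
  </autoCommit>
  <autoSoftCommit>
    <maxTime>600000</maxTime>            <!-- soft commits control visibility -->
  </autoSoftCommit>

Very frequent hard commits, especially with openSearcher=true, are a common
cause of high I/O during indexing, which is what Toke is probing for below.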
On Thu, Jun 4, 2015 at 4:49 PM, Toke Eskildsen
wrote:
> On Thu, 2015-06-04 at 16:45 +0530, Midas A wrote:
> > I have an indexing issue. While indexing, IOwait is high on the Solr server,
> > and so is the load.
>
> Might be because
I think this mail is really poor in terms of details.
Which version of Solr are you using?
Architecture?
Load expected?
Indexing approach?
When does your problem happen?
The more detail we give, the easier it will be to provide help.
Cheers
2015-06-04 12:19 GMT+01:00 Toke Eskildsen :
> On Thu, 2015-0
On Thu, 2015-06-04 at 16:45 +0530, Midas A wrote:
> I have an indexing issue. While indexing, IOwait is high on the Solr server,
> and so is the load.
Might be because you commit too frequently. How often do you do that?
- Toke Eskildsen, State and University Library, Denmark
That's exactly how I would expect WordDelimiterFilterFactory to
split up that input.
You really need to look at the analysis chain to understand what
happens here; simply saying the field is "text" isn't enough. What I'm
looking for is the "..." definition.
In Solr 3.6, for example, there's no wrot
You are probably using a "text" field, which is tokenizing the input, when
this data should probably be a "string" (or "text" with the
KeywordAnalyzer).
-- Jack Krupansky
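A schema.xml sketch of that suggestion, with made-up field and type names:
either a plain field of the stock "string" type (solr.StrField), or a text
type whose analyzer keeps the whole value as a single token:

  <!-- exact, untokenized matching: 8E0061123-8E1 stays one term -->
  <field name="part_code" type="string" indexed="true" stored="true"/>

  <!-- or, if case-insensitive matching is wanted -->
  <fieldType name="keyword_lower" class="solr.TextField">
    <analyzer>
      <tokenizer class="solr.KeywordTokenizerFactory"/>
      <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
  </fieldType>
  <field name="part_code_ci" type="keyword_lower" indexed="true" stored="true"/>

With either of these, a wildcard query such as 8E* matches against the whole
indexed value rather than against the fragments WordDelimiterFilterFactory
produces.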
-Original Message-
From: zainu
Sent: Thursday, September 20, 2012 5:49 AM
To: solr-user@lucene.apache.org
Subject:
Not enough info to go on here; what is your fieldType?
But the first place to look is admin/analysis to see how the
text is tokenized.
Best
Erick
On Thu, Sep 20, 2012 at 5:49 AM, zainu wrote:
> Dear fellows,
> I have a field in Solr with value '8E0061123-8E1'. Now when I search '8E*',
> it does
Thanks Erik, that was very helpful.
2010/2/12 Erik Hatcher
> That would be the problem then, I believe. Simply don't post a value to
> get the default value to work.
>
>Erik
>
>
> On Feb 12, 2010, at 10:18 AM, nabil rabhi wrote:
>
> yes, sometimes the document has postal_code with no va
That would be the problem then, I believe. Simply don't post a value
to get the default value to work.
Erik
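In update-XML terms, the suggestion looks like this (field names other than
postal_code are made up):

  <add>
    <doc>
      <field name="id">doc-1</field>
      <field name="title">some document</field>
      <!-- no postal_code field at all: Solr applies the schema default="0" -->
      <!-- posting <field name="postal_code"></field> with an empty value
           instead suppresses the default, and for an int field typically
           fails to parse -->
    </doc>
  </add>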
On Feb 12, 2010, at 10:18 AM, nabil rabhi wrote:
Yes, sometimes the document has postal_code with no values; I still post it
to Solr.
2010/2/12 Erik Hatcher
When a documen
Yes, sometimes the document has postal_code with no values; I still post it
to Solr.
2010/2/12 Erik Hatcher
> When a document has no value, are you still sending a postal_code field in
> your post to Solr? Seems like you are.
>
>Erik
>
>
> On Feb 12, 2010, at 8:12 AM, nabil rabhi wrote:
When a document has no value, are you still sending a postal_code
field in your post to Solr? Seems like you are.
Erik
On Feb 12, 2010, at 8:12 AM, nabil rabhi wrote:
in the schema.xml I have fields with int type and a default value,
e.g. <field name="postal_code" type="int" ... stored="true" default="0"/>
but when a docume
I changed the uniqueKey and it worked fine. Thank you very much, Noble.
2009/5/18 Noble Paul നോബിള് नोब्ळ्
> the problem is that your uniquekey may not be unique
>
> just remove the entry altogether
>
> On Mon, May 18, 2009 at 10:53 PM, jayakeerthi s
> wrote:
> > Hi Noble,
> > Many thanks for
Hi Noble,
Many thanks for the reply.
Yes, there is a uniqueKey in the schema, which is the ProductID.
I also tried PROD_ID, but no luck:
still only one document is seen after querying *:*.
I have attached the schema.xml used, for your reference; please advise.
Thanks and regards,
Jay
2009/5/16 Noble Paul
check out if you have a uniqueKey in your schema. I there are
duplicates they are overwritten
On Sat, May 16, 2009 at 1:38 AM, jayakeerthi s wrote:
> I am using Solr for our application with JBoss Integration.
>
> I have managed to configure the indexing from Oracle db for 22 fields. Here
> is the
: I have two cores in different machines which are referring to the same data
directory.
this isn't really considered a supported configuration ... both Solr
instances are going to try to "own" the directory for updating, and
unless you do something special to ensure only one has control, you
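The supported way to get the same index onto two machines is the
master/slave replication mentioned earlier on this page, via the
ReplicationHandler. A sketch, with a made-up master URL:

  <!-- on the master, in solrconfig.xml -->
  <requestHandler name="/replication" class="solr.ReplicationHandler">
    <lst name="master">
      <str name="replicateAfter">commit</str>
      <str name="confFiles">schema.xml,stopwords.txt</str>
    </lst>
  </requestHandler>

  <!-- on the slave -->
  <requestHandler name="/replication" class="solr.ReplicationHandler">
    <lst name="slave">
      <str name="masterUrl">http://master-host:8983/solr/core1</str>
      <str name="pollInterval">00:00:60</str>
    </lst>
  </requestHandler>

Each instance then owns its own data directory, and the slave pulls index
changes from the master instead of both processes writing to the same files.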