Thanks!

On Thu, 4 Feb 2021 at 20:04, Chris Hostetter <hossman_luc...@fucit.org> wrote:

> FWIW: that log message was added to branch_8x by 3c02c9197376 as part of
> SOLR-15052 ... it's based on master commit 8505d4d416fd -- but that does
> not add that same logging message ... so it definitely smells like a
> mistake to me that 8x would add this INFO level log message that master
> doesn't have.
Hello all,
We upgraded some nodes to 8.8.0 and noticed that there is excessive logging at
INFO level when some traffic/indexing is going on:
2021-02-04 11:42:48.535 INFO (qtp261748192-268) [c:data s:shard2
r:core_node4 x:data_shard2_replica_t2] o.a.s.c.c.ZkStateReader already
watching , added to s
-day-vulnerability
Fortunately the attack isn't succeeding because of the SOLR-13971 fix; instead
it is causing these errors. I'll fortify the Solr access.
On 1/7/21 11:02 AM, TK Solr wrote:
On the Admin UI's login screen, when the Logging tab is clicked, I see lines
like:
Time (Local)  Level  Core  Logger  Message
1/7/2021 ERROR x:mycoreloader
ResourceManager: unable to find resource 'custom.vm
Sent: Thursday, 27 August 2020 18:42
To: solr-user@lucene.apache.org
Subject: Solr Logging In JSON Format
Hello,
We want to receive Solr logs in DataDog. It is configured and all good, but
the logs are ugly, not parsed, and not really useful.
Does anyone know a way to send the logs from Solr in JSON format?
Thank you.
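One approach to the JSON question above, sketched under assumptions (it assumes the Log4j 2 setup used by recent Solr releases, and that the Jackson jars Log4j's JsonLayout needs are on the classpath; appender name and rollover settings here are illustrative, not Solr's exact defaults): swap the PatternLayout of the main file appender in server/resources/log4j2.xml for a JsonLayout.

```xml
<!-- Sketch: replace the <PatternLayout> of the main file appender in
     server/resources/log4j2.xml with a JSON layout. -->
<RollingRandomAccessFile name="MainLogFile"
    fileName="${sys:solr.log.dir}/solr.log"
    filePattern="${sys:solr.log.dir}/solr.log.%i">
  <!-- compact + eventEol = one JSON object per line, which log shippers
       such as the DataDog agent can parse per-event -->
  <JsonLayout compact="true" eventEol="true" properties="true"/>
  <Policies>
    <SizeBasedTriggeringPolicy size="32 MB"/>
  </Policies>
  <DefaultRolloverStrategy max="10"/>
</RollingRandomAccessFile>
```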
Hello,
I have a solr (version 6.6.6) docker container running where a few logs are
being written to /var/log/messages. My log4j.properties has the rootLogger set
to file and console. I have attached it for reference. My question is why are
some logs still being written to /var/log/messages and
>regex=".*\.jar" />
>regex="solr-clustering-\d.*\.jar" />
>
>regex=".*\.jar" />
>regex="solr-langid-\d.*\.jar" />
>
>regex="solr-ltr-\d.*\.jar" />
>
>regex=".*\.jar" />
>> <RollingRandomAccessFile name="SlowLogFile"
>>     fileName="${sys:solr.log.dir}/solr_slow_requests.log"
>>     filePattern="${sys:solr.log.dir}/solr_slow_requests.log.%i">
>>   <PatternLayout>
>>     <Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p (%t) [%X{collection} %X{shard}
>>       %X{replica} %X{core}] %c{1.} %m%n</Pattern>
>>   </PatternLayout>
>> </RollingRandomAccessFile>
>
> <Logger name="..." level="..." additivity="false">
>   ...
> </Logger>
For some reason it just stops logging anything. I only get the solr_gc.log
and not the expected solr.log. I see an old thread mentioning the exact
same issue (except that solr.log isn't even created in my case) but it
wasn't resolved there. See here:
http://mail-archives.apache.org/mod_mbox/lucene-solr-user/201809.mbox
-Original Message-
From: Shawn Heisey
Sent: Thursday, 18 June 2020 04:22
To: solr-user@lucene.apache.org
Subject: Re: Log4J Logging to Http
On 6/17/2020 1:33 AM, Krönert Florian wrote:
Hi Florian,
I don’t know the answer to your specific question, but I would like to suggest
a different approach. Excuse me in advance, I usually hate suggesting different
approaches.
The reason why I suggest a different approach is because logging via HTTP can
be blocking a thread e.g. until
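One way to mitigate the blocking concern raised here, sketched under assumptions (the appender names and the URL are placeholders, not anything from this thread): Log4j 2's Http appender wrapped in an Async appender, so a slow endpoint never stalls request threads.

```xml
<!-- Sketch: ship logs over HTTP, decoupled from request threads -->
<Http name="HttpLog" url="https://logs.example.com/ingest">
  <JsonLayout compact="true" eventEol="true"/>
</Http>
<!-- blocking="false" drops events instead of stalling when the
     async queue fills up -->
<Async name="AsyncHttp" blocking="false">
  <AppenderRef ref="HttpLog"/>
</Async>
```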
On 6/17/2020 1:33 AM, Krönert Florian wrote:
2020-06-17T07:06:55.121856339Z java.lang.NoClassDefFoundError: Failed to
initialize Apache Solr: Could not find necessary SLF4j logging jars. If
using Jetty, the SLF4j logging jars need to go in the jetty lib/ext
directory. For other containers
2020-06-17T07:06:55.121825039Z 2020-06-17
07:06:55.104:WARN:oejw.WebAppContext:main: Failed startup of context
o.e.j.w.WebAppContext@611df6e3{/solr,file:///opt/solr-8.3.1/server/solr-webapp/webapp/,UNAVAILABLE}{/opt/solr-8.3.1/server/solr-webapp/webapp}
2020-06-17T07:06:55.121856339Z java.lang.NoClassDefFoundE
that conhost.exe had
higher CPU usage than java.exe (Solr process). The total CPU usage was
around 25% (on 4 core machine).
It turned out that debug logging (enabled by the -v command-line option) was
the root cause of the performance issues. Solr was writing an excessive amount
of log data to the console and all the log
On 5/10/2019 4:26 PM, Oakley, Craig (NIH/NLM/NCBI) [C] wrote:
We are wanting to tweak the logging levels of our Solr 7.4 nodes to see what
might be helpful to add to the solr.log for debugging purposes.
In investigating what is available, however, I run /solr/admin/info/logging and
I find that there is little consistency in what logging settings
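The /solr/admin/info/logging handler mentioned above can also be driven from the command line; a sketch (host, port, and the logger name are placeholders):

```shell
# List the current logger levels
curl "http://localhost:8983/solr/admin/info/logging?wt=json"

# Raise one category to DEBUG without restarting the node
curl "http://localhost:8983/solr/admin/info/logging?set=org.apache.solr.core:DEBUG"
```

Note that changes made this way are held in memory and do not survive a restart.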
After looking into the source code, there seems to be nothing in there for
logging errors together with the request which produced them.
I think there is a need to log the request along with the error.
It could be done in o.a.s.core.SolrCore.execute(), where the INFO logging is
also located
Hi list,
Logging in Solr sounds easy, but the problem is logging only errors
and the request which produced the error.
I want to log all 4xx and 5xx http and also solr ERROR.
My request_logs from jetty show nothing useful because of POST requests.
Only that a request got HTTP 4xx or 5xx from
I faced the same issue as jakob with solr-7.6.0, eclipse-2018-12 (4.10.0),
Java 1.8.0_191:
Solution:
In the Eclipse Run Configuration "run-solr",
remove "file:" from the argument
-Dlog4j.configurationFile="file:${workspace_loc:solr-7.6.0}/solr/server/resources/log4j2.xml"
The file:/// change was made in:
https://issues.apache.org/jira/browse/SOLR-12538, how to reconcile
these two?
On Sun, Sep 16, 2018 at 10:54 PM marcostocch...@gmail.com
wrote:
> On 2018/07/03 08:53:20, ja...@jafurrer.ch wrote:
> > Hi,
> >
> > I was intending to open an Issue in Jira when I
On 9/16/2018 3:05 PM, marcostocch...@gmail.com wrote:
I experienced the same issue on Windows 10 professional. I don't think the OS
version is important. The solr version might. I have solr-7.4.0.
The problem has been fixed in the source code and the next version of
Solr (7.5.0) won't
On 2018/07/03 08:53:20, ja...@jafurrer.ch wrote:
> Hi,
>
> I was intending to open an Issue in Jira when I read that I'm supposed
> to first contact this mailinglist.
>
> Problem description
> ==
>
> System: Microsoft Windows 10 Enterprise Version 10.0.16299 Build 16299
>
I did it successfully by removing 'file:'
--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html
to be functioning correctly and not producing the strange pathname shown
> in the error, and the same parameter syntax (with the file: prefix) is
> working correctly on Linux.
>
Erick, the config in cloud-scripts logs to stderr rather than files.
I'm all for moving it to resources so we don't have to keep track of
logging config files in multiple locations, but it does need to be a
different config file specifically for command-line tools. Perhaps
log4j2-cli.xml as the filename?
Jakob:
I don't have Windows so rely on people who do to vet changes. But I'm
working on https://issues.apache.org/jira/browse/SOLR-12008 which
seeks to get rid of the confusing number of log4j config files and
solr.cmd has a lot of paths to change. So if you do get agreement that
you should raise
Hi,
I was intending to open an Issue in Jira when I read that I'm supposed
to first contact this mailinglist.
Problem description
==
System: Microsoft Windows 10 Enterprise Version 10.0.16299 Build 16299
Steps to reproduce the problem:
1) Download solr-7.4.0.tgz
2) Unzip to
Thanks a lot for your inputs Alessandro and Mikhail.
@Alessandro, I tried the transaction log, but it was a bit more work to
get around (as it gets rolled over).
The hack I did is to use a proxy in between, and now I have more control.
Regards,
Govind
On Thu, Jun 14, 2018 at 7:32 PM Mikhail
You can enable DEBUG level for LogUpdateProcessorFactory category
https://github.com/apache/lucene-solr/blob/228a84fd6db3ef5fc1624d69e1c82a1f02c51352/solr/core/src/java/org/apache/solr/update/processor/LogUpdateProcessorFactory.java#L100
On Wed, Jun 13, 2018 at 5:00 PM, govind nitk wrote:
>
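Mikhail's suggestion, sketched as a Log4j 2 logger element (an assumption that you are on a Solr version shipping Log4j 2; "MainLogFile" is a placeholder for whichever file appender your log4j2.xml actually defines):

```xml
<!-- Sketch: DEBUG for the update-logging category only, so the rest
     of the log stays at its normal level -->
<Logger name="org.apache.solr.update.processor.LogUpdateProcessorFactory"
        level="DEBUG" additivity="false">
  <AppenderRef ref="MainLogFile"/>
</Logger>
```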
Isn't the Transaction Log what you are looking for ?
Read this good blog post as a reference :
https://lucidworks.com/2013/08/23/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/
Cheers,
---
Alessandro Benedetti
Search Consultant, R&D Software Engineer
Hi,
Is there any way to log all the data getting indexed to a particular core
only?
Regards,
govind
this in the logging, but I can't determine which core the log comes from.
How can I tell which core is receiving the offending requests?
Whatever you are referring to is not here. The mailing list eats most
attachments -- they don't make it through.
If you edit server/etc/jetty.xml you'll find
I am migrating a good number of cores over to the latest instance of solr
(well, 7.1.0) installed locally. It is working well, but my code is
occasionally sending requests to search or index an old field that was replaced
in the schema.
I see this in the logging, but I can't determine which
---
Springer Science+Business Media Deutschland GmbH
Registered Office: Berlin / Amtsgericht Berlin-Charlottenburg, HRB 152987 B
Directors: Derk Haank, Martin Mos, Dr. Ulrich Vest

From: Walter Underwood <wun...@wunderwood.org>
Sent: Tuesday, 5 December 2017 16:20
To: solr-user@lucene.apache.org
Subject: Re: Logging in Solrcloud
In 6.5.1, the intra-cluster requests are POST, which makes them easy to
distinguish in the request logs. Also, the intra-cluster requests go to a
specific core instead of to the collection. So we use the request logs and grep
out the GET lines.
We are considering fronting every Solr process
To be more precise and provide some more details, I tried to simplify the
problem by using the Solr examples that were delivered with Solr.
So I started bin/solr -e cloud, using 2 nodes, 2 shards and a replication
factor of 2.
To understand the following, it might be important to know, which
Hey everybody,
I have a question regarding query-request logging in solr-cloud. I've set the
"org.apache.solr.core.SolrCore.Request" logger to INFO level and it's
logging all those query requests. So far so good. BUT, as I'm running Solr in
cloud mode with 3 nodes and 3
I would not do this in Solr.
Post process the log file to split them out. That allows you to change the
definition of “slow” later, reprocess older files, etc.
Do log analysis with log analysis tools. Don’t try to push that too far up the
chain into the production server.
wunder
Walter
Sure thanks Emir,
Let me give them a quick try and I'll update you.
Thanks,
Atita
On Tue, Oct 10, 2017 at 5:28 PM, Emir Arnautović <
emir.arnauto...@sematext.com> wrote:
> Hi Atita,
> I did not try it, but I think that following could work:
>
>
Hi Atita,
I did not try it, but I think that following could work:
#logging queries
log4j.logger.org.apache.solr.handler.component.QueryComponent=WARN,slow
log4j.appender.slow=org.apache.log4j.RollingFileAppender
log4j.appender.slow.File=${solr.log}/slow.log
log4j.appender.slow.layout
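The snippet above is cut off at the layout line; a completed sketch (the MaxFileSize and layout lines are assumptions mirroring common log4j 1.x usage, not Emir's exact config):

```
#logging queries
log4j.logger.org.apache.solr.handler.component.QueryComponent=WARN,slow
log4j.appender.slow=org.apache.log4j.RollingFileAppender
log4j.appender.slow.File=${solr.log}/slow.log
log4j.appender.slow.MaxFileSize=32MB
log4j.appender.slow.layout=org.apache.log4j.PatternLayout
log4j.appender.slow.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p (%t) %c{1.} %m%n
```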
have made few more
changes to the logging levels and components.
Please find my log4j config at: https://pastebin.com/uTLAiBE5
Any help on this will surely be appreciated.
Thanks again.
Atita
On Tue, Oct 10, 2017 at 1:39 PM, Emir Arnautović <
emir.arnauto...
Hi Atita,
You should definitely go with log4j configuration, as anything else would be
redoing what log4j can do. You already have slowQueryThresholdMillis to make
slow queries log at WARN, and you can configure log4j to put such logs (class
+ level) into a separate file.
This seems like
Hi,
I have a situation here where I am required to log the slow queries into a
separate log file which can then be used for optimization purposes.
For now this log is aggregated into the mainstream log, marked
[slow:..].
I looked into the code and the configuration and I am really clueless
Hi,
I've noticed that in SOLR-7484 Solr part of http request was moved to
SolrHttpCall. So there is no way to handle
SolrQueryRequest and SolrQueryResponse in SolrDispatchFilter.
Internal request logging is in SolrCore.execute(SolrRequestHandler,
SolrQueryRequest, SolrQueryResponse
Hi,
I would like to ask how to implement search audit logging. I've implemented
some idea but I would like to ask if there is better approach to do this.
Requirement is to log username, search time, all request parameters (q, fq,
etc.), response data (count, etc) and important thing is to log
Thanks Shawn for the detailed context.
I saw some Logger (java.util.logging) usage in one class in the lucene folder,
hence I thought that logging is now properly supported. Since I am using Solr
(and indirectly Lucene), I will use whatever Solr is using.
Not depending on any concrete logger is good
o not know if you have
also sent this question to that list.
Solr uses slf4j for logging. Many of its dependencies have chosen other
logging frameworks.
https://www.slf4j.org/
With slf4j, you can utilize just about any supported logging
implementation to do the actual end logging. The end implem
Lucene does not use a logger framework. But if you are using Solr then you
can route the infoStream logging to Solr's log files by setting an option
in the solrconfig.xml. See
http://lucene.apache.org/solr/guide/6_6/indexconfig-in-solrconfig.html#IndexConfiginSolrConfig-OtherIndexingSettings
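The option referenced by that link, as a minimal solrconfig.xml sketch:

```xml
<indexConfig>
  <!-- Route Lucene's IndexWriter infoStream diagnostics into Solr's
       log file. Very verbose; enable only while debugging. -->
  <infoStream>true</infoStream>
</indexConfig>
```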
Any doughnut for me ?
Regards
Nawab
On Thu, Jul 27, 2017 at 9:57 AM Nawab Zada Asad Iqbal
wrote:
> Hi,
>
> I see a lot of discussion on this topic from almost 10 years ago: e.g.,
> https://issues.apache.org/jira/browse/LUCENE-1482
>
> For 4.5, I relied on
Hi,
I see a lot of discussion on this topic from almost 10 years ago: e.g.,
https://issues.apache.org/jira/browse/LUCENE-1482
For 4.5, I relied on 'System.out.println' for writing information for
debugging in production.
In 6.6, I notice that some classes in Lucene are instantiating a Logger,
Weird. Does the 7574 console log really get archived or is the 8983
console log archived twice? If 7574 doesn't get moved to the archive,
this sounds like a JIRA, I'd go ahead and raise it.

Actually either way I think it needs a JIRA. Either the wrong log is
getting moved or the message needs to be fixed.

Best,
Erick

On Wed, May 3, 2017 at 5:29 AM, Bernd Fehling
<bernd.fehl...@uni-bielefeld.de> wrote:
While looking into SolrCloud I noticed that my logging
gets moved to archived dir by starting a new node.
E.g.:
bin/solr start -cloud -p 8983
-> server/logs/ has solr-8983-console.log
bin/solr start -cloud -p 7574
-> solr-8983-console.log is moved to server/logs/archived/
-> server/
not sure whether SOLR supports it or not. We would like to have alerts
in place if we have changes on a particular column for more than a
specific threshold on a given day. For example: say we have a column
"Name" on which we have around 30% changes; we would like an alert. I am
not sure whether we have some logging mechanism to get this done. Any
ideas would be appreciated.

Thanks and Regards,
Preeti
removing the colon crushed it. thanks
the reason i'm looking at this is the logging screen is not showing log
content...last check shows the spinning wheel to the left.
Time (Local)  Level  Core  Logger  Message
No Events available
Last Check:4/7/2017, 10:26:43 AM
Google chrome, IE
log4j:ERROR Could not find value for key log4j.appender.: file
log4j:ERROR Could not instantiate appender named ": file".

Here is my config file and the only thing I have changed is setting the level
to FINEST in line 3. Otherwise this is the default file.

# Logging level
solr.log=logs
log4j.rootLogger=FINEST,: file, CONSOLE
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=%-4r %-5p (%t) [%X{collection}
%X{shard} %X{r
Hi,
See this in 6.4.2:
2017-04-07 07:49:04.040
[recoveryExecutor-9-thread-1-processing-x:statements] INFO
org.apache.solr.search.SolrIndexSearcher *null* - Opening
[Searcher@75f8f3c5[statements]
realtime]
2017-04-07 07:49:04.054
[recoveryExecutor-9-thread-1-processing-x:statements] INFO
Glad to hear it's working. The trick (as you've probably discovered)
is to properly
map the meta-data to Solr fields. The extracting request handler does
this, but the
real underlying issue is that there's no real standard. Word docs
might have "last_editor",
PDFs might have just "author". And on
Got it all working with Tika and SolrJ. (Got the correct artifacts). Much
faster now too which is good. Thanks very much for your help.
On 3/1/2017 6:59 PM, Phil Scadden wrote:
> Exceptions never triggered but metadata was essentially empty except
> for contentType, and content was always an empty string. I don’t know
> what parser was doing, but I gave up and with the extractHandler route
> instead which did at least build a full
Belay that. I found out why parser was just returning empty data - I didn’t
have the right artefact in maven. In case anyone else trips on this:
<dependency>
  <groupId>org.apache.tika</groupId>
  <artifactId>tika-core</artifactId>
  <version>1.12</version>
</dependency>
<dependency>
  <groupId>org.apache.tika</groupId>
  <artifactId>tika-parsers</artifactId>
  ...
</dependency>
>Another side issue: Using the extracting handler for handling rich documents
>is discouraged. Tika (which is what is used by the extracting
>handler) is pretty amazing software, but it has a habit of crashing or
>consuming all the heap memory when it encounters a document that it doesn't
The logging is coming from the application, which is running in Tomcat. Solr
itself is running in the embedded Jetty.
And yes, another look at the log4j config shows that the rootlogger is set to
DEBUG. I've changed that.
>On the Solr server side, the 6.4.x versions have a bug that causes extremely
getPath());
> up.setParam("literal.location", idString);
> up.setParam("literal.access",access.toString());
> up.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
> solr.request(up);
>
> All the logging generated by la
up.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
solr.request(up);
All the logging generated by last line. I don’t have any httpclient.wire lines
in my log4j.properties (presume these are from httpclient.wire). What do I do
to turn this off?
Phil Scadden,
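A sketch of silencing that output in log4j.properties (these logger names follow Apache HttpClient's logging conventions; which one applies depends on the HttpClient version on your classpath):

```
# Quiet HttpClient wire/header logging (4.x and 3.x logger names)
log4j.logger.org.apache.http=WARN
log4j.logger.org.apache.http.wire=WARN
log4j.logger.httpclient.wire=WARN
```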
Thanks Alexandre,
I was on a load server, so I couldn't change any code, but I just enabled debug
logging from the Admin UI and I was able to see the query in the solr log file,
which I turned off after 5 minutes. Thanks again for the full list of options
available for various scenarios.
Regards,
Prateek
st Handler
will add itself? Or just what the calling URL has. Logging at different
steps will give you different answers here. Also, some parameters can be
sent in the request body, not just in the URL.
Now in terms of options:
One answer is that logging should be happening in the middleware that tal
If you enabled the logging for org.apache.solr.core you should be fine.
You can also go more fine grained if you don't need part of the logs.
Just remember that the UI will show only from the warning level.
If you want to see the query log you need to access the log files.
N.B. in a production
debug logs for org.apache.solr package (java) and enable all
logging levels from solr admin UI (image attached). I am hoping there should be
a simple way for achieving this and something silly is what I am missing here.
Regards,
Prateek Jain
Team: Totoro
> > s.c.S.Request [sial-catalog-material_shard1_replica1] webapp=/solr
> > path=/cdcr params={qt=/cdcr&action=BOOTSTRAP_STATUS&wt=javabin&version=2}
> > status=0 QTime=0
>
> I hadn't looked closely at the messages you were seeing in your logs
> until now.
>
> These messages
> Sent: Tuesday 10th January 2017 15:10
> To: solr-user@lucene.apache.org
> Subject: RE: Debug logging in Maven project
Indeed, there were some changes recently but i also can't get logging to work
on older versions such as 6.0.
Thanks,
Markus
-Original message-
> From:Pushkar Raste <pushkar.ra...@gmail.com>
> Sent: Tuesday 10th January 2017 14:53
> To: solr-user@lucene.apache.or
Seems like you have enabled only the console appender. I remember there was a
change made to disable the console appender if Solr is started in background
mode.
On Jan 10, 2017 5:55 AM, "Markus Jelsma" <markus.jel...@openindex.io> wrote:
> Hello,
>
> I used to enable debug logg
Hello,
I used to enable debug logging in my Maven project's unit tests by just setting
log4j's global level to DEBUG, very handy, especially in debugging some Solr
Cloud startup issues. For a while now, not sure how long, I haven't been able
to get any logging at all. This project depends
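The kind of test-scope configuration described here, as a minimal log4j 1.x sketch (file location and pattern are illustrative; note that newer Solr versions moved to Log4j 2, which reads log4j2.xml instead, so a 1.x properties file would be silently ignored there):

```
# Sketch: src/test/resources/log4j.properties forcing everything to DEBUG
log4j.rootLogger=DEBUG, CONSOLE
log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=%-5p (%t) %c{1} %m%n
```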
> o.a.s.c.S.Request [sial-catalog-material_shard1_replica1] webapp=/solr
> path=/cdcr params={qt=/cdcr&action=BOOTSTRAP_STATUS&wt=javabin&version=2}
> status=0 QTime=0
I hadn't looked closely at the messages you were seeing in your logs
until now.
These messages are *request* logging. This is the same cod
On 1/6/2017 8:21 AM, Webster Homer wrote:
I figured our problem with the filesystem, by default the root logger is
configured with the CONSOLE logger, which is NOT rotated and eventually
filled up the file system. That doesn't exonerate the CDCR logging problem
though. The thing writes a huge amount of junk to the logs, information