Hi Kamal,
You said: "Recently I observed, after indexing around 11,000
documents, further documents are not getting indexed." How did you
determine that the documents are not being indexed? An error may have
occurred and the indexing process may have broken. Do you see any errors
in your log file? On the other hand h
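If the Solr output is being written to a file, something like:

grep -iE 'SEVERE|Exception' solr.log

should surface any indexing errors (I am assuming a log file named
solr.log here; adjust the name to wherever your output actually goes).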
Hi All
I tried to get the log from the terminal.
I have put the complete log at the end of this email.
In one place in the log I see it is logged as:
Nov 26, 2013 5:46:24 AM org.apache.solr.core.SolrCore execute
INFO: [] webapp=/solr path=/update params={wt=json} status=0 QTime=1
Nov 26, 2013 5:46:25 AM
org.
Thanks Alejandro and Luis.
If I need to see the logs, how can I view them? Are they stored in any
default log files?
I am using the below command to start Apache Solr:
java -Xms64m -Xmx6g -jar start.jar &
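I guess I could redirect the console output to a file when starting, for
example:

java -Xms64m -Xmx6g -jar start.jar > solr.log 2>&1 &

but I am not sure whether that is the recommended way to capture the logs.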
I am using it along with Drupal 7.1.5, and I am trying to find out if it
is a Drupal issue or an Apache Solr issue.
Hello!
Also check your application server logs. Maybe you're trying to index
documents with a syntax error and they are being skipped.
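For example, you could try posting a single test document by hand and
watch the response for errors (this assumes the default example port and
single-core setup, and that your schema's uniqueKey field is "id"):

curl 'http://localhost:8983/solr/update?commit=true' -H 'Content-Type: text/xml' --data-binary '<add><doc><field name="id">test-1</field></doc></add>'

If the document is malformed or violates the schema, Solr should report
the error in the response instead of skipping it silently.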
Regards,
- Luis Cappa
2013/11/26 Alejandro Marqués Rodríguez
> Hi,
>
> In Lucene you are supposed to be able to index up to 274 billion documents
> ( http://lucene.apache.org/core/3_0_3/fileformats.html#Limitations )
Hi,
In Lucene you are supposed to be able to index up to 274 billion documents
( http://lucene.apache.org/core/3_0_3/fileformats.html#Limitations ), so in
Solr it should be something similar. Anyway, the maximum is far bigger than
those 11,000 ;)
Could it be that you are reusing IDs, so the documents are being
overwritten instead of added?
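One way to check this (assuming the default admin handlers are enabled in
your solrconfig.xml) is to compare numDocs and maxDoc, either on the
statistics page at http://localhost:8983/solr/admin/stats.jsp or via the
Luke handler:

curl 'http://localhost:8983/solr/admin/luke?numTerms=0&wt=json'

If maxDoc keeps growing while numDocs stays around 11,000, documents are
most likely being overwritten because of reused IDs rather than rejected.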
Dear All
I am using Apache Solr 3.6.2 with Drupal 7.
Users keep adding their profiles (resumes), and with a cron task from
Drupal the documents get indexed.
Recently I observed that after indexing around 11,000 documents, further
documents are not getting indexed.
Is there any configuration for max documents?
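One way I could double-check the count would be a simple query like
(assuming the default port and single-core setup):

curl 'http://localhost:8983/solr/select?q=*:*&rows=0&wt=json'

and then see whether numFound stops growing at around 11,000, but I am
not sure if that is the best way to verify it.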