Augusto Camarotti augu...@prpb.mpf.gov.br wrote:
Hi guys,
I'm having a problem with Solr when trying to index some broken .doc
files. I have set up a test case using Solr to index all the files the
users save on the shared directories of the company that I work for,
and Solr is hanging.
So, how do I prevent Solr from hanging when trying to index broken files?
Regards,
Augusto Camarotti
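One common way to keep a single broken document from stalling the whole indexing run is to do the extraction on the client side and give each parse a hard timeout. Below is a minimal JDK-only sketch of that idea; parseDocument is a hypothetical stand-in for the real Tika/extraction call, and the hang is simulated:

```java
import java.util.concurrent.*;

public class TimeoutParse {
    // Hypothetical stand-in for the real extraction call (e.g. Tika);
    // a broken file is simulated by a parse that never returns.
    static String parseDocument(String path) throws InterruptedException {
        if (path.endsWith("broken.doc")) {
            Thread.sleep(Long.MAX_VALUE); // simulate a hang on a corrupt file
        }
        return "text of " + path;
    }

    // Runs the parse in a worker thread and gives up after the timeout,
    // so one bad file cannot stall the whole indexing loop.
    static String parseWithTimeout(String path, long timeoutMs) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<String> job = pool.submit(() -> parseDocument(path));
        try {
            return job.get(timeoutMs, TimeUnit.MILLISECONDS);
        } catch (TimeoutException | InterruptedException | ExecutionException e) {
            job.cancel(true); // interrupt the stuck parse
            return null;      // caller logs and skips this file
        } finally {
            pool.shutdownNow();
        }
    }

    public static void main(String[] args) {
        System.out.println(parseWithTimeout("report.doc", 1000)); // normal file
        System.out.println(parseWithTimeout("broken.doc", 200));  // gives up, returns null
    }
}
```

Files that return null can then be logged and skipped instead of being posted to Solr.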
//Augusto Camarotti - 28-11-2013
//As Tika may parse more than one document in one file, I have to
//append every document Tika parses for me, so I will only append a
//whitespace and wait for new content every time. Otherwise, Solr
//would just get the last document of the file.
into smaller chunks in order to make it useful.
Best
Erick
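The comment above describes concatenating every document Tika emits from one file, separated by whitespace, so later content does not overwrite earlier content. A minimal JDK-only sketch of that accumulation (the DocumentCollector name is my own, not Solr's):

```java
public class DocumentCollector {
    private final StringBuilder body = new StringBuilder();

    // Called once per document Tika finds inside the file; a single
    // space keeps the documents from running together while letting
    // Solr index everything as one field value.
    void append(String documentText) {
        if (body.length() > 0) {
            body.append(' ');
        }
        body.append(documentText);
    }

    String fieldValue() {
        return body.toString();
    }

    public static void main(String[] args) {
        DocumentCollector c = new DocumentCollector();
        c.append("first embedded document");
        c.append("second embedded document");
        System.out.println(c.fieldValue());
        // prints: first embedded document second embedded document
    }
}
```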
On Fri, Jan 27, 2012 at 3:43 AM, Augusto Camarotti
augu...@prpb.mpf.gov.br wrote:
I'm talking about 2 GB files. Does that mean I'll have to allocate
something bigger than that for the JVM? Something like 2.5 GB?
Thanks,
Augusto Camarotti
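If whole files are buffered during extraction, the heap has to hold both the raw file and the extracted text, so comfortably more than the file size may be needed. A hypothetical startup line, assuming Solr's bundled Jetty launcher (the exact flags depend on how your container is started):

```shell
# Illustrative heap sizing for ~2 GB uploads; tune -Xmx to your workload.
java -Xms1g -Xmx4g -jar start.jar
```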
Erick Erickson erickerick...@gmail.com 1/25/2012 1:48 pm
Mostly it depends on your container settings, quite often that's
where
Hi everybody
Does anyone know if there is a maximum file size that can be uploaded to the
ExtractingRequestHandler via HTTP request?
Thanks in advance,
Augusto Camarotti
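The "container settings" Erick mentions above typically include Solr's own multipart upload cap in solrconfig.xml as well as the servlet container's POST limit. A sketch of the Solr side, assuming a standard solrconfig.xml (the 2 GB value is illustrative):

```xml
<!-- solrconfig.xml: raise the multipart upload cap (value in KB; illustrative) -->
<requestDispatcher>
  <requestParsers multipartUploadLimitInKB="2097152" />
</requestDispatcher>
```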