Hi - instead of trying to make the system ingest such large files, perhaps you
can split them into many smaller pieces.
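
For example, here is a rough sketch in Python of what I mean (assuming
Python 3 and curl on the PATH; the URL and form parameters are taken from
your command below, while the chunk size and the eighth_partNNN file names
are just illustrative). It splits the big file into pieces that each repeat
the header row, then posts them one at a time:

import subprocess

SOLR_URL = "http://localhost:8080/solr/update/csv"
SOURCE = r"D:\eighth.csv"
ROWS_PER_CHUNK = 1_000_000  # tune so each piece stays in the 100s of MB

def write_chunk(part, header, rows):
    # Repeat the header in every piece so Solr can still map columns to fields.
    path = r"D:\eighth_part%03d.csv" % part
    with open(path, "w", encoding="utf-8", newline="") as out:
        out.write(header)
        out.writelines(rows)
    return path

def post_chunk(path, commit):
    # Mirrors the curl call below; stream.file makes Solr read the file from
    # local disk, so the HTTP request itself stays small.
    args = ["curl", SOLR_URL,
            "-F", "stream.file=" + path,
            "-F", "keepEmpty=true"]
    if commit:
        args += ["-F", "commit=true"]  # commit once, on the last piece only
    subprocess.check_call(args)

chunks, rows, part = [], [], 0
with open(SOURCE, encoding="utf-8", newline="") as src:
    header = src.readline()
    for line in src:
        rows.append(line)
        if len(rows) >= ROWS_PER_CHUNK:
            part += 1
            chunks.append(write_chunk(part, header, rows))
            rows = []
    if rows:  # trailing partial piece
        part += 1
        chunks.append(write_chunk(part, header, rows))

for i, path in enumerate(chunks):
    post_chunk(path, commit=(i == len(chunks) - 1))

Two caveats: this line-based split assumes no quoted field contains an
embedded newline (otherwise split with the csv module instead), and I would
drop optimize=true entirely, or run it at most once after the final commit -
optimizing after every chunk rewrites the whole index each time. On an 8 GB
machine you can also leave commit=true off altogether and let the autoCommit
maxDocs/maxTime settings in solrconfig.xml handle flushing.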
 
-----Original message-----
> From: mitra <mitra.re...@ornext.com>
> Sent: Tue 13-Nov-2012 09:05
> To: solr-user@lucene.apache.org
> Subject: Solr Indexing MAX FILE LIMIT
> 
>  Hello Guys
> 
> I'm using Apache Solr 3.6.1 on Tomcat 7 to index CSV files with curl on a
> Windows machine.
> 
> ** My question is: what is the max CSV file size limit when doing
> an HTTP POST, or when using the following curl command?
> curl http://localhost:8080/solr/update/csv -F "stream.file=D:\eighth.csv" -F
> "commit=true" -F "optimize=true" -F "encapsulate="" -F "keepEmpty=true"
> 
> ** My requirement is quite large because we have to index CSV files ranging
> from 8 to 10 GB.
> 
> ** What would be the optimum settings for index parameters like commit, for
> better performance on a machine with 8 GB of RAM?
> 
> Please guide me on this.
> 
> Thanks in Advance
> 
> 
> 
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Solr-Indexing-MAX-FILE-LIMIT-tp4019952.html
> Sent from the Solr - User mailing list archive at Nabble.com.
> 
