a unix environment though
--
View this message in context:
http://lucene.472066.n3.nabble.com/Splitting-index-created-from-a-csv-using-solr-tp4018195p4018427.html
Sent from the Solr - User mailing list archive at Nabble.com.
On 6 November 2012 10:52, mitra mitra.re...@ornext.com wrote:
> Thanks for the reply, Gora.
> I just wanted to know if Solr could do it by itself; from your answer I
> can see it is not possible.

Yes, this is not a common use case.

> So what do you think is the best way to split it, I mean should
> of the csv
> Also, coming to index settings, what should be the optimal value of
> autocommit maxDocs and maxTime for the 10 GB csv file? It has around 28
> million records.
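[On the autocommit question, for reference: autoCommit lives in solrconfig.xml. A minimal sketch, assuming a Solr 3.x/4.x-style config; the numbers are placeholders to illustrate the two knobs, not tuned recommendations for a 28-million-record load:]

```xml
<!-- In solrconfig.xml: commit automatically once either limit is reached.
     Values below are illustrative, not a recommendation. -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxDocs>100000</maxDocs>  <!-- commit after this many pending docs -->
    <maxTime>60000</maxTime>   <!-- or after this many milliseconds -->
  </autoCommit>
</updateHandler>
```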
On 5 November 2012 11:11, mitra mitra.re...@ornext.com wrote:
> Hello all,
> I have a csv file of size 10 GB which I have to index using Solr.
> My question is: how do I index the csv in such a way that
> I can get two separate index files, of which one of the indexes is the
> index for the first half of

I would use the Unix split command. You can give it a line count.

% split -l 1400 myfile.csv

You can use wc -l to count the lines.

wunder
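[To make that concrete, a minimal sketch of halving a file by line count — the file and prefix names are illustrative, and a tiny sample file stands in for the real 10 GB CSV:]

```shell
# Tiny sample file standing in for the 10 GB CSV (illustrative names).
printf 'id,name\n1,a\n2,b\n3,c\n4,d\n' > myfile.csv

total=$(wc -l < myfile.csv)        # 5 lines, including the header row
half=$(( (total + 1) / 2 ))        # round up so the two parts cover every line
split -l "$half" myfile.csv part_  # writes part_aa (3 lines) and part_ab (2 lines)
```

[One caveat: split knows nothing about CSV headers, so only the first chunk keeps the header row; the second chunk's field names would have to be supplied some other way when indexing, e.g. via the CSV update handler's header/fieldnames parameters.]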
On Nov 4, 2012, at 10:23 PM, Gora Mohanty wrote: