Re: CSV parsing (large files)

2008-07-30 Thread Stephen Hoffman
From: Jacob Bandes-Storch I've got several large-size CSV files (a total of about 1.25 million lines, and an average of maybe 10 or so columns) that I want to parse into a 2D array. I found some parsing code that uses NSScanner, and it works fine with small files, but it's very resource-intensive …

Re: CSV parsing (large files)

2008-07-30 Thread Andreas Monitzer
On Jul 30, 2008, at 08:14, Simone Tellini wrote: Keep a prepared SQLite insert statement and reuse it for all the lines, binding the parameters for each line. Don't load the whole file in memory: it's just a waste of memory and time to allocate it. Instead parse a line at a time. mmap() …
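
A minimal sketch of the streaming approach Simone Tellini describes, using the SQLite C API directly from an Objective-C/C source file. The table name "rows", the three-column layout and the naive comma split are assumptions for illustration only; a real CSV importer also has to cope with quoted fields. The point is that one INSERT is prepared once and then bound, stepped and reset for every line read with fgets(), inside a single transaction:

#include <sqlite3.h>
#include <stdio.h>
#include <string.h>

static void ImportCSV(const char *csvPath, sqlite3 *db)
{
    /* Hypothetical 3-column table; adjust the SQL to the real schema. */
    sqlite3_stmt *stmt = NULL;
    sqlite3_prepare_v2(db, "INSERT INTO rows VALUES (?1, ?2, ?3)", -1, &stmt, NULL);

    FILE *fp = fopen(csvPath, "r");
    if (!fp) { sqlite3_finalize(stmt); return; }

    char line[4096];                                   /* assumes lines shorter than the buffer */
    sqlite3_exec(db, "BEGIN", NULL, NULL, NULL);       /* one transaction for the whole import */

    while (fgets(line, sizeof(line), fp)) {
        line[strcspn(line, "\r\n")] = '\0';            /* strip the trailing newline */

        /* Naive split on commas; quoted fields are not handled in this sketch. */
        char *cursor = line, *field;
        int col = 0;
        while (col < 3 && (field = strsep(&cursor, ",")) != NULL)
            sqlite3_bind_text(stmt, ++col, field, -1, SQLITE_TRANSIENT);

        sqlite3_step(stmt);                            /* run the insert for this line */
        sqlite3_reset(stmt);                           /* reuse the same prepared statement */
        sqlite3_clear_bindings(stmt);
    }

    sqlite3_exec(db, "COMMIT", NULL, NULL, NULL);
    fclose(fp);
    sqlite3_finalize(stmt);
}

Wrapping the whole import in one BEGIN/COMMIT matters almost as much as the statement reuse, since SQLite otherwise commits (and syncs to disk) after every single row.
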

Re: CSV parsing (large files)

2008-07-29 Thread Simone Tellini
On Jul 30, 2008, at 06:55, Jacob Bandes-Storch wrote: I've got several large-size CSV files (a total of about 1.25 million lines, and an average of maybe 10 or so columns) that I want to parse into a 2D array. I found some parsing code that uses NSScanner, and it works fine with small files …

CSV parsing (large files)

2008-07-29 Thread Jacob Bandes-Storch
I've got several large-size CSV files (a total of about 1.25 million lines, and an average of maybe 10 or so columns) that I want to parse into a 2D array. I found some parsing code that uses NSScanner, and it works fine with small files, but it's very resource-intensive and slow with large files …
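
For reference, a minimal sketch of the line-at-a-time alternative suggested in the replies above, written for the pre-ARC Cocoa of 2008. It assumes plain comma-separated fields with no quoted commas, reads the file with fgets() instead of loading it whole, and drains an autorelease pool periodically so the temporaries from 1.25 million lines don't accumulate:

#import <Foundation/Foundation.h>
#include <stdio.h>
#include <string.h>

NSMutableArray *ParseCSVFile(NSString *path)
{
    NSMutableArray *rows = [NSMutableArray array];     /* the 2D result: an array of row arrays */
    FILE *fp = fopen([path fileSystemRepresentation], "r");
    if (!fp) return rows;

    char buffer[8192];                                 /* assumes lines shorter than the buffer */
    NSUInteger count = 0;
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    while (fgets(buffer, sizeof(buffer), fp)) {
        buffer[strcspn(buffer, "\r\n")] = '\0';        /* strip the trailing newline */

        /* Naive comma split; quoted fields containing commas are not handled. */
        NSString *line = [NSString stringWithUTF8String:buffer];
        if (line)
            [rows addObject:[line componentsSeparatedByString:@","]];

        if (++count % 10000 == 0) {                    /* bound the temporary-object memory */
            [pool drain];
            pool = [[NSAutoreleasePool alloc] init];
        }
    }

    [pool drain];
    fclose(fp);
    return rows;
}

The 10,000-line drain interval is an arbitrary choice; the idea is simply to bound peak memory while still ending up with the array-of-arrays the original post asks for.
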