Simon Slavin was thinking very hard:

> On 3 Jan 2011, at 9:08pm, GS wrote:
>
>> Looping through the array gives me access to each record if I start the
>> loop at vaDataArray(1), thus the loop parameters of '1 To
>> UBound(vaDataArray)'.
>
> I have twice suggested you do not do this. Holding the entire data in memory
> at the same time will take a lot of memory and cause your program to run
> slowly. If all you are doing is turning a CSV file into SQL you never need
> to hold the entire file in memory.
>
> Read one line of your data from the file into an array. Generate one INSERT
> command to write it to the database. Then reuse the same array to read in
> the next line of your data: you don't need the previous line any more. This
> should make your application run many times faster than it did before.
>
> Simon.
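[For readers following along: the thread is about ClassicVB, but here is a minimal sketch of the streaming approach Simon describes, written in Python against the sqlite3 module. The file name, table name, and column count are hypothetical; the point is simply that only one row is ever held in memory.]

```python
import csv
import sqlite3

# Sketch of the one-row-at-a-time import: read a line, emit one INSERT,
# then let the previous row be discarded. Names below are hypothetical.
conn = sqlite3.connect("example.db")
conn.execute("CREATE TABLE IF NOT EXISTS data (col1 TEXT, col2 TEXT, col3 TEXT)")

with open("input.csv", newline="") as f:
    reader = csv.reader(f)
    for row in reader:  # one line at a time; the whole file is never loaded
        conn.execute(
            "INSERT INTO data (col1, col2, col3) VALUES (?, ?, ?)", row
        )

conn.commit()
conn.close()
```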
Actually, my apps run lightning fast using an array to hold the complete
file. What I don't do (as I suggested) is use arrays when dealing with large
amounts of data. As stated, I use ADO to parse the file into recordsets.
This is much faster than doing it one line at a time! <g>

--
Garry

Free usenet access at http://www.eternal-september.org
ClassicVB Users Regroup! comp.lang.basic.visual.misc
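[A side note on the speed comparison: with SQLite, committing once per file rather than once per INSERT is usually the dominant cost, whichever way the rows are read. The sketch below, again in Python with hypothetical file and table names, shows the streaming import batched through a single transaction; it is only an illustration, not a claim about what the ADO/recordset code does.]

```python
import csv
import sqlite3

# Sketch: the same streaming import, batched via executemany inside one
# transaction. Names are hypothetical.
conn = sqlite3.connect("example.db")
conn.execute("CREATE TABLE IF NOT EXISTS data (col1 TEXT, col2 TEXT, col3 TEXT)")

with open("input.csv", newline="") as f, conn:  # 'with conn' commits once on success
    conn.executemany(
        "INSERT INTO data (col1, col2, col3) VALUES (?, ?, ?)",
        csv.reader(f),  # executemany consumes the reader row by row
    )

conn.close()
```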

