Forget about using CF to read a file that big unless you've got a lot of memory in the server; CF is going to pull the whole file into RAM before it does anything with it. If CF is your only tool for actually getting the data into your database, you might use something like Perl to first break the file up into many smaller files, then run those through CF. If you're a bit more proficient at Perl, just use Perl to read the file a line at a time, insert the data into the db as you go, and be done with it. Rough sketches of both approaches are below.
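If you go the read-and-insert route, the Perl would look something like this. Treat it as a sketch, not gospel: the DSN, credentials, table name, and tab delimiter are all guesses, so swap in whatever matches your actual file and database. The important part is the while loop, which reads one record at a time so the whole 100MB never sits in memory at once:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# These are all placeholders: DSN, credentials, table name, and the
# assumption that fields are tab-delimited. Adjust to the real setup.
my $dbh = DBI->connect('dbi:ODBC:mydsn', 'user', 'password',
                       { RaiseError => 1, AutoCommit => 0 });

# one placeholder per field, 85 fields per record
my $placeholders = join ',', ('?') x 85;
my $sth = $dbh->prepare("INSERT INTO mytable VALUES ($placeholders)");

open my $fh, '<', 'bigfile.txt' or die "Can't open bigfile.txt: $!";
my $count = 0;
while (my $line = <$fh>) {
    chomp $line;
    # assumes every line really does have 85 tab-separated fields
    my @fields = split /\t/, $line, 85;
    $sth->execute(@fields);
    $dbh->commit unless ++$count % 1000;   # commit in 1000-row batches
}
close $fh;
$dbh->commit;
$dbh->disconnect;
print "Inserted $count records.\n";

The batched commits matter on a job this size; committing after every single row will crawl.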
Jim

----- Original Message -----
From: "Chris Giminez" <[EMAIL PROTECTED]>
To: "CF-Talk" <[EMAIL PROTECTED]>
Sent: Thursday, January 24, 2002 5:09 PM
Subject: extracting data from a huge text file

> I'm trying to retrieve data and put it into a database from a text file.
> The text file is over 100MB with (I'm guessing) around 260,000 records,
> each containing 85 fields.
>
> I tried to create a conditional loop at each line break, which works fine
> if I am working on a smaller version of the file, but chokes on the 100MB
> monster. Is there a more efficient way to deal with this other than
> looping through it, or is there a way to break this file up into smaller
> chunks and deal with it piece by piece?
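And for the chunking question at the end there: if you'd rather keep the inserts in CF, a little Perl script along these lines will carve the monster into pieces CF can loop over without choking. The chunk size and file names are just placeholders:

#!/usr/bin/perl
use strict;
use warnings;

my $lines_per_chunk = 10_000;   # tune to whatever CF handles comfortably
open my $in, '<', 'bigfile.txt' or die "Can't open bigfile.txt: $!";
my $chunk = 0;
my $out;
while (my $line = <$in>) {
    # $. is the current input line number; start a new chunk file
    # every $lines_per_chunk lines
    if (($. - 1) % $lines_per_chunk == 0) {
        close $out if $out;
        open $out, '>', sprintf('chunk_%04d.txt', $chunk++)
            or die "Can't write chunk file: $!";
    }
    print {$out} $line;
}
close $out if $out;
close $in;

Then point your CF loop at the chunk files one by one instead of the 100MB original.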