Hi,
 
I have a tool that reads a nightly uploaded CSV file and imports the data
into a MySQL database. This works fine, after some messing around with the
code to allow for consecutive blank fields in the CSV and stuff like that.
 
Only problem is, some of these CSV files can be quite large, up to 8 MB with
maybe 50,000+ records, and these are giving the CF server a bit of a
headache.
 
Is there a way I can split the CSV file after upload into, let's say, eight
1 MB files instead, and then process each one in turn? Maybe using CFFILE to
read a limited number of rows, write them into another file, and then delete
them from the original. I can't quite see how to put that together, though.
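
Roughly what I'm picturing is something like the sketch below (completely
untested, and the file paths and chunk size are just made up for
illustration):

<!--- Rough sketch: read the uploaded CSV, then write it back out in
      chunks of chunkSize lines so each piece can be imported on its
      own. Paths and chunk size here are only placeholders. --->
<cffile action="read"
        file="#expandPath('uploads/nightly.csv')#"
        variable="csvData">

<cfset lines = listToArray(csvData, chr(10))>
<cfset chunkSize = 6000>
<cfset buffer = "">
<cfset chunkNumber = 1>

<cfloop from="1" to="#arrayLen(lines)#" index="i">
    <cfset buffer = buffer & lines[i] & chr(10)>
    <!--- Every chunkSize lines, or at the end of the file, flush the
          buffer out to its own chunk file --->
    <cfif i MOD chunkSize EQ 0 OR i EQ arrayLen(lines)>
        <cffile action="write"
                file="#expandPath('uploads/chunk_' & chunkNumber & '.csv')#"
                output="#buffer#">
        <cfset buffer = "">
        <cfset chunkNumber = chunkNumber + 1>
    </cfif>
</cfloop>

That would just write the chunks out and leave the original file alone
rather than deleting rows from it, and if there's a header row it would only
end up in the first chunk, so I'm not sure it's the right approach.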
 
Cheers
 
Will

