The memory error is a result of how HTTP handles file uploads.

The entire file is read into memory first, and only then does the
server-side app (CFFILE in this case) transfer it from memory to disk.

As a result, if your server doesn't have the physical memory to handle a
large file, or if that memory is temporarily tied up by another process,
you'll get the "out of resources" error.

We put 512 Meg on our server, which runs CF 4.5.2 and SQL 2000.  The
largest file we can upload in this manner is 68 Meg.

HTH

Shawn Grover.

-----Original Message-----
From: Matt Lewis [mailto:[EMAIL PROTECTED]]
Sent: Monday, August 20, 2001 9:07 PM
To: CF-Talk
Subject: Loading extremely large files via CFFILE


Hi All.

I'm trying to load an extremely large file via CFFILE.  This file is the
RDF content file from the Open Directory Project.

I've already built the script which imports the file, loads it into a set
of structures, and then strips out the URLs and inserts them into a
database.  This script has been tested with a subset of the entire RDF
file.
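
(Very roughly, with placeholder paths, datasource and table names, and
skipping the structure-building step, the read-and-insert part boils down
to something like this:)

  <!--- this single read pulls the whole RDF dump into one string,
        which is where all the memory goes --->
  <cffile action="READ"
          file="c:\odp\content.rdf"
          variable="rdfData">

  <!--- walk it line by line and insert anything that looks like an
        ExternalPage URL --->
  <cfloop list="#rdfData#" index="thisLine" delimiters="#Chr(10)#">
      <cfif FindNoCase("<ExternalPage about=", thisLine)>
          <cfquery name="insUrl" datasource="myDSN">
              INSERT INTO odp_urls (url)
              VALUES ('#ListGetAt(thisLine, 2, Chr(34))#')
          </cfquery>
      </cfif>
  </cfloop>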

However, when I try to run this script with the entire file (approximately
820 Megs in size), I get a memory error.  I've resorted to splitting the
file into smaller segments (10 Megs each), but each segment is still taking
a huge amount of time to load.
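
(In case it matters, the chunk-loading loop is nothing fancier than this;
the segment names are placeholders and the chunks themselves were split out
ahead of time, outside of CF:)

  <cfloop from="1" to="82" index="i">
      <!--- each ~10 Meg segment still gets read fully into memory and
            then run through the same parse-and-insert logic as above --->
      <cffile action="READ"
              file="c:\odp\content_part#i#.rdf"
              variable="rdfChunk">
  </cfloop>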

Can others give me an idea of how they have managed extremely large files 
that need to be loaded into a database via CFFILE?  The file format is too 
complex to simply write a SQL import script.

Thanks.