What happens is I get a message from Adobe Reader "The file is damaged 
and could not be repaired."
It just quits sending data and everything stops normally - other than a 
truncated file.

The limit seems to be about 26,000,000 bytes.

AFAIK, a CGI module outputs its data to stdout, which the server (Apache) 
streams into the TCP connection.
TWebModule streams Response.ContentStream to stdout - it should not all 
be in memory at the same time.
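For what it's worth, the idea is the same as any buffered CGI copy loop. Here's a minimal C sketch (C rather than Delphi, just for illustration - the hypothetical copy_to_stream below is not the TApacheResponse code, which reportedly does the equivalent internally with its own 8 KB buffer):

```c
#include <stdio.h>

/* Copy an open file to an output stream in fixed-size chunks,
 * the way a CGI program streams a response body to stdout.
 * Only one buffer's worth is ever in memory at a time, so the
 * file size doesn't matter. Returns total bytes written. */
long copy_to_stream(FILE *in, FILE *out)
{
    char buf[8192];               /* 8 KB buffer, as in SendStream */
    size_t n;
    long total = 0;

    while ((n = fread(buf, 1, sizeof buf, in)) > 0) {
        /* the last read is usually partial - write only the
         * bytes actually read, not the whole buffer */
        fwrite(buf, 1, n, out);
        total += n;
    }
    return total;
}
```

In a real CGI program you would call it as copy_to_stream(pdfFile, stdout) after emitting the Content-Type and Content-Length headers.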

Doug


Rob Kennedy wrote:
> jumpsfromplanes wrote:
>   
>> It is probably a Windows/memory problem. You just can't make file streams 
>> that big.
>>     
>
> 20 MB? Of course you can.
>
>   
>> Your filestream code tries to swallow the entire PDF file whole into memory 
>> and *then* send it via HTTP to the client.
>>     
>
> Doug's file-stream code doesn't do anything with the contents of the 
> file. Where do you see that? He opens the file and gives the stream to 
> the TWebResponse. TFileStream never caches the contents of the file.
>
>   
>> If you try to read a file that is too large, your filestream read code will 
>> fail.
>>     
>
> Trying to allocate too much memory will yield an EOutOfMemory exception. 
> Has that happened in this case?
>
>   
>> You probably should use a small (16-64K) buffer and "loop" through the PDF 
>> file, one buffer load at a time. Send the buffer to the client and repeat. A 
>> small buffer created as a local variable will work fine. Using a buffer lets 
>> you work on files of any size - multiple gigabytes, if necessary.
>>
>> Remember that the last buffer load is probably only partially filled - you 
>> will have to count the number of actual bytes read into the buffer and make 
>> sure you only send that many to the client. Yeah, I know this is the old 
>> school way to do file transfers, but it actually works with (very) large 
>> files.
>>     
>
> Notice that Doug didn't *write* any code for copying the stream. Nor 
> should he. The SendStream methods of TApacheResponse and 
> TApacheTwoResponse copy the stream contents already. They use an 8 KB 
> buffer.
>
>   
>> I have a similar problem with a ^...@#&! web browser unit/component I use to 
>> spider an SSL-secured web site and download files. The guy that wrote the 
>> component also used TFileStream, and it bombs on files larger than about 
>> 20MB - great planning in this day and age of large files. I have a bunch of 
>> 2-4 GB files to download, so the component is useless to me until I get 
>> around to rewriting the idiot thing. :-(
>>     
>
> You and Doug should be more specific about what happens when the file is 
> larger than 20 MB. Doug says the file "won't correctly transfer," and 
> you say you have a "similar problem." What's incorrect about the 
> transfer? How much gets sent? How much arrives? Does Apache log anything 
> about the request? Is the ContentLength header correct?
>
>   

