Hi group,

I'm currently working on an extension to exchange large (huge) files (upload, download) between fe_users and be_users. I want the extension to be as independent of any provider restrictions (PHP limits, Apache limits) as possible.

The upload part is already working fine using the HTML5 File API and chunked uploads. It is possible to upload files of e.g. 2 GB even when the site is hosted on shared webspace (1&1 etc.).

The problem currently is the download part: as the files are not directly accessible, the download is handled through PHP, and there are several things to take care of:

- the file must not be read completely into memory, to stay independent of PHP's "memory_limit" restriction
- the process must not take too long, to stay independent of the "max_execution_time" restriction
- and so on...
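The constraints above essentially call for a loop that reads and emits the file in small chunks, flushing after each one, so PHP never holds more than one chunk in memory. A minimal sketch (the function name, chunk size, and headers are my own assumptions, not from any existing extension):

```php
<?php
// Stream a file to the client in small chunks so the whole file is never
// held in memory. Hypothetical helper; adjust headers to your needs.
// Note: any active output buffers should be disabled at the call site
// (e.g. while (ob_get_level() > 0) { ob_end_clean(); }) or each chunk
// will accumulate in the buffer instead of leaving PHP immediately.
function streamDownload(string $path, int $chunkSize = 8192): void
{
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . (string) filesize($path));
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');

    $handle = fopen($path, 'rb');
    while (!feof($handle)) {
        echo fread($handle, $chunkSize);
        flush(); // hand the chunk to the web server / client right away
    }
    fclose($handle);
}
```

With this pattern, peak memory use stays near the chunk size regardless of file size; whether "max_execution_time" still bites depends on the host, since set_time_limit(0) is disabled on some shared setups.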

Could anybody give me a hint on how best to realize the download process, respecting large files and maximum independence from provider restrictions? There seem to be several possibilities like readfile(), fopen()/fread(), fpassthru() etc., or a mixture of them, but I still haven't found a 100% solid solution.
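One angle that also helps against "max_execution_time" is honouring HTTP Range requests, so an interrupted multi-gigabyte download can resume instead of restarting. A rough sketch (function name and details are my own assumptions; a production version would also validate the range against the file size and reject invalid ranges with 416):

```php
<?php
// Serve a file with basic HTTP Range support so clients can resume
// interrupted downloads. Hypothetical sketch, not a complete implementation.
function serveWithRange(string $path, int $chunkSize = 8192): void
{
    $size  = filesize($path);
    $start = 0;
    $end   = $size - 1;

    // Parse a simple "bytes=start-end" range header, if the client sent one.
    if (isset($_SERVER['HTTP_RANGE']) &&
        preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
        $start = (int) $m[1];
        if ($m[2] !== '') {
            $end = (int) $m[2];
        }
        header('HTTP/1.1 206 Partial Content');
        header("Content-Range: bytes $start-$end/$size");
    }
    header('Accept-Ranges: bytes');
    header('Content-Length: ' . ($end - $start + 1));

    $handle = fopen($path, 'rb');
    fseek($handle, $start);                    // jump to the requested offset
    $remaining = $end - $start + 1;
    while ($remaining > 0 && !feof($handle)) {
        $chunk = fread($handle, min($chunkSize, $remaining));
        echo $chunk;
        $remaining -= strlen($chunk);
        flush();
    }
    fclose($handle);
}
```

Compared to a plain readfile() call, this keeps the chunked fread() loop (so memory stays bounded) while letting download managers and browsers pick up where they left off.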

Thanks in advance and regards, Jan


_______________________________________________
TYPO3-english mailing list
TYPO3-english@lists.typo3.org
http://lists.typo3.org/cgi-bin/mailman/listinfo/typo3-english
