Re[2]: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread Richard Davey
Hello rouvas,

Tuesday, December 14, 2004, 2:45:16 PM, you wrote:

r> Not to the web root, but to an arbitrary named on-the-fly created dir
r> protected with a *custom* (and different for each dir) .htaccess file (and
r> accompanying htpasswd entries). Then, there would be no single pass to share.

Yes, this is a possibility for sure - as long as the actual file itself
can be shared out (a symlink perhaps?), because I sure as heck don't want
to have to create a separate folder with a 300MB file in it for every
customer. We'd fill our hard drives within hours :)

r> (a) PHP is slower than Apache

Absolutely.

r> (b) Apache can cache the 1MB files, at least some of them, and serve them to
r> the next client

Good point. I would love to ease up the strain on the servers somehow.
I spent some time today adding the ability for the client download app
to grab each chunk from a different server if need be (we have three
file servers), which at least will distribute the load somewhat.
Your suggestions were appreciated, thank you.
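
Spreading the chunks across the mirrors only takes something along these
lines on our side (just a sketch - the hostnames and the $chunk_index
variable are placeholders, and in our case the client app does the
actual picking):

<?php
// pick a file server for a given chunk so the load spreads across mirrors
$mirrors = array('http://dl1.example.com',
                 'http://dl2.example.com',
                 'http://dl3.example.com');
$base = $mirrors[$chunk_index % count($mirrors)];
// the client app then requests "$base/getchunk.php?chunk=$chunk_index"
?>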

Best regards,

Richard Davey
-- 
 http://www.launchcode.co.uk - PHP Development Services
 "I am not young enough to know everything." - Oscar Wilde

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread Greg Donald
On Mon, 13 Dec 2004 13:22:52 -0800 (GMT-08:00), Bruce Douglas
<[EMAIL PROTECTED]> wrote:
> you might also look into 'bit torrent'...

You must have missed the part about 'cannot have this file in a
"public" location'.

>Just thought I would pick the collective brain on this one. I have
>a requirement to deliver a large EXE file to customers after they
>order. The file is just under 400 MB in size and, because they have
>just purchased it, I obviously cannot have this file in a "public"
>location on the web server that someone could browse to.
> 
>I can push the file out quite easily using a modified header and a
>simple script to check if they can download it or not, but with
>such a large file a significant number of web browsers fail to
>obtain the entire EXE before closing - or any other number of
>factors kick into play (their PC resets, ISP disconnects, Windows
>crashes, etc).

Use set_time_limit(0); to prevent the timeout.  ignore_user_abort() is
pretty handy too.

If that doesn't work you might give them authenticated http access
with temporary passwords.  You can have the usernames and passwords in
a db and pass the proper auth headers with PHP.
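
Something like this is the rough idea (untested sketch - the database,
table and column names are all made up):

<?php
// download.php - check HTTP Basic credentials against a MySQL table
set_time_limit(0);          // no script timeout for the long transfer
ignore_user_abort(true);    // keep running even if the client drops out

if (!isset($_SERVER['PHP_AUTH_USER'])) {
    header('WWW-Authenticate: Basic realm="Customer Downloads"');
    header('HTTP/1.0 401 Unauthorized');
    exit('Authorization required');
}

$db = mysql_connect('localhost', 'dbuser', 'dbpass');
mysql_select_db('shop', $db);

$user = mysql_real_escape_string($_SERVER['PHP_AUTH_USER']);
$pass = mysql_real_escape_string($_SERVER['PHP_AUTH_PW']);
$res  = mysql_query("SELECT id FROM download_users
                     WHERE username = '$user'
                     AND password = MD5('$pass')
                     AND expires > NOW()", $db);

if (mysql_num_rows($res) != 1) {
    header('HTTP/1.0 403 Forbidden');
    exit('Invalid or expired credentials');
}

// credentials OK - send the file (or chunk) from here
?>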

>Some browsers support resuming download, but not when the file has
>been sent via the headers I use, also FTP is not an option as I
>cannot create and destroy FTP users on the server easily (or for
>that matter assume the customer knows how to perform FTP
>operations).

I feel your pain.

>I'm also aware that it's not such a hot idea to lock-up Apache for
>the time it takes to download the whole file, especially with a
>large number of users doing this.

Apache 2 is pretty good with multiple threads from what I hear.  I use
it but not in a production environment.


-- 
Greg Donald
Zend Certified Engineer
http://gdconsultants.com/
http://destiney.com/

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: Re[2]: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread James Stewart
On Dec 14, 2004, at 8:53 AM, Richard Davey wrote:
> Tuesday, December 14, 2004, 1:33:07 PM, you wrote:
> r> Why don't you take the PHP out of the loop entirely?
> r> Make a dir into the Apache area with a custom .htaccess
> r> (with usernames/passwords, etc) and put the required files there.
> Then the files have to be within the web root and it'll only take one
> person to share out the username/password. It needs controlling as to
> who can download and how many times. PHP has to be in the loop
> somewhere (although granted, not for the actual file delivery).

How about using PHP to update a database of username/password pairs
and then using something like Apache's mod_auth_mysql to authenticate a
user logging in against the database?

You could then write a MySQL procedure to automatically update a field
showing that the user has logged in X times, and use cron to
periodically remove users (or change their group, etc.) who have used up
their logins, or parse the server logs periodically to extract the same
information.
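
The cron side could be as simple as this (a sketch only - the table
layout and the assumption that mod_auth_mysql reads the same table are
things you'd adapt to your own schema):

<?php
// expire_users.php - run periodically from cron
$db = mysql_connect('localhost', 'dbuser', 'dbpass');
mysql_select_db('shop', $db);

// remove users who have used up their allowed downloads...
mysql_query("DELETE FROM auth_users
             WHERE login_count >= allowed_logins", $db);

// ...or, less drastically, move them into a group with no access
// mysql_query("UPDATE auth_users SET groups = 'expired'
//              WHERE login_count >= allowed_logins", $db);
?>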

James.
--
James Stewart : Freelance Web Developer
Work : http://jystewart.net
Play : http://james.anthropiccollective.org
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread rouvas
On Tuesday 14 December 2004 15:53, Richard Davey wrote:
> Hello rouvas,
>
> Tuesday, December 14, 2004, 1:33:07 PM, you wrote:
>
> r> Why don't you take the PHP out of the loop entirely?
> r> Make a dir into the Apache area with a custom .htaccess
> r> (with usernames/passwords, etc) and put the required files there.
>
> Then the files have to be within the web root and it'll only take one
> person to share out the username/password.

Not to the web root, but to an arbitrarily named, on-the-fly created dir
protected with a *custom* (and different for each dir) .htaccess file (and
accompanying htpasswd entries). Then there would be no single password to
share. You can even make it time-limited, so that it expires after a
predefined period. And anyway, what's stopping the user from sharing the
file after it has been downloaded onto their machine?
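
To give an idea of what I mean, something like this would run once per
order (only a sketch - the paths, the file name and the $order_id
variable are examples, and you'd want proper error checking and better
randomness):

<?php
// create a one-off password-protected dir for a single customer
$token = md5(uniqid(rand(), true));
$dir   = '/var/www/html/dl/' . $token;
mkdir($dir, 0755);

$user = 'cust' . $order_id;
$pass = substr(md5(uniqid(rand(), true)), 0, 8);

// per-dir htpasswd and .htaccess, so there is no shared password
$fp = fopen($dir . '/.htpasswd', 'w');
fwrite($fp, $user . ':' . crypt($pass) . "\n");
fclose($fp);

$fp = fopen($dir . '/.htaccess', 'w');
fwrite($fp, "AuthType Basic\n"
          . "AuthName \"Your download\"\n"
          . "AuthUserFile $dir/.htpasswd\n"
          . "Require valid-user\n");
fclose($fp);

// symlink the one big file rather than copying 300MB per customer
symlink('/data/games/big_game.exe', $dir . '/big_game.exe');

// a cron job can then delete dirs older than the expiry period
?>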

> It needs controlling as to
> who can download and how many times. PHP has to be in the loop
> somewhere (although granted, not for the actual file delivery).

Sure, you need to control it. But you need to control when, how (and if)
the file gets to the client, not what gets served or from where, which in
my mind calls for something on the client side along the lines of your
program.

>
> r> From the thread I understood that you don't split the file into smaller
> r> chunks, but instead serve chunks from the same big file. This is bad
> r> practice, as I've found out from personal experience. It is better to
> r> serve small files as they finish earlier and free the server processes.
>
> What's the difference between serving a small 1MB file, and reading in
> 1MB of data from a 300MB file, closing that read operation and then
> outputting the result? I cannot see how actually splitting the file
> into 1MB chunks on the server will make it finish earlier. 1MB of data
> is 1MB of data, regardless how PHP read it in. The only real advantage
> might be in disk seek times however, so PHP wouldn't have to seek into
> the middle of a large file for example.

Assuming that (a) you are sharing the same big file and (b) the number of
users downloading is "significant", then:
(a) PHP is slower than Apache
(b) Apache can cache the 1MB files, at least some of them, and serve them to 
the next client

> r> Also, this would allow users that already have other download
> accelerators r> installed to grab the files.
>
> Download accelerators need a direct link to the file itself. The
> moment we have that, we're back to square one again.

The URL given to the download accelerators could contain authentication info.
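
For example, the link could look like get.php?order=1234&expires=...&sig=...
and the script checks the signature before serving anything (a sketch -
the secret, the parameter names and $order_id are only examples):

<?php
// when building the link after purchase:
$secret  = 'some-long-server-side-secret';
$expires = time() + 86400;   // link valid for 24 hours
$sig     = md5($order_id . '|' . $expires . '|' . $secret);
$url     = "http://www.example.com/get.php?order=$order_id"
         . "&expires=$expires&sig=$sig";

// in get.php (which defines the same $secret), before serving anything:
$ok = ($_GET['sig'] === md5($_GET['order'] . '|' . $_GET['expires'] . '|' . $secret))
      && ($_GET['expires'] > time());
if (!$ok) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
?>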

> If it was that simple then when you buy something like a Symantec

[...snip...]

I don't think it's complicated. BTW, I don't find your solution complicated
either - on the contrary, it is quite straightforward - and to be honest I
don't think there is any reason to change it.
I'm only replying to offer an alternative...

-Stathis

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread Richard Lynch
Richard Davey wrote:
>So I came up with an idea that I'd like your opinions on: I built a
>small but friendly Windows application (<50KB in size) that will
>connect to the web server via HTTPS, check the download credentials
>and if all is ok, it then downloads the file via HTTP in 1MB
>chunks. The file is just a single EXE file sat outside of my web
>root, and the PHP script that serves the file uses fopen() to open
>the file, then fseeks to the required section of it, reads in 1MB
>worth of data, closes the file and then echos this out (after
>suitable headers of course, shown below)

This sounds an awful lot like various web installers.

It's likely that there are pre-existing applications "out there" to do the
same thing as yours.

They might even support interrupted downloads better, or have other
features worth investigating.

For sure, having them be somebody else's code to maintain has its
pros and cons.

It would be worth your time, maybe, to investigate them.  I'd suggest
starting with the traditional installer software vendors whose names you
always see when you install software.

>I'm aware my app is for Windows only (although I could easily port
>it to OS X), but the files they are downloading are PC games
>anyway, so it's no bad thing in this case.

I have been known to download a Windows app on my non-Windows work
computer, and then burn a CD to take it home.

Especially if it's 300MB -- where the bandwidth of the download machine is
more important to the user than the OS on it.

Granted, that's going to be a very very very small minority of users, but
it's something to consider -- sooner or later, you are excluding some user
somewhere by limiting the download application to Windows users.

-- 
Like Music?
http://l-i-e.com/artists.htm

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread rouvas
[...snip...]

> RL> always see when you install software.
>
> That would lock us into a platform specific environment too :) You
> don't run an Install Shield web delivery system by executing the setup
> file on a Mac just because you're at work and can burn it to CD :) I
> was more interested in comments re: the PHP side of things anyway - is
> it better to be spitting out 1MB segments and then letting the process
> finish and Apache free-up that httpd session, or does it make no
> difference to PHP or memory load, etc and we can just blast out 300MB+
> files regardless.

Why don't you take the PHP out of the loop entirely?
Make a dir into the Apache area with a custom .htaccess
(with usernames/passwords, etc) and put the required files there.
Your app can download from there.
From the thread I understood that you don't split the file into smaller 
chunks, but instead serve chunks from the same big file. This is bad 
practice, as I've found out from personal experience. It is better to serve 
small files, as they finish earlier and free the server processes.
Also, this would allow users that already have other download accelerators 
installed to grab the files.

Just my 0.02 euros...

-Stathis

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re[2]: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread Richard Davey
Hello Bruce,

Monday, December 13, 2004, 9:22:52 PM, you wrote:

BD> you might also look into 'bit torrent'...

Not really any use at all in this situation.

Best regards,

Richard Davey
-- 
 http://www.launchcode.co.uk - PHP Development Services
 "I am not young enough to know everything." - Oscar Wilde

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re[2]: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread Richard Davey
Hello rouvas,

Tuesday, December 14, 2004, 1:33:07 PM, you wrote:

r> Why don't you take the PHP out of the loop entirely?
r> Make a dir into the Apache area with a custom .htaccess
r> (with usernames/passwords, etc) and put the required files there.

Then the files have to be within the web root and it'll only take one
person to share out the username/password. It needs controlling as to
who can download and how many times. PHP has to be in the loop
somewhere (although granted, not for the actual file delivery).

r> From the thread I understood that you don't split the file into smaller
r> chunks, but instead serve chunks from the same big file. This is bad
r> practice, as I've found out from personal experience. It is better to serve
r> small files as they finish earlier and free the server processes.

What's the difference between serving a small 1MB file and reading in
1MB of data from a 300MB file, closing that read operation and then
outputting the result? I cannot see how actually splitting the file
into 1MB chunks on the server will make it finish earlier. 1MB of data
is 1MB of data, regardless of how PHP read it in. The only real advantage
might be in disk seek times, so PHP wouldn't have to seek into
the middle of a large file, for example.

r> Also, this would allow users that already have other download accelerators
r> installed to grab the files.

Download accelerators need a direct link to the file itself. The
moment we have that, we're back to square one again.

If it was that simple then, when you buy something like a Symantec
product online, they'd just give you a link to the file. But they
don't - you have to download their package installer app first. Large
game downloads work in a similar way (Direct2Disk, Gigex Download,
etc). I do not believe this is an uncommon practice; I just want my
server to not get hammered.

Best regards,

Richard Davey
-- 
 http://www.launchcode.co.uk - PHP Development Services
 "I am not young enough to know everything." - Oscar Wilde

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re[2]: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread Richard Davey
Hello Greg,

Monday, December 13, 2004, 9:42:30 PM, you wrote:

GD> Use set_time_limit(0); to prevent the timeout.  ignore_user_abort() is
GD> pretty handy too.

Yeah, I have the time-out limit in there already (the client end will
detect a time-out from the server as well).

GD> If that doesn't work you might give them authenticated http access
GD> with temporary passwords. You can have the usernames and passwords
GD> in a db and pass the proper auth headers with PHP.

I did think of this, and it gets around a few issues, but if they are
not running any software to manage the download, and are not using a
decent browser (a la Firefox), we still hit the same "cannot resume"
problem should the download abort.

>>I'm also aware that it's not such a hot idea to lock-up Apache for
>>the time it takes to download the whole file, especially with a
>>large number of users doing this.

GD> Apache 2 is pretty good with multiple threads from what I hear.  I use
GD> it but not in a production environment.

Most of our servers run 1.3 - which is perfectly good, no complaints
there - it's just that HTTP itself was never really designed for
extremely large file downloads, so I am wary of any single
server+browser solution.


Best regards,

Richard Davey
-- 
 http://www.launchcode.co.uk - PHP Development Services
 "I am not young enough to know everything." - Oscar Wilde

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re[2]: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread Richard Davey
Hello Richard,

Tuesday, December 14, 2004, 12:02:19 AM, you wrote:

RL> This sounds an awful lot like various web installers.

Sure, there's nothing unique about the concept. The aim was to reduce
the load on the web server and make things a little easier for the end
user. You can code Install Shield to download from a site before
installing (which a lot of programs do), but it doesn't solve the
server-side problem in this respect.

RL> It's likely that there are pre-existing applications "out there" to do the
RL> same thing as yours.

Yes, but not branded to my needs and I dare say not as compact either.

RL> They might even support interrupted downloads better, or have other
RL> features worth investigating.

The app I built supports interrupted downloads perfectly.

RL> It would be worth your time, maybe, to investigate them.  I'd suggest
RL> starting with the traditional installer software vendors whose name you
RL> always see when you install software.

That would lock us into a platform-specific environment too :) You
don't run an Install Shield web delivery system by executing the setup
file on a Mac just because you're at work and can burn it to CD :) I
was more interested in comments re: the PHP side of things anyway - is
it better to be spitting out 1MB segments and then letting the process
finish and Apache free up that httpd session, or does it make no
difference to PHP or memory load, etc., and we can just blast out 300MB+
files regardless?

Best regards,

Richard Davey
-- 
 http://www.launchcode.co.uk - PHP Development Services
 "I am not young enough to know everything." - Oscar Wilde

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread Bruce Douglas
hi..

you might also look into 'bit torrent'...

peace..


-Original Message-
From: Richard Davey <[EMAIL PROTECTED]>
Sent: Dec 13, 2004 11:53 AM
To: [EMAIL PROTECTED]
Subject: [PHP] Delivering large files via PHP (>300MB)

Hi all,

   Just thought I would pick the collective brain on this one. I have
   a requirement to deliver a large EXE file to customers after they
   order. The file is just under 400 MB in size and, because they have
   just purchased it, I obviously cannot have this file in a "public"
   location on the web server that someone could browse to.

   I can push the file out quite easily using a modified header and a
   simple script to check if they can download it or not, but with
   such a large file a significant number of web browsers fail to
   obtain the entire EXE before closing - or any other number of
   factors kick into play (their PC resets, ISP disconnects, Windows
   crashes, etc).

   Some browsers support resuming download, but not when the file has
   been sent via the headers I use, also FTP is not an option as I
   cannot create and destroy FTP users on the server easily (or for
   that matter assume the customer knows how to perform FTP
   operations).

   I'm also aware that it's not such a hot idea to lock-up Apache for
   the time it takes to download the whole file, especially with a
   large number of users doing this.
   
   So I came up with an idea that I'd like your opinions on: I built a
   small but friendly Windows application (<50KB in size) that will
   connect to the web server via HTTPS, check the download credentials
   and if all is ok, it then downloads the file via HTTP in 1MB
   chunks. The file is just a single EXE file sat outside of my web
   root, and the PHP script that serves the file uses fopen() to open
   the file, then fseek()s to the required section of it, reads in 1MB
   worth of data, closes the file and then echoes this out (after
   suitable headers of course, shown below):

   header('Content-Type: application/force-download');
   header('Content-Transfer-Encoding: Binary');
   header("Content-Length: $total_chunksize");
   header("Content-Disposition: attachment; filename=\"$chunkname\"");

   The Windows app performs various checks on the file segments as
   they download and eventually stitches the whole thing back together
   at the end (there is a "resume download" feature so you can come
   back to it at a later time if you need, or your ISP disconnects).

   A quick MD5 file integrity check with the server confirms the file has
   downloaded fully.

   I have tested this out on some massive files across a range of PCs
   and Windows installations and it works perfectly, so I'm happy that
   the Windows side of things is correct. But I would be interested to
   hear people's views on the PHP side of the equation - would it be
   better for Apache to be running PHP scripts that shove out smaller
   1MB chunks as opposed to doing an fpassthru() on a 300MB+ file? Or do
   you think there is another more elegant solution?

   I'm aware my app is for Windows only (although I could easily port
   it to OS X), but the files they are downloading are PC games
   anyway, so it's no bad thing in this case.

Best regards,

Richard Davey
-- 
 http://www.launchcode.co.uk - PHP Development Services
 "I am not young enough to know everything." - Oscar Wilde

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


[PHP] Delivering large files via PHP (>300MB)

2004-12-13 Thread Richard Davey
Hi all,

   Just thought I would pick the collective brain on this one. I have
   a requirement to deliver a large EXE file to customers after they
   order. The file is just under 400 MB in size and, because they have
   just purchased it, I obviously cannot have this file in a "public"
   location on the web server that someone could browse to.

   I can push the file out quite easily using a modified header and a
   simple script to check if they can download it or not, but with
   such a large file a significant number of web browsers fail to
   obtain the entire EXE before closing - or any other number of
   factors kick into play (their PC resets, ISP disconnects, Windows
   crashes, etc).

   Some browsers support resuming download, but not when the file has
   been sent via the headers I use, also FTP is not an option as I
   cannot create and destroy FTP users on the server easily (or for
   that matter assume the customer knows how to perform FTP
   operations).

   I'm also aware that it's not such a hot idea to lock-up Apache for
   the time it takes to download the whole file, especially with a
   large number of users doing this.
   
   So I came up with an idea that I'd like your opinions on: I built a
   small but friendly Windows application (<50KB in size) that will
   connect to the web server via HTTPS, check the download credentials
   and if all is ok, it then downloads the file via HTTP in 1MB
   chunks. The file is just a single EXE file sat outside of my web
   root, and the PHP script that serves the file uses fopen() to open
   the file, then fseek()s to the required section of it, reads in 1MB
   worth of data, closes the file and then echoes this out (after
   suitable headers of course, shown below):

   header('Content-Type: application/force-download');
   header('Content-Transfer-Encoding: Binary');
   header("Content-Length: $total_chunksize");
   header("Content-Disposition: attachment; filename=\"$chunkname\"");

   The Windows app performs various checks on the file segments as
   they download and eventually stitches the whole thing back together
   at the end (there is a "resume download" feature so you can come
   back to it at a later time if you need, or your ISP disconnects).

   A quick MD5 file integrity check with the server confirms the file has
   downloaded fully.
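
   The server side of that check is little more than a call to
   md5_file() (a sketch - in practice the hash is cached rather than
   recalculated over the whole 400MB file on every request):

   <?php
   // verify.php - the client app sends the MD5 of the reassembled file
   $server_md5 = md5_file('/data/outside_webroot/game_setup.exe');
   echo ($_GET['md5'] === $server_md5) ? 'OK' : 'FAIL';
   ?>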

   I have tested this out on some massive files across a range of PCs
   and Windows installations and it works perfectly, so I'm happy that
   the Windows side of things is correct. But I would be interested to
   hear people's views on the PHP side of the equation - would it be
   better for Apache to be running PHP scripts that shove out smaller
   1MB chunks as opposed to doing an fpassthru() on a 300MB+ file? Or do
   you think there is another more elegant solution?

   I'm aware my app is for Windows only (although I could easily port
   it to OS X), but the files they are downloading are PC games
   anyway, so it's no bad thing in this case.

Best regards,

Richard Davey
-- 
 http://www.launchcode.co.uk - PHP Development Services
 "I am not young enough to know everything." - Oscar Wilde

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php