Re: [PHP] a download limitation script that defies logic!
Daniel Brown-5 wrote:
> On Thu, Jun 26, 2008 at 8:38 AM, szalinski <[EMAIL PROTECTED]> wrote:
>>
>> I thought I'd take the time to make a download limitation script, but even
>> though the idea of this script is 'simple', it 'simply' refuses to work
>> entirely properly.
> [snip!]
>
> You do realize that the code in this block below is only accessed
> if $downloads is greater than or equal to $dl_limit_perday, correct?
>
>> if ($has_hit_dl_limit) {
>>     if ($has_hit_timeout_limit) {
>>         $dl = false;
>>         print('Too many downloads today!');
>>         print "Time limit: {$time_limit} Last time accessed: {$last_access}";
>>         print "Time to wait = $time_to_wait seconds";
>>         exit;
>>     } else {
>>         mysql_query("UPDATE ip_limit SET downloads = 0 WHERE userip = '$userip'");
>>         //$dl = true;
>>     }
>> } else {
>>     $dl = true;
>> }
>
> As a result, only when it meets that condition does it set $dl to
> TRUE for the next block.
>
>> if ($dl) {
>>     mysql_query("REPLACE INTO ip_limit (userip, last_access, downloads) VALUES ('$userip', NOW(), '$downloads'+1)");
>>     download_the_file();
>> } else {
>>     // exit, etc.
>> }
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php

Yes, I realise it - that's exactly the behaviour I expect. If the user has hit the download limit, the script uses the inner if ($has_hit_timeout_limit) statement (relating to the timeout), and keeps doing so on each request until the timeout is over (since the downloads value is still over the limit for each request), at which point the inner if() statement should reset downloads to 0, then download the file and increment the downloads column by one again.
But for some reason, if I try to put $dl = true after the UPDATE (in the inner if ($has_hit_timeout_limit) statement), the download works but the UPDATE never happens, so I get an ever-increasing value for 'downloads' in the db, and only one download, then the timer, then one download, then the timer, etc. It really is bizarre! :(

--
View this message in context: http://www.nabble.com/a-download-limitation-script-that-defies-logic%21-tp18132754p18135426.html
Sent from the PHP - General mailing list archive at Nabble.com.
[PHP] a download limitation script that defies logic!
I thought I'd take the time to make a download limitation script, but even though the idea of this script is 'simple', it 'simply' refuses to work entirely properly.

Basically I set up a really short time limit for testing: $time_limit should mean 'within the last 20 seconds'. What I am looking for is the following: a user requests a link via the script (I've got all that part right). First we check whether there is info in the db about the user, identified by their IP. If so, we use it to calculate things; if not, we download the file, record their IP, and increment the download counter by 1. What I want is to allow a given IP to download at most 2 files per time period.

The problem is that the script 'works' as expected if I use

    else {
        mysql_query("UPDATE ip_limit SET downloads = 0 WHERE userip = '$userip'");
        echo "we updated the db successfully!";
    }

But if I replace the echo "we updated the db successfully!"; with $dl = true; (because I want there to be a download right after this reset, rather than just a notice of a download-record reset), then all I get is one download, then a timeout, then one download, recurring. This just defies the laws of logic, or I am insane.

Here is the code snippet that deals with the limitations (followed by the download script in "if ($dl)") for any brave soul out there. Feel free to rewrite or suggest code changes, I'm all ears!

    $dl_limit_perday = 2;
    $time_limit = time() - 20;

    db_connect($dbhost, $dbname, $dbuser, $dbpass);

    $get_ip_info = mysql_query("SELECT UNIX_TIMESTAMP(last_access), downloads FROM ip_limit WHERE userip = '$userip'");
    $ip_info = mysql_fetch_array($get_ip_info);
    $last_access = $ip_info['UNIX_TIMESTAMP(last_access)'];
    $downloads = $ip_info['downloads'];
    $time_to_wait = $last_access - $time_limit;

    $has_hit_dl_limit = ($downloads >= $dl_limit_perday);
    $has_hit_timeout_limit = ($last_access > $time_limit);

    // Rough intent, in pseudocode:
    // if ($num_dl > 3) {
    //     if ($time_limit > $blah) { reset_downloads(); process_download(); }
    //     else { echo "GTFO"; }
    // } else { process_download(); }

    if ($has_hit_dl_limit) {
        if ($has_hit_timeout_limit) {
            $dl = false;
            print('Too many downloads today!');
            print "Time limit: {$time_limit} Last time accessed: {$last_access}";
            print "Time to wait = $time_to_wait seconds";
            exit;
        } else {
            mysql_query("UPDATE ip_limit SET downloads = 0 WHERE userip = '$userip'");
            //$dl = true;
        }
    } else {
        $dl = true;
    }

    if ($dl) {
        mysql_query("REPLACE INTO ip_limit (userip, last_access, downloads) VALUES ('$userip', NOW(), '$downloads'+1)");
        download_the_file();
    } else {
        // exit, etc.
    }

--
View this message in context: http://www.nabble.com/a-download-limitation-script-that-defies-logic%21-tp18132754p18132754.html
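A possible explanation for the one-download-then-timeout loop (my reading of the code above, not something confirmed in the thread): after the UPDATE resets the downloads column to 0, the PHP variable $downloads still holds the old value, so the REPLACE that follows writes $downloads + 1 (e.g. 3) straight back into the row, and the limit is immediately hit again. Resetting the variable along with the row avoids that. A minimal sketch of the decision logic as a pure function (the function name and shape are mine, for illustration only):

```php
<?php
// Pure decision logic for the limiter, testable without a database.
// $downloads is the stored counter, $last_access a UNIX timestamp
// (both 0 when the IP is unknown). Returns [$allow, $new_count],
// where $new_count is the value to write back to the row.
function dl_decision(int $downloads, int $last_access, int $now,
                     int $limit = 2, int $window = 20): array {
    $window_start = $now - $window;
    if ($downloads >= $limit) {
        if ($last_access > $window_start) {
            return [false, $downloads];   // still inside the window: refuse
        }
        // Window has passed: reset the counter in the *variable* too,
        // then count this request as the first download of the new window.
        $downloads = 0;
    }
    return [true, $downloads + 1];
}
```

With this shape, the value written back to the database is always the second element of the returned pair, so the stale-variable problem cannot occur.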
Re: [PHP] How to download a file (with browser) using fsockopen() ?
Thanks for this, it seems to work generally OK, but with one problem. When I get prompted to download the file, there is no filesize (Content-Length is missing), so I wouldn't be able to resume the file if it was big! I tried adding another function that gets the headers from the file first (to find the Content-Length), but instead I get an error message stating 'Cannot modify header information - headers already sent...', so I'm thinking that I am now in a catch-22 situation: I need to get the headers to find the Content-Length, and then send it to the browser to prompt the download of the file with the correct filesize. Also I noticed you used HTTP 1.1, which I simply changed to 1.0 - I hope this is enough to avoid chunked encoding. I also changed the method to POST as I need it. So either it is a paradox, or am I just misunderstanding where the headers should be placed? Thanks :)

On Tue, 05 Feb 2008 07:16:31 -, <[EMAIL PROTECTED]> wrote:

Try this, it should work.

    // Get the file from the remote location
    function getFile($host, $resource, $port)
    {
        $hdr = '';
        $file_cont = '';
        $fh = fsockopen($host, $port, $errno, $errstr, 300);

        if (!$fh) {
            return "error";
        } else {
            $hdr .= "GET /$resource HTTP/1.1\r\n";
            $hdr .= "Host: $host\r\n";
            $hdr .= "Connection: close\r\n\r\n";
            fwrite($fh, $hdr);
            while (!feof($fh)) {
                $file_cont .= fgets($fh, 128);
            }
            // Return the file as a string
            return $file_cont;
        }
    }

    // Set up essential headers
    header("Content-Type: Application/GIF");
    header("Content-Disposition: application/gif; filename=one.gif");

    // Strip the text headers in the file and print it out.
    print preg_replace("/^.*\r\n/m", "", getFile("wisdomleaf.com", "images/logo.gif", 80));

Anyway, the core of the script is: download the file as a string, then print it out as a string with suitable headers.
Cheers, V

-----Original Message-----
From: szalinski <[EMAIL PROTECTED]>
To: php-general@lists.php.net
Sent: Tue, 5 Feb 2008 8:58 am
Subject: [PHP] How to download a file (with browser) using fsockopen() ?
[snip - quoted original message; it appears in full as the next post]
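The catch-22 described above (needing the remote Content-Length before any output is sent) can be resolved by reading the response in two phases: first consume only the status line and headers from the socket, then emit your own headers including Content-Length, and only then stream the body. A sketch under those assumptions; the header parser is a plain function of my own, the socket part is shown as comments because it needs a live server, and example.com/file.zip are placeholders:

```php
<?php
// Parse a raw HTTP header block into a lowercased name => value map.
// Pure string handling, so it is testable without a network connection.
function parse_headers(string $raw): array {
    $headers = [];
    foreach (explode("\r\n", $raw) as $line) {
        if (strpos($line, ':') !== false) {
            [$name, $value] = explode(':', $line, 2);
            $headers[strtolower(trim($name))] = trim($value);
        }
    }
    return $headers;
}

// Two-phase use against a socket (sketch; assumes an HTTP/1.0
// response with no chunked encoding):
//
// $fp = fsockopen('example.com', 80, $errno, $errstr, 30);
// fwrite($fp, "GET /file.zip HTTP/1.0\r\nHost: example.com\r\n\r\n");
// $raw = '';
// while (!feof($fp) && strpos($raw, "\r\n\r\n") === false) {
//     $raw .= fgets($fp, 1024);          // phase 1: headers only
// }
// $h = parse_headers($raw);
// header('Content-Type: application/octet-stream');
// header('Content-Disposition: attachment; filename="file.zip"');
// if (isset($h['content-length'])) {
//     header('Content-Length: ' . $h['content-length']);
// }
// while (!feof($fp)) { echo fread($fp, 8192); }   // phase 2: stream body
// fclose($fp);
```

Because nothing is echoed until after header() is called, the 'headers already sent' error cannot occur, and the browser gets a real filesize.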
[PHP] How to download a file (with browser) using fsockopen() ?
Hi

I have been working on this download script for quite a while, and I just can't find how to download a remote file via a user's browser using fsockopen.

Basically I am wondering if anyone can give me a simple working example of how to use fsockopen() to fetch a file on a remote server, and then pop up a save dialog box in my browser. For example, let's say I want to download this file: http://remotedomain.com/file.zip. Instead of putting this directly into my browser and then being prompted to save it to my PC, how can I use fsockopen() to fetch the file and get the same prompt in my browser? E.g. I want to be able to do http://localhost/index.php?url=http://remotedomain.com/file.zip

I know that this does not seem the most obvious and easy way to do it, but I simply cannot get a file to download using fsockopen. I specifically want this function, as I need to POST headers to the server, and I haven't yet been able to download a file with it without it being corrupt or the connection hanging. I just can't figure it out, and I'm getting a bit tired of it! I don't need a whole hand-made script, I just need the part where fsockopen will download this file - perhaps a working function that would do it. Please try not to use classes or objects, because I haven't quite figured out object-oriented programming yet!!

Also, I would like to do it via HTTP 1.0, because I know HTTP 1.1 is tricky and might require a chunk decoder, and I don't see the need for it, unless someone is able to provide a working chunked data decoder.

Thanks to anyone who can help. :)
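Since the post asks for a working chunked-data decoder, here is a minimal sketch of one for the HTTP/1.1 chunked transfer coding (my own illustration; it handles plain hex size lines but not chunk extensions or trailers):

```php
<?php
// Minimal decoder for HTTP/1.1 chunked transfer coding. The body is a
// sequence of "<hex size>\r\n<data>\r\n" chunks; a zero-size chunk
// terminates it.
function dechunk(string $body): string {
    $out = '';
    $pos = 0;
    while (true) {
        $eol = strpos($body, "\r\n", $pos);
        if ($eol === false) break;                 // malformed / truncated input
        $size = hexdec(trim(substr($body, $pos, $eol - $pos)));
        if ($size === 0) break;                    // terminating zero chunk
        $out .= substr($body, $eol + 2, $size);    // chunk payload
        $pos = $eol + 2 + $size + 2;               // skip payload + trailing CRLF
    }
    return $out;
}
```

With HTTP/1.0 and "Connection: close" this is unnecessary, since the server sends the body verbatim; the decoder only matters if you stay on HTTP/1.1.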
Re: [PHP] Timeout while waiting for a server->client transfer to start (large files)
Well, thanks again, but I already know what the problem is: it is the response headers being added to the output file. I just tried a different piece of code and it seems to output the file OK, so I must be going wrong somewhere in the order in which I output headers and so on. I'm going to keep working on it; I think I have to start from scratch, though, to see where I made the mistake exactly. :)

On Mon, 04 Feb 2008 21:13:42 -, Richard Lynch <[EMAIL PROTECTED]> wrote:

On Fri, February 1, 2008 7:45 pm, szalinski wrote:
> On Thu, 31 Jan 2008 07:13:55 -, Per Jessen <[EMAIL PROTECTED]> wrote:
>
> Well I got it to work, much thanks to Richard Lynch, but now every time I
> download a file, it is corrupt. For example, when I download a small .rar
> file, just to test, it is always corrupt ('Unexpected end of archive'). I
> also cleared my browser cache just to be sure, but same problem. Here is
> the code as it stands. I just can't get my head around why it wouldn't be
> working as it is...

Open the file you download with a text or hex editor. Compare to the original. If you can't spot the problem right off, download the original with FTP and use "diff" to compare the two. Or, if you don't have "diff", upload the broken download with FTP and use "diff" on the server. If it's OK on the server, and not coming out OK in the download, figure out what's different between the two.
Re: [PHP] Timeout while waiting for a server->client transfer to start (large files)
Thanks, I already have another post dealing with this issue (check the newsgroup for "Server to client file transfer always corrupt"). I had figured out the problem that was corrupting the file: it is the response headers being added to the file when it is downloaded, and I don't know how to trim them. I am using a different trim method I 'borrowed' from another script (where it seems to work perfectly!), and I just can't understand why it won't work for me, unless, as you say, there are other errors being output; but I have looked at the text file and I don't see anything other than the headers there. When I use the trim script to remove them, it also seems to remove part of the start and end of the text file. The most annoying aspect of the whole thing is how difficult it is to find a solution to this problem; I'm sure other people must have come across it? Thanks again :)

On Sun, 03 Feb 2008 16:28:34 -, Richard Lynch <[EMAIL PROTECTED]> wrote:

You can use filesize() to get the file size: http://php.net/filesize

If that's not going to work because you are stripping out part of the file, or something, use fgets to get the header info, and then use fseek to reset the file pointer to the beginning: http://php.net/fseek. You can then read as much or as little as you like with fread.

As far as the corrupt files go, compare what you got with the download and the original in a text editor or a hex editor to see what happened. You might have some PHP warnings or notices at the front of the file, or at the end, messing the file contents up.

On Thu, January 31, 2008 12:11 pm, szalinski wrote:

On Thu, 31 Jan 2008 00:02:55 -, Richard Lynch <[EMAIL PROTECTED]> wrote:

Hello Richard

Well, thank you for pointing that out to me! I was actually trying to read it into RAM, but that was a silly mistake.
But now I have the problem that, even though you are correct, the problem seems to remain, in the sense that I actually *need* to read the start of the file just to get the header info (so I can retrieve the Content-Length and the filename). When I tried your method, I thought I had finally got the script to work - which it seemed to - but every file I download with it is corrupt. I tried downloading a WinRAR file, and I get 'unexpected end of archive'. :( I know for a fact that the archive itself is NOT corrupt, because I tried it with various different files and all of them ended up corrupt!

I found some info on the net about fread() and fgets(), and it seems that fgets() only reads one line of data up to 1024 bytes, or in my case, 64 bytes. This is what I want to happen, because I need the 'reading' to abort/break when I have read the required info from the header... I wish (and hope!) there was an easier way to get this info other than searching for it. :|

Now I am truly vexed, because the files are all corrupt when downloaded and I can't see anything wrong with the code. I have added a few comments so you can see what I think is the problematic area. By the way, many thanks for your enthusiastic help so far! I hope you don't take this email to mean I will be mailing you frequently; believe me, I know you must be busy, and I only mailed you as a last resort! Thanks again!
    if (strlen($link) > 0) {
        $url = @parse_url($link);
        $fp = @fsockopen($url['host'], 80, $errno, $errstr);
        if (!$fp) {
            $errormsg = "Error: $errstr, please try again later.";
            echo $errormsg;
            exit;
        }
        $vars = "dl.start=PREMIUM&uri={$url['path']}&directstart=1";
        $out = "POST {$url['path']} HTTP/1.1\r\n";
        $out .= "Host: {$url['host']}\r\n";
        $out .= "User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)\r\n";
        $out .= "Authorization: Basic ".base64_encode("{$rslogin}:{$rspass}")."\r\n";
        $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
        $out .= "Content-Length: ".strlen($vars)."\r\n";
        $out .= "Connection: Close\r\n\r\n";
        fwrite($fp, $out);
        fwrite($fp, $out.$vars);
        unset($string);
        while (!feof($fp)) {
            $string .= fgets($fp, 64);
        }
        // Tell us what data is returned
        //print($string);
        @fclose($fp);
        if (stristr($string, "Location:")) {
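One thing that stands out in the code above (an observation of mine, not something confirmed in the thread): the request buffer is written twice, first fwrite($fp, $out) and then fwrite($fp, $out.$vars), so the server sees the header block two times; and the header/body split is done with line-based fgets() reads plus regexes, which is fragile for binary bodies. A safer pattern is to split the raw response exactly once at the first blank line and treat everything after it as opaque bytes. A minimal sketch:

```php
<?php
// Split a raw HTTP response into [header block, binary body] at the first
// CRLFCRLF. The limit of 2 on explode() keeps any later "\r\n\r\n"
// sequences (which can legitimately occur inside binary data) in the body.
function split_response(string $raw): array {
    $parts = explode("\r\n\r\n", $raw, 2);
    return [$parts[0], $parts[1] ?? ''];
}
```

Sending the request once, reading the whole response, and then calling split_response() gives a header string that is safe to scan for Content-Length, and a body that can be echoed untouched.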
[PHP] Re: Server to client file transfer with authorization: file always corrupt
On Sat, 02 Feb 2008 23:08:43 -, Nathan Rixham <[EMAIL PROTECTED]> wrote:

szalinski wrote:
[snip - original post quoted in full; see "[PHP] Server to client file transfer with authorization: file always corrupt" below]
[PHP] Server to client file transfer with authorization: file always corrupt
Hi

I am having trouble with a file transfer script; as you can see, I am trying to keep the code as simple as possible. But every time I download a file with it, it is corrupt. For example, when I download a small .rar file, just to test, it is always corrupt ('Unexpected end of archive'). I also cleared my browser cache just to be sure, but same problem. I just can't get my head around why it wouldn't be working as it is...

A couple of questions: Have I got too many header requests? Do I need to worry about output buffering, which is possibly corrupting the file output (if so, please tell me what to do!)? Is there an easier way to get the returned response header and get a redirected link, instead of finding and cutting strings? Is there maybe something wrong with the structure or order of the header requests, and/or returned headers etc.?

Here is what I have so far:

    if (strlen($link) > 0) {
        $url = @parse_url($link);
        $fp = @fsockopen($url['host'], 80, $errno, $errstr);
        if (!$fp) {
            $errormsg = "Error: $errstr, please try again later.";
            echo $errormsg;
            exit;
        }
        $vars = "dl.start=PREMIUM&uri={$url['path']}&directstart=1";
        $out = "POST {$url['path']} HTTP/1.1\r\n";
        $out .= "Host: {$url['host']}\r\n";
        $out .= "User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)\r\n";
        $out .= "Authorization: Basic ".base64_encode("{$rslogin}:{$rspass}")."\r\n";
        $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
        $out .= "Content-Length: ".strlen($vars)."\r\n";
        $out .= "Connection: Close\r\n\r\n";
        fwrite($fp, $out);
        fwrite($fp, $out.$vars);
        while (!feof($fp)) {
            $string .= fgets($fp, 256);
        }
        // Tell us what data is returned
        //print($string);
        @fclose($fp);
        if (stristr($string, "Location:")) {
            $redirect = trim(cut_str($string, "Location:", "\n"));
            $full_link = addslashes(trim($redirect));
        }
        //print($string);
        //print("".$full_link."");
        if ($full_link) {
            // Get info about the file we want to download:
            $furl = parse_url($full_link);
            $fvars = "dl.start=PREMIUM&uri={$furl['path']}&directstart=1";
            $head = "Host: {$furl['host']}\r\n";
            $head .= "User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)\r\n";
            $head .= "Authorization: Basic ".base64_encode("{$rslogin}:{$rspass}")."\r\n";
            $head .= "Content-Type: application/x-www-form-urlencoded\r\n";
            $head .= "Content-Length: ".strlen($fvars)."\r\n";
            $head .= "Connection: close\r\n\r\n";
            $fp = @fsockopen($furl['host'], 80, $errno, $errstr);
            if (!$fp) {
                echo "The script says $errstr, please try again later.";
                exit;
            }
            fwrite($fp, "POST {$furl['path']} HTTP/1.1\r\n");
            fwrite($fp, $head.$fvars);
            while (!feof($fp)) {
                // Keep reading the info until we get the filename and size from the
                // returned header - is there no easy way of doing this? I also don't
                // like the way I have to 'find' the redirected link (above).
                $tmp .= fgets($fp, 256);
                $d = explode("\r\n\r\n", $tmp);
                // I tried changing this to if ($d) { etc. (instead of $d[1]), and the
                // download of the rar file *wasn't* corrupt; it just had a filetype of
                // x-rar-compressed instead of application/octet-stream, and the
                // filesize was 'unknown' - now this is just confusing me...! So I
                // think (and guess) the problem of the file corruption is here,
                // because it must add some data to the filestream which corrupts it.
                // Darn.
                if ($d[1]) {
                    preg_match("#filename=(.+?)\n#", $tmp, $fname);
                    preg_match("#Content-Length: (.+?)\n#", $tmp, $fsize);
                    $h['filename'] = $fname[1] != "" ? $fname[1] : basename($furl['path']);
                    $h['fsize'] = $fsize[1];
                    break;
                }
            }
            @fclose($fp);
            $filename = $h['filename'];
            $fsize = $h['fsize'];

            // Now automatically download the file:
            @header("Cache-Control:");
            @header("Cache-Control: public");
            @header("Content-Type: application/octet-stream");
            @header("Content-Disposition: attachment; filename=".$filename);
            @header("Accept-Ranges: bytes");
            if (isset($_SERVER['HTTP_RANGE'])) {
                list($a, $range) = explode("=", $_SERVER['HTTP_RANGE']);
                $range = str_replace("-", "", $range);
                $new_length = $fsize - $range;
                @header("HTTP/1.1 206 Partial Content");
                @header("Content-Length: $new_length");
            } else {
                @header("Content-Length: ".$fsize);
            }
            $f2vars = "dl.st
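A side note on the HTTP_RANGE handling above (my observation, not from the thread): stripping the '-' and subtracting only behaves for simple "bytes=N-" requests, and the 206 response omits the Content-Range header that resuming clients rely on. A hedged sketch of stricter single-range parsing; the function name and return shape are mine:

```php
<?php
// Parse a single-range "bytes=START-[END]" or suffix "bytes=-N" header
// against a known file size. Returns [start, end, length] for a valid
// range, or null when the header is malformed or unsatisfiable.
function parse_range(string $header, int $fsize): ?array {
    if (!preg_match('/bytes=(\d*)-(\d*)/', $header, $m)) return null;
    // "bytes=-N" means the last N bytes of the file.
    $start = ($m[1] === '') ? max(0, $fsize - (int)$m[2]) : (int)$m[1];
    $end   = ($m[1] !== '' && $m[2] !== '') ? (int)$m[2] : $fsize - 1;
    if ($start > $end || $end >= $fsize) return null;
    return [$start, $end, $end - $start + 1];
}

// A 206 reply would then also send (sketch):
//   header("Content-Range: bytes $start-$end/$fsize");
//   header("Content-Length: $length");
```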
Re: [PHP] Timeout while waiting for a server->client transfer to start (large files)
On Thu, 31 Jan 2008 07:13:55 -, Per Jessen <[EMAIL PROTECTED]> wrote:

Richard Lynch wrote:
> Your script is reading the whole file, 64 measly bytes at a time, into a
> monstrous string $tmp. Then, finally, when you've loaded the whole [bleep]
> file into RAM in $tmp, you just echo it out, right? Don't do that. :-)
>
>     while (!feof($fp)) {
>         echo fread($fp, 2048);
>     }

And if the OP is opening the file anyway, he might as well use readfile() instead.

/Per Jessen, Zürich

Well I got it to work, much thanks to Richard Lynch, but now every time I download a file, it is corrupt. For example, when I download a small .rar file, just to test, it is always corrupt ('Unexpected end of archive'). I also cleared my browser cache just to be sure, but same problem. Here is the code as it stands. I just can't get my head around why it wouldn't be working as it is...

    // Get the full premium link, and store it in $full_link after the redirect.
    // *Surely* there is an easier way to get redirections?
    if (strlen($link) > 0) {
[snip - the rest is the same code as in the "Server to client file transfer with authorization" post above]