Hi,
I'm trying to limit the amount of content retrieved with fgets when pulling
data from a URL. I'm generating page extracts and only need the first 2K to
4K. What I'm finding is that when I try to limit fgets to a single read, I get
NOTHING, but when it runs inside a full while loop, I get the whole string.
What works:
//retrieve the page from the Internet
$file = @fopen($url, "r");
// bcg: need to add error handling so if the website is offline, the page is
// still logged and retrieved later
if (!$file) {
    //log error
    //need to write logging code
}
else {
    //load the file into a string var
    $string = '';
    while (!feof($file)) {
        $string = $string . fgets($file, 4096);
    }
}
What fails:
//retrieve the page from the Internet
$file = @fopen($url, "r");
// bcg: need to add error handling so if the website is offline, the page is
// still logged and retrieved later
if (!$file) {
    //log error
    //need to write logging code
}
else {
    //load the file into a string var
    $string = '';
    while (!feof($file)) {
        $string = $string . fgets($file, 4096);
        break; // bail out after the first read -- this is where I get nothing
    }
}
I've tried it with and without the while loop, with if tests and such. It's
strange.
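In case it helps show what I'm after, here's a minimal sketch of the kind of
capped read I have in mind, using fread with a running byte count instead of
fgets (this isn't my real code; $maxBytes and the example URL are just
placeholders for illustration):

<?php
// Sketch: cap the download at roughly the first few KB of a page.
$url      = "http://www.example.com/";  // placeholder URL
$maxBytes = 4096;                        // stop after roughly 4K

$string = '';
$file   = @fopen($url, "r");

if (!$file) {
    // log error -- logging code still to be written
} else {
    while (!feof($file) && strlen($string) < $maxBytes) {
        // read in 1K chunks until we hit the cap or EOF
        $chunk = fread($file, 1024);
        if ($chunk === false) {
            break;
        }
        $string .= $chunk;
    }
    fclose($file);
    // trim any overshoot back to the cap
    $string = substr($string, 0, $maxBytes);
}
?>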
Thanks in advance
J. Scott Johnson
Physical
* * * * * * * * * * * * * * * * * * * * * * * * * *
80 Spring Road
Nahant, MA 01908
Phone:
* * * * * * * * * * * * * * * * * * * * * * * * * *
781 592 0262 - home
617 970 4719 - cell
Virtual:
* * * * * * * * * * * * * * * * * * * * * * * * * *
[EMAIL PROTECTED]
http://www.fuzzygroup.com/
Yahoo IM: fuzzygroup