On Mon, Apr 09, 2001 at 10:03:31AM -0400, Dan Sugalski wrote:
> While I don't know if Larry will mandate it, I would like this code:
>    open PAGE, "http://www.perl.org";
>    while (<PAGE>) {
>          print $_;
>    }
> to dump the HTML for the main page of www.perl.org to stdout.

Well, this seems innocent enough, but how far do you want to stretch it?

Should this work?
 use lib "http://my.site.org/lib";

If not, why not (security issues aside)?

What about
 if (-r "http://www.perl.com/") {...}
or
 if (-M "http://www.perl.com/" < -M "http://www.python.org/") {...}

Should
 opendir (FTP, "ftp://sunsite.uio.no/");
work as expected?


Should URLs behave as much like regular files/directories as possible,
or should they be limited to a small set of operations (like open())?
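For what it's worth, the open() case can already be approximated in Perl 5
with a tied filehandle. The sketch below is only an illustration: the
package name URL::Handle is made up, and the fetcher callback stands in
for a real HTTP client (e.g. LWP::Simple::get); here a stub is used so it
runs without network access.

    package URL::Handle;
    use strict;
    use warnings;

    sub TIEHANDLE {
        my ($class, $url, $fetch) = @_;
        # Fetch the whole document once, then hand it out line by line.
        my @lines = split /^/m, $fetch->($url);
        return bless { lines => \@lines }, $class;
    }

    sub READLINE {
        my ($self) = @_;
        return wantarray ? splice @{ $self->{lines} }
                         : shift  @{ $self->{lines} };
    }

    package main;

    # Stub fetcher; a real one might be:
    #   sub { require LWP::Simple; LWP::Simple::get($_[0]) }
    my $stub = sub { "<html>\n<body>hi</body>\n</html>\n" };

    tie *PAGE, 'URL::Handle', 'http://www.perl.org', $stub;
    while (<PAGE>) {
        print $_;
    }

The harder cases (-r, -M, opendir) have no such cheap emulation, which is
probably the real question here.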

-- 
Trond Michelsen
