On Dec 4, 2003, at 7:23 AM, Christopher G Tantalo wrote: [..]
This does seem like a good idea, but for some of us who
can not access the web from work, this just makes it worse.
[..]

Personally I am neutralish in this debate about the bot,
but I think Christopher has put on the table the scary
part that we should all be worried about.

Most of us started with the premise:

        have you done a web-search?
        have you checked CPAN?
        have you checked the On-Line Perl Docs at...

since for us, we think of the web as the storehouse
of knowledge that we can pluck tasty bits from.

There are folks I know who are 'living on' UUCP
connections, the old-fashioned way, for whom an
HTTP connection is not going to be happening. So
yes, I can come up with at least one scenario in
which Christopher's case Could Be Occurring.

There is some room to negotiate solutions here.

I adopted the strategy of hanging code samples out
on my web page to save on the throughput to the list.
{ and to let me find my answers later on, when I
need to get at some Little Arcana that I rarely use... }

But clearly, as Christopher has pointed out, that
may not be the panacea I had hoped for. Also,
as those watching Google's performance will note,
at best it scans some sites maybe once a month,
if that, so there are issues with our root assumptions
that we need to work on...

On Christopher's side of the line, he could argue to
his management that, for 'professional reasons',
web access would be an improvement in productivity,
since clearly being able to 'google it' will help
answer various technical issues much faster.

Clearly we must all support a campaign slogan like

        Bigger Routers! Faster Pipes!
        A FOT ( Fiber Optic Terminal ) in
                every garage and workplace.

Until then I guess we will just have to keep improvising.

ciao
drieux

---

