Forwarded message from "Bruce Burhans" <[EMAIL PROTECTED]>:

>  Lynx people,
>
>  Below is the first draft of a possible solution to the
>  "graphics-on-websites problem."
>  Set up a spider that would go through the websites and compile
>  a file on each web page's graphic content, with the following
>  possible contents:
>
>  1)  Graphics non-existent
>
>
>  { For the rest, where relevant, the position of each graphic
>  would be derived from the HTML and indicated by a number on
>  the displayed page of text, but the actual description etc.
>  would be on a virtual console... }
>
>
>  2)  Graphics irrelevant, followed by a brief description of
>  each, like "guy in suit, smiling."
>
>  3)  Graphics relevant, but a description will do just
>  fine:  ____________________________ .
>
>  4)  Graphics important, and here it is in a format that the
>  simple program you downloaded free last week can use to turn
>  it into a usable picture.  Grayscale would do, preferably in
>  text mode if possible.  (A sketch of such a conversion follows
>  just below.)
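>
>  A minimal sketch, in Python, of what such a "simple program"
>  might do with a grayscale image.  It assumes the Pillow imaging
>  library and an invented file name, and is only an illustration,
>  not a finished tool:
>
>    from PIL import Image
>
>    # Map brightness to characters, darkest first.
>    RAMP = "@%#*+=-:. "
>
>    def image_to_text(path, width=72):
>        img = Image.open(path).convert("L")   # grayscale
>        # Terminal cells are taller than wide, so halve the height.
>        height = max(1, int(img.height * width / img.width / 2))
>        img = img.resize((width, height))
>        lines = []
>        for y in range(height):
>            row = ""
>            for x in range(width):
>                level = img.getpixel((x, y))  # 0..255
>                row += RAMP[level * (len(RAMP) - 1) // 255]
>            lines.append(row)
>        return "\n".join(lines)
>
>    if __name__ == "__main__":
>        print(image_to_text("picture.png"))
>
>  Run against a downloaded picture, it prints a text-mode
>  rendering that fits in an 80-column terminal.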
>
>
>  The directory would be available free
>to anyone who wanted it, and updated regularly.
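>
>  To make the idea concrete, here is one hypothetical layout for
>  such a per-page file; the field names and the URL are invented
>  purely for illustration:
>
>    url:      http://example.com/somepage.html
>    graphics: 3
>    1  irrelevant  "guy in suit, smiling"
>    2  relevant    "bar chart of sales by quarter; Q3 highest"
>    3  important   file=somepage-fig3.pgm  (grayscale, fetched
>                   separately and viewed with the program above)
>
>  The numbers would match the markers shown on the displayed
>  page of text.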
>
>  So when I'm browsing with Lynx, I could just run another
>  simple program that would put the correct file on tty2, which
>  I could check with a keystroke.
>
>  The program could also make a note when
>the file for a page didn't exist, and record the
>URL to be sent to the Graphics Directory site...
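>
>  Roughly, that helper could work like the following Python
>  sketch.  The directory address, the one-file-per-URL naming
>  scheme, and the log file name are all assumptions made up for
>  the example; it also assumes the user may write to /dev/tty2:
>
>    import hashlib
>    import os
>    import sys
>    import urllib.request
>
>    DIRECTORY = "http://graphics-directory.example.org/"  # hypothetical
>    MISSING_LOG = os.path.expanduser("~/.graphics-missing")
>
>    def show_descriptions(page_url):
>        # One description file per page, keyed by a hash of its URL
>        # (the keying scheme is an assumption for this sketch).
>        key = hashlib.md5(page_url.encode()).hexdigest()
>        try:
>            with urllib.request.urlopen(DIRECTORY + key) as resp:
>                text = resp.read().decode("utf-8", "replace")
>        except Exception:
>            # No file for this page yet: remember the URL so it can
>            # be sent to the Graphics Directory site later.
>            with open(MISSING_LOG, "a") as log:
>                log.write(page_url + "\n")
>            return
>        # Put the descriptions on the second virtual console, so a
>        # keystroke (Alt-F2) brings them up beside the Lynx session.
>        with open("/dev/tty2", "w") as console:
>            console.write(text)
>
>    if __name__ == "__main__":
>        show_descriptions(sys.argv[1])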
>
>-----------------------------------------------------
>
>  If this is a hare-brained idea, please, when you
>get done telling me what a stupid moron I am,
>explain just exactly *why* I am....
>  I realize that there are a lot of sites, but
>I also realize that they will always be coming up
>with new things to defeat Lynx and w3m, et al.....
>  So maybe doing everything at a single
>location (or three) is the sensible approach.
>  And if there's a better solution that I can find or help make
>  happen, let me know, please!
>
>
>V/R
>
>Bruce<+>  [EMAIL PROTECTED]



