On Sunday 27 July 2003 06:38 am, Jarmo Maki wrote:
> Fortunately, there is a (relatively) simple solution to all these problems.
> When FProxy retrieves a html file, it is passed through an anonymity
> filter. This filter tries to find all content that could make the browser
> compromise the user's anonymity, mainly by blocking any non-Freenet links
> and preventing the loading of images from the regular internet. I suggest that,
> after this phase, we start downloading everything this page links to in
> background, in random order. For example, if the page has three hundred img
> tags and two href links, we request these all in random order (the number
> of parallel requests could be adjusted based on current node load, and no,
> we don't follow the links in any pages we downloaded). This performs the
> same function as the ZIP files (binds data together)
NO IT DOESN'T. The data is still separate, and there's still the problem of a
lot of little pieces to find - and fall out. This doesn't mean it's a bad
idea, just that it can't do the job of containers.
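For what it's worth, the prefetch part of the proposal is simple to sketch. Below is a minimal illustration (not FProxy code - the `fetch` callback and the link list are hypothetical stand-ins for the node's request machinery and the anonymity filter's output): shuffle the extracted links, then request them with bounded parallelism, without recursing into fetched pages.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def prefetch_page_links(links, fetch, max_parallel=4):
    """Background-fetch every link from one filtered page.

    `links`: URIs the anonymity filter extracted from the page.
    `fetch`: callable that retrieves one URI (hypothetical stand-in).
    Requests go out in random order so an observer can't tell which
    link the user actually follows; fetched pages are NOT parsed for
    further links, so there is no recursive crawl.
    """
    order = list(links)
    random.shuffle(order)
    # max_parallel would be tuned to current node load.
    with ThreadPoolExecutor(max_workers=max_parallel) as pool:
        return list(pool.map(fetch, order))
```

Note that this only randomizes and rate-limits requests; each piece of content is still a separate key, which is exactly the objection above.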

-- 
"I love deadlines. I love the whooshing sound they make as they go by."
        - Douglas Adams
Nick Tarleton - [EMAIL PROTECTED] - PGP key available

_______________________________________________
devl mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/devl