Maybe one day, something like Peersm combined with [1], following or using
[2] and [3] (don't focus on the fact that Google is developing these;
the concepts themselves are the only way to really secure a web page).
Basically you fetch the web page with something like Peersm, then
retarget it into a sandboxed context (a sandboxed window, as Caja or
node-dom can provide inside browsers), so the website appears inside your
browser as a standalone widget/gadget (and certainly not an iframe);
then you parse the links and fetch the resources with the same
technology Peersm uses (i.e. the Tor protocol inside the browser).
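A rough sketch of that flow, purely illustrative (peersmFetch,
createSandboxedWindow and queueFetch are hypothetical names, not the
real Peersm or node-dom APIs):

    // Hypothetical sketch: peersmFetch() and createSandboxedWindow()
    // are placeholder names, not the actual Peersm or node-dom APIs.
    function loadAsWidget(url) {
      // 1. Fetch the initial page over the Tor protocol in the browser.
      return peersmFetch(url).then(function (html) {
        // 2. Retarget it into a sandboxed window (Caja/node-dom style)
        //    so it renders as a standalone widget, not an iframe.
        var sandbox = createSandboxedWindow();
        sandbox.document.documentElement.innerHTML = html;
        // 3. Parse the links and hand each resource to the same
        //    fetching technology (see the queue sketch below).
        var nodes = sandbox.document.querySelectorAll(
          'a[href], img[src], link[href], script[src]');
        Array.prototype.forEach.call(nodes, function (el) {
          queueFetch(el.getAttribute('href') || el.getAttribute('src'));
        });
        return sandbox;
      });
    }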
Once you have captured the initial web page, you can do all this offline
and queue the fetching.
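The queuing itself could be as simple as this (again a sketch;
queueFetch, peersmFetch and the cache are assumptions):

    // Sketch of offline queuing: captured URLs wait in a queue and are
    // drained later; peersmFetch and cache are hypothetical.
    var pending = [];
    function queueFetch(url) {
      if (url) pending.push(url);
    }
    function drainQueue() {
      // Can run long after the initial capture, at whatever pace
      // suits a high-latency link.
      pending.forEach(function (url) {
        peersmFetch(url).then(function (body) {
          cache.store(url, body); // hypothetical local store for offline use
        });
      });
      pending = [];
    }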
This must work without hacking inside the browser; unfortunately you
cannot easily tell the browser to "fetch everything using my secure
function". It's very difficult to do but not impossible; some advanced
features will not work due to the same-origin policy, but that's not an
issue for the intended use.
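A partial workaround, sketched here under the assumption that the
sandboxed window lets you shadow its globals (as SES/Caja-style
confinement does; mySecureFetch is a hypothetical name):

    // Assumes the sandboxed window lets you replace globals, as
    // SES/Caja-style confinement does. mySecureFetch is hypothetical.
    function confineNetworking(sandbox) {
      sandbox.XMLHttpRequest = undefined;   // cut off the native path
      sandbox.fetch = function (url, options) {
        return mySecureFetch(url, options); // i.e. Tor inside the browser
      };
    }
    // Scripts inside the sandbox now go through the secure function,
    // but resources the browser fetches by itself (img src, CSS url(),
    // prefetch) cannot be redirected this way, hence the link
    // rewriting above.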
Coming back to the origin of this thread, it's easier to use Peersm
as it is and get some kind of distributed P2P hidden services with
difficult end-to-end correlation possibilities, even if we don't advise
using it to do strange things.
[1] https://github.com/Ayms/node-dom
[2] https://code.google.com/p/google-caja/wiki/SES
[3] http://static.googleusercontent.com/external_content/untrusted_dlcp/research.google.com/en//pubs/archive/37199.pdf
On 03/07/2014 07:04, grarpamp wrote:
On Wed, Jul 2, 2014 at 7:18 PM, Helder Ribeiro <hel...@discor.de> wrote:
On Sun, Jun 29, 2014 at 9:58 PM, Seth David Schoen <sch...@eff.org> wrote:
Then a question is whether users would want to use a service that takes,
say, several hours to act on or answer their queries (and whether the
amount of padding data required to thwart end-to-end traffic analysis
is acceptable).
I probably missed some context in the thread. Link padding doesn't imply
or have a tie to high[er] latency (other than minimal processing overhead).
It's just the usual committed bandwidth, but always full: carrying wheat,
or backed by chaff when there's not enough wheat to fill it.
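A tiny illustration of that committed-bandwidth idea (a sketch only;
CELL_SIZE, TICK_MS, sendCell and the wheat queue are assumptions):

    // One fixed-size cell is sent every tick: real data ("wheat") when
    // available, random padding ("chaff") otherwise. sendCell and the
    // wheat queue are hypothetical; an observer sees a constant rate.
    var CELL_SIZE = 512;   // fixed cell size, Tor-like
    var TICK_MS = 10;      // the committed rate: one cell per tick
    var wheat = [];        // outgoing real data, already cut into cells

    function randomBytes(n) {
      var buf = new Uint8Array(n);
      crypto.getRandomValues(buf); // browser CSPRNG for the chaff
      return buf;
    }

    setInterval(function () {
      sendCell(wheat.length > 0 ? wheat.shift()           // wheat
                                : randomBytes(CELL_SIZE)); // chaff
    }, TICK_MS);

Latency is unchanged either way; only the chaff overhead is paid.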
High-latency web browsing is actually a great use case and could
benefit from the extra security.
Apps like Pocket (http://getpocket.com/) work as a "read it later"
queue, downloading things for offline reading.
I think it was Freenet where 'web' (page/browsing) was modeled
as a non-real-time-interactive, retrievable (and updateable) object.
Essentially documents, but they were delivered in real time over the net.
Torrents seem similar... queuing, updatable, latency tolerant. Though
there's no 'hours'-delay storage buffer of nodes between actual source
and sink either.
Besides mail mixes, what systems use such formal buffers in between?
--
Peersm : http://www.peersm.com
node-Tor : https://www.github.com/Ayms/node-Tor
GitHub : https://www.github.com/Ayms