Hi all,
I am writing a Greasemonkey user script which uses jQuery for most of
what it does.
The user script is supposed to work on a page with lots of elements I
want to discard, so that focusing on the main content is easier. This
part is easy: I select the div containing what I want and store its
contents in a variable, then I remove everything in the body, and
finally I append a div with the content from the variable. It works.
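For reference, that part looks roughly like this (the #main-content
selector is just a placeholder for the real one):

    // grab the part of the page we care about
    // (#main-content is a hypothetical selector)
    var content = $('#main-content').html();
    // wipe everything else from the page
    $('body').empty();
    // put the interesting content back in a fresh div
    $('body').append($('<div/>').html(content));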
But I have another objective too. The page is very slow because it has
lots of JS, CSS, and images which I discard, but... AFTER they are
fetched from the server! This is ridiculous: I load the page and make
48 HTTP requests to fetch things I'll be throwing away seconds later.
The sensible way to do this is to fetch ONLY the HTML and discard the
references to the linked resources BEFORE any of them start
downloading, preventing those expensive and lengthy HTTP requests.
I know this sort of thing is possible with libcurl, but is it
impossible without it?
I have tried several approaches:
1. I put the $('html').children().remove(); line before and outside
$(document).ready(function(){, but all resources are still fetched;
2. I put a window.stop() at the beginning of the user script; it stops
the rest of the script, but only after all resources are fetched (both
attempts are sketched below).
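In case it helps, the two attempts looked roughly like this (sketches,
not the exact code):

    // attempt 1: remove everything as early as possible,
    // outside of the ready handler
    $('html').children().remove();
    $(document).ready(function(){
        // ... rest of the script ...
    });

    // attempt 2: abort loading at the very top of the script
    window.stop();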
I concluded that the user script itself is evaluated only after the
full page load, so anything inside it would have to be wrapped in a
<timemachine goback="10 seconds"> tag to work ;)
So I started to look elsewhere and tried this:
3. I made a minimal empty page including only jQuery and the JS code
to fetch, via Ajax, the page I want to improve. I wanted to fetch the
page's HTML as text, discard all the references and unwanted content
from it, and then add DOM elements with what remains. This time I got
an error saying an uncaught exception had occurred and that permission
was denied to call method XMLHttpRequest.open... PUFFF! I tried both
opening the page from a file and serving it from localhost; no way, it
won't work.
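What I tried was roughly the following (the URL is a placeholder, and
the regexes are only a crude sketch of the stripping step); presumably
the permission error comes from the same-origin policy, which blocks
plain XMLHttpRequest calls to another domain:

    // attempt 3: fetch the HTML as plain text via Ajax
    $.get('http://example.com/slow-page.html', function(html) {
        // strip script/link/img references before anything
        // touches the DOM (crude regexes, just to illustrate)
        html = html.replace(/<script[\s\S]*?<\/script>/gi, '')
                   .replace(/<link[^>]*>/gi, '')
                   .replace(/<img[^>]*>/gi, '');
        // insert only what remains
        $('body').html(html);
    }, 'text');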
