Nice idea Ted, and very close, but no banana - unfortunately wget
doesn't support NTLM authentication.  curl does support it, but
doesn't do recursive downloading.  I tried a few open source link
checkers, but they either don't support NTLM (LinkChecker and
Webcheck) or don't give me fine-grained enough control to stop them
generating tens of thousands of links to trawl through (Xenu).

So at the moment it's back to my original idea: get each wiki page
from the DB (this is easy, since I have direct access to the DB),
render it to HTML, and finally check each link in the rendered page.
That way I can easily limit the check to just pages from the wiki.
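
For what it's worth, step one looks something like this.  It's only a
rough sketch, and it assumes the 0.11-style get_db_cnx API and Trac's
standard wiki table, where every version of a page is a separate row,
so the subquery picks out the latest one:

from trac.env import Environment

env = Environment('<path to my trac env>')
db = env.get_db_cnx()
cursor = db.cursor()
# Grab the latest version of every wiki page
cursor.execute("SELECT w1.name, w1.text FROM wiki w1"
               " WHERE w1.version = (SELECT MAX(w2.version)"
               "   FROM wiki w2 WHERE w2.name = w1.name)")
pages = dict(cursor.fetchall())  # page name -> raw wiki text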

So could anyone throw me a quick bit of Trac code I could use to
render the pages once I have them from the DB?  It's only this middle
step I'm stuck on now; it has to be Trac code, and something must
already exist to do this.  I found
trac.wiki.formatter.format_to_html (which sounds like the right
thing) but couldn't work out how to instantiate a resource for it.
My best guess so far is ...

from trac.env import Environment
from trac.resource import Resource
from trac.mimeview.api import Context
from trac.wiki.formatter import format_to_html

wikidom = wiki_page_from_DB  # raw wiki text for one page
env = Environment('<path to my trac env>')  # create defaults to False
# Guessing here: a proper wiki Resource rather than an empty string,
# and that Context.from_request copes with req=None outside a request
resource = Resource('wiki', 'SomePageName')
context = Context.from_request(None, resource)
html = format_to_html(env, context, wikidom)
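
Once that works, the last step is just pulling the hrefs out of the
rendered HTML, e.g. with the standard library's HTMLParser.  Again
just a rough sketch; actually fetching each link through the
NTLM-protected server is a separate problem:

from HTMLParser import HTMLParser

class LinkExtractor(HTMLParser):
    # Collect the href of every <a> tag in a page
    def __init__(self):
        HTMLParser.__init__(self)
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href':
                    self.links.append(value)

extractor = LinkExtractor()
extractor.feed(str(html))  # html is the Markup from format_to_html
print extractor.links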

Any help much appreciated, folks...
