j...@trillian.mit.edu wrote:
> I had a similar case recently. I've helped a few nonprofits build web
> sites, and several have started off looking into Drupal, Joomla, etc.
> After a month or so of this, with nothing working, I've combined a
> few scripts that I've collected or written anew wi
Plus one for HTTrack. I used it a couple of months ago to convert a
terrible, hacked Joomla site to HTML. It was a pain to use at first,
like having to use Firefox, but it worked as advertised.
Hope that helps.
On Tue, Jan 7, 2014 at 10:34 PM, Greg Rundlett (freephile)
wrote:
> Hi Bill,
>
> GPL
Hi Kent,
What do you mean by variables being public to the internet? Nobody
can directly access them, from what I understand. Sanitize on the way
in and out, and you should be fine, no?
Thanks.
On Tue, Jan 7, 2014 at 6:55 PM, Kent Borg wrote:
> On 01/07/2014 06:46 PM, Bill Horne wrote:
>>
>> Thanks to all for
That sounds like a great idea. HTML won't get you hacked. Less work
on your end.
On Tue, Jan 7, 2014 at 6:46 PM, Bill Horne wrote:
> On 1/6/2014 11:30 PM, Bill Horne wrote:
>>
>> Thanks for reading this.
>>
>> I'm a member of the Big-8 Board, which decides what Usenet groups are
>> created and
Also, I just discovered a MediaWiki extension written by Tim Starling that
may suit your needs. As the name implies, it's for dumping to HTML.
http://www.mediawiki.org/wiki/Extension:DumpHTML
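If I recall correctly, it runs as a maintenance script on the wiki server
itself; the invocation below is from memory, so treat it as a sketch (the
install path, output directory, and skin name are placeholders):
cd /path/to/mediawiki/extensions/DumpHTML
php dumpHTML.php -d /var/www/static-wiki -k monobook
The -d option names the directory that receives the static HTML tree, and
-k picks the skin used to render the pages.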
As for processing the XML produced by "export" or MediaWiki dump tools,
here is info on that XML schema
h
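If it helps, I believe you can pull a single page's XML through
Special:Export with a plain HTTP request, roughly like this (the wiki URL
and page title are placeholders, and the curonly flag is from memory):
curl -o Main_Page.xml \
  "http://www.big-8.org/index.php?title=Special:Export&pages=Main_Page&curonly=1"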
Bill Horne wrote:
| On 1/6/2014 11:30 PM, Bill Horne wrote:
| > Thanks for reading this.
| >
| > I'm a member of the Big-8 Board, which decides what Usenet groups are
| > created and deleted. We have both technical and non-technical
| > members, and we've been using MediaWiki for the board's websi
Hi Bill,
The GPL-licensed HTTrack Website Copier works well (http://www.httrack.com/).
I have not tried it on a MediaWiki site, but it's pretty adept at copying
websites, including dynamically generated ones.
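For what it's worth, a basic command-line run looks something like this
(the output directory and filter are placeholders, and I'm assuming the
default mirroring options):
httrack "http://www.big-8.org/" -O ./big8-mirror "+www.big-8.org/*" -v
The -O flag sets the local output directory, the "+..." pattern keeps the
crawl on that one site, and -v just makes it chatty.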
They say: "It allows you to download a World Wide Web site from the
Internet to a loca
Matthew Gillen wrote:
> wget -k -m -np http://mysite
I create an "emergency backup" static version of dynamic sites using:
wget -q -N -r -l inf -p -k --adjust-extension http://mysite
The option -m is equivalent to "-r -N -l inf --no-remove-listing", but
I didn't want --no-remove-listing (I do
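For anyone skimming the thread, here is that backup command again with each
flag spelled out (the URL is a placeholder):
# -q                  quiet
# -N                  skip files that are not newer than the local copy
# -r -l inf           recurse with no depth limit
# -p                  fetch page requisites (images, CSS) needed to render offline
# -k                  rewrite links to point at the local copies
# --adjust-extension  save pages with an .html suffix
wget -q -N -r -l inf -p -k --adjust-extension http://mysite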
Daniel Barrett wrote:
For instance, you can write a simple script to hit Special:AllPages
(which links to every article on the wiki), and dump each page to HTML
with curl or wget. (Special:AllPages displays only N links at a time,
Yes, but that's not humanly-readable. It's a dynamically generat
On January 7, 2014, Richard Pieri wrote:
>Remember that I wrote how wikis have a spate of problems? This is the
>biggest one. There's no way to dump a MediaWiki in a humanly-readable
>form. There just isn't.
Erm... actually, it's perfectly doable.
For instance, you can write a simple script to
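A rough sketch of that approach, assuming the wiki serves articles under
/wiki/-style URLs (adjust the grep pattern if it uses index.php?title=
links instead); note it only handles the first batch of links that
Special:AllPages shows:
BASE="http://www.big-8.org"
mkdir -p static-dump
# Grab the first page of Special:AllPages, pull out article links, fetch each one.
wget -q -O - "$BASE/index.php?title=Special:AllPages" \
  | grep -o 'href="/wiki/[^"]*"' \
  | sed 's/^href="//; s/"$//' \
  | sort -u \
  | while read -r path; do
      wget -q -O "static-dump/$(basename "$path").html" "$BASE$path"
    done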
Matthew Gillen wrote:
wget -k -m -np http://mysite
I've tried this. It's messy at best. Wiki pages aren't static HTML.
They're dynamically generated and they come with all sorts of style
sheets and embedded scripts. Yes, you can get the text, but it'll be text
as rendered by a wiki. It tak
On 1/7/2014 7:28 PM, Matthew Gillen wrote:
> On 1/7/2014 6:49 PM, Bill Horne wrote:
>> I need to copy the contents of a wiki into static pages, so please
>> recommend a good web-crawler that can download an existing site into
>> static content pages. It needs to run on Debian 6.0.
>
> wget -k -m
On 1/7/2014 6:49 PM, Bill Horne wrote:
> I need to copy the contents of a wiki into static pages, so please
> recommend a good web-crawler that can download an existing site into
> static content pages. It needs to run on Debian 6.0.
wget -k -m -np http://mysite
is what I used to use. -k converts the links so they work in the local copy.
Bill Horne wrote:
I need to copy the contents of a wiki into static pages, so please
recommend a good web-crawler that can download an existing site into
static content pages. It needs to run on Debian 6.0.
Remember that I wrote how wikis have a spate of problems? This is the
biggest one. Ther
On 01/07/2014 06:46 PM, Bill Horne wrote:
Thanks to all for your help: I've just gotten off the phone, and the
decision has been made to go in a different direction. We have a
volunteer who wants to learn "native" HTML, and so we'll be setting up
a "static" site without a CMS.
More secure tha
I need to copy the contents of a wiki into static pages, so please
recommend a good web-crawler that can download an existing site into
static content pages. It needs to run on Debian 6.0.
Bill
--
Bill Horne
339-364-8487
On 1/6/2014 11:30 PM, Bill Horne wrote:
Thanks for reading this.
I'm a member of the Big-8 Board, which decides what Usenet groups are
created and deleted. We have both technical and non-technical
members, and we've been using MediaWiki for the board's website
(http://www.big-8.org/) until n
Bill Horne wrote:
> ...we've been using MediaWiki for the board's website
> ...but we have to move the site to a new
> server which doesn't offer it.
>
> So, the question is "What's the best compromise between ease-of-use,
> learning curve, and maintainability if we have to choose between
> Jooml
On 1/7/2014 12:24 PM, Daniel Barrett wrote:
On January 6, 2014, Bill Horne wrote:
...we've been using MediaWiki for the board's website
(http://www.big-8.org/) until now, but we have to move the site to a new
server which doesn't offer it.
So, the question is "What's the best compromise between
FYI, I did a Google search for LetoDMS and found another one called SeedDMS,
which states:
> SeedDMS is the continuation of LetoDMS because it has lost its main
> developer.
On Jan 7, 2014, at 1:03 PM, Richard Pieri wrote:
> Daniel Barrett wrote:
>> Have you considered not switching platfo
Daniel Barrett wrote:
Have you considered not switching platforms? I would think the cost of
moving all your mediawiki content to a new platform and retraining all
your users would far exceed the price of a managed VPS on
MediaWiki, and Wikis in general, have a spate of problems that make them
On January 6, 2014, Bill Horne wrote:
>...we've been using MediaWiki for the board's website
>(http://www.big-8.org/) until now, but we have to move the site to a new
>server which doesn't offer it.
>So, the question is "What's the best compromise between ease-of-use,
>learning curve, and mainta
I use Drupal. It is easy to start and there is a lot you can do.
> Thanks for reading this.
>
> I'm a member of the Big-8 Board, which decides what Usenet groups are
> created and deleted. We have both technical and non-technical members,
> and we've been using MediaWiki for the board's website