I've seen web pages mention using wget to back up websites, and they suggest 
installing wget via either MacPorts or Homebrew (it doesn't come with macOS).
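
For example, something along these lines (with a placeholder URL, not anything 
from those pages) is the usual way to mirror a site with wget; the exact flags 
depend on the site:

    wget --mirror --convert-links --page-requisites --no-parent https://www.example.com/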

They also seem to suggest that rsync over ssh might be better (no credentials 
in the clear, unlike wget using ftp). macOS does include rsync, but the bundled 
version is quite old (it has been stuck at 2.6.9 for licensing reasons), so one 
might want a newer one from MacPorts or Homebrew as well.
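
A sketch of what that might look like, assuming you have ssh access and with 
made-up host and path names:

    rsync -avz -e ssh user@example.com:/var/www/mysite/ ~/Backups/mysite/

The -a flag preserves permissions and timestamps, -z compresses the transfer, 
and -e ssh runs it over ssh (recent rsync versions use ssh by default anyway).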

Both are command-line tools with plenty of other uses, so one still needs to 
learn how to drive them. :-)

If the website includes a database, one may also need ssh access to run a 
database dump tool; a live database is a moving target, so simply copying the 
underlying files risks getting an inconsistent snapshot.
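
For MySQL/MariaDB, say, a dump over ssh might look roughly like this (host, 
user, and database name are placeholders, and your hosting setup may want 
different credentials handling):

    ssh user@example.com 'mysqldump --single-transaction mydatabase' | gzip > mydatabase.sql.gz

The --single-transaction option gives a consistent snapshot of InnoDB tables 
without locking them; PostgreSQL has pg_dump for the same job.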

Not something I've done, just the result of a couple of minutes of googling. 
Me, if there were no databases, I'd set up ssh access and use tar or cpio or 
rsync to do the copying. Rsync is likely more efficient in terms of bandwidth, 
since it only transfers what has changed, but I have seen it choke on 
synchronizing REALLY large directory trees.
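
If I went the tar route, a rough sketch (placeholder names again) would be to 
stream the archive over ssh rather than staging it on the server:

    ssh user@example.com 'tar czf - /var/www/mysite' > mysite-backup.tar.gz

That avoids needing scratch space on the remote side, at the cost of re-copying 
everything on each run, which is where rsync's delta transfers win.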

> On Mar 12, 2023, at 20:49, Craig Treleaven <ctrelea...@cogeco.ca> wrote:
> 
>> On Mar 12, 2023, at 5:35 PM, Sarah Zinsmeister <sarah.zinsmeis...@web.de> wrote:
>> 
>> I actually thought this was a program to back up websites
> 
> As others have said, you can leave Macports installed without any risk.  Or 
> uninstall it if you don’t intend to use it.
> 
> However, perhaps we can help you achieve the goal of backing up websites.  If 
> you found this idea on the web, could you post a link so that we can help you 
> work through what may be required?
> 
> Craig
> 
