If the site can be a few minutes behind (say, 15-30 minutes), then what I recommend is creating a caching script that updates the necessary files whenever their md5 checksum has changed (or a specified time period has passed). Store those files locally and serve the local copies; performance will be much better than requesting the page from the other server on every hit. You could run the script every 15-30 minutes via a cron job, depending on your needs.
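Something along these lines should work as a starting point (an untested sketch; the URL, cache path and TTL are placeholders you'd swap for your own):

<?php
// cache_site.php -- refresh the local copy of the remote page only when
// its md5 checksum has changed, or the cached file has grown stale.
// The URL, cache path and TTL below are example values, not real ones.

$remoteUrl = 'http://www.example.com/page-to-mirror.html';
$cacheFile = '/var/www/microsite/cache/page.html';
$ttl       = 1800; // 30 minutes, in seconds

// Grab the remote page with cURL.
$ch = curl_init($remoteUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
$content = curl_exec($ch);
curl_close($ch);

// If the fetch fails, keep serving the existing cached copy.
if ($content === false) {
    exit(1);
}

// This is where you'd strip or rewrite the surrounding graphics/markup
// before the page gets cached.

$stale   = !file_exists($cacheFile)
        || (time() - filemtime($cacheFile)) > $ttl;
$changed = !file_exists($cacheFile)
        || md5($content) !== md5_file($cacheFile);

if ($changed || $stale) {
    file_put_contents($cacheFile, $content);
}

Your microsite then just serves/includes the cached file, and a crontab entry along the lines of

*/15 * * * * php /path/to/cache_site.php

keeps it fresh.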

Joseph

Ashley Sheridan wrote:
Hi,

I need to replicate a site on another domain, and in this case an
iframe won't really do, as I need to remove some of the graphics, etc.
around the content. The owner of the site I need to copy has asked for
it to be duplicated, and unfortunately, because of the CMS he's used
(which is owned by his hosting company), I need a way to have the site
replicated on an already existing domain as a microsite, but in a way
that keeps it always up to date.

I'm fine using cURL to grab the site, and even altering the content
that is returned, but I was thinking about a caching mechanism. Has
anyone any suggestions on this?

Thanks,
Ash
http://www.ashleysheridan.co.uk




