Hello Mike,

Thank you for your help.

As I already wrote, this issue is mainly due to search engines' poor handling
of dynamic sites (to be more exact, of URLs containing GET parameters - in my
case, e.g. index.shtml?lang=en&menu_id=23).

Which of the solutions you described would you suggest for my situation?

Turck MMCache is already installed; now I need to do something with my URLs.
There is also a problem with emulating search-engine-friendly URLs that use
"/" instead of "&": because I use SSI I cannot use the POST method in forms,
and with GET it seems it will be a bit difficult to handle all the parameters
correctly - or am I wrong?
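
For example, if I understand the slash-syntax idea correctly, reading the
parameters could look roughly like the sketch below - this is only my guess
at how it might be done, not existing code from my site:

<?php
// Sketch: turn /index.php/en/23 back into the old lang / menu_id
// parameters. Assumes a fixed segment order: language first, then menu id.
$segments = array();
if (!empty($_SERVER['PATH_INFO'])) {
    $segments = explode('/', trim($_SERVER['PATH_INFO'], '/'));
}

$lang    = isset($segments[0]) ? $segments[0] : 'en';     // default language
$menu_id = isset($segments[1]) ? (int) $segments[1] : 0;  // default menu

// From here the page can be built exactly as with ?lang=en&menu_id=23.
?>

Would that be a workable direction?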

Thanks again.

>>Dear All,
>>
>>Does anybody have a solution that makes it possible to produce static
>>pages of a whole dynamic CMS once a day and that can be easily integrated
>>into an already-built site?

MM> Why do you need to do this? Is it because of hosting restrictions,
MM> performance concerns, or portability/mirroring (which is a form of hosting
MM> restriction, I suppose)?

MM> There are a number of ways to approach this problem...

MM> If your sole concern is performance, judicious use of caching could be
MM> your answer. You can cache your code using PHP Accelerator or Turck
MM> MMCache, which helps with load times, or you can cache your data by
MM> implementing a caching layer between your application and your database. I
MM> believe PEAR has some classes designed for this. They basically all boil
MM> down to memoizing function return values with the serialize/unserialize
MM> functions, and storing those results in files. I have used this method in
MM> applications to great effect - a cascading cache that stores database
MM> results, page components like navigation areas, and entire pages is a
MM> great performance enhancer, but you need to know how to mark and remove
MM> stale data dynamically.
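
If I understand the file-based memoization idea correctly, I imagine something
roughly like the sketch below - cache_call() and the /tmp/cache directory are
only names I made up for the example, not a real PEAR API:

<?php
// Rough sketch of memoizing a function's return value with
// serialize/unserialize, one file per cache entry.
function cache_call($key, $callback, $args, $ttl = 3600)
{
    $file = '/tmp/cache/' . md5($key);

    // Serve the stored value while it is still fresh.
    if (file_exists($file) && (time() - filemtime($file)) < $ttl) {
        return unserialize(file_get_contents($file));
    }

    // Otherwise compute the value, store it, and return it.
    $result = call_user_func_array($callback, $args);
    $fp = fopen($file, 'w');
    fwrite($fp, serialize($result));
    fclose($fp);

    return $result;
}

// Example: cache an expensive query result for one hour.
// $rows = cache_call('menu_23', 'load_menu_rows', array(23), 3600);
?>

Is that roughly the shape of it, leaving aside the stale-data marking you
mention?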

MM> If you need a static version of your site due to hosting restrictions, you
MM> can use a spider such as wget (which I think has been mentioned in this
MM> thread) to crawl your site and generate a local copy for you. Wget is an
MM> excellent choice, because it has options like --page-requisites and
MM> --convert-links which make it easy to generate a self-contained site
MM> mirror. This approach requires that your dynamic links all use a
MM> slash-syntax, like index.php/foo/bar/etc/. It's very easy to implement as
MM> a series of Make rules - I use this method for www.stamen.com, where
MM> rolling out a new version of the site is a simple matter of
MM> 'make clean; make live'.

MM> You can also use the 'page fault' method, which is my personal favorite.
MM> Let Apache's mod_rewrite handle your caching and URL-rewriting:
MM>         1) user requests page foo/index.html
MM>         2) if foo/index.html does not exist in filesystem, Apache knows to
MM>            redirect this request to bar.php
MM>         3) bar.php performs actions needed to generate contents of
MM>            foo/index.html, and also creates the file
MM>         4) bar.php returns contents of foo/index.html
MM>         5) subsequent requests for foo/index.html just return that file,
MM>            bypassing PHP entirely.
MM> This one's sort of a balancing act though. It has been suggested here that
MM> you can use Apache's ErrorDocument directive to direct the request to
MM> bar.php, but this has the unfortunate side-effect of returning a 404
MM> status code with the response. Not really a problem with a normal browser,
MM> but when those responses are (for example) XML files used by Flash, the
MM> 404 causes it to error out regardless of the content of the response. A
MM> better method is to use a string of RewriteCond's, like so:
MM>         RewriteCond     %{REQUEST_FILENAME}     !-f     [OR]
MM>         RewriteCond     %{REQUEST_FILENAME}     -d
MM>         RewriteCond     %{REQUEST_FILENAME}/index.html  !-f
MM>         RewriteRule     ^.*$    bar.php
MM> Obviously, this method is totally incompatible with any form of actual
MM> dynamic content, but you're asking for ways to generate static output, so
MM> I assume that's not an issue. The difficulty with this one is the same as
MM> with any caching system as above - finding and flushing stale data. I do
MM> this by rolling the cache deletion code into the editing functions, but
MM> you can also use a cronjob to find and flush files older than some cutoff
MM> time period.
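
And just to check that I understand the 'page fault' method: bar.php would do
roughly what this sketch does? The $cache_root path and the build_page()
function are only placeholders I invented for the example:

<?php
// Sketch of the 'page fault' handler: build the missing static file,
// save it so Apache can serve it directly next time, and return it now.

// Placeholder for whatever actually produces the HTML for this URL.
function build_page($path)
{
    return "<html><body>Generated page for $path</body></html>";
}

$cache_root = '/var/www/html';            // document root (assumed)

$uri  = parse_url($_SERVER['REQUEST_URI']);
$path = $uri['path'];
if (substr($path, -1) == '/') {
    $path .= 'index.html';                // directory request -> index file
}

$html = build_page($path);

// Write the file so that subsequent requests bypass PHP entirely.
// (A real version would also make sure the target directory exists.)
$fp = fopen($cache_root . $path, 'w');
fwrite($fp, $html);
fclose($fp);

// Return the same content for this first request.
echo $html;
?>

Then editing a page would simply delete the corresponding file, as you
describe.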

MM> ---------------------------------------------------------------------
MM> michal migurski- contact info and pgp key:
MM> sf/ca            http://mike.teczno.com/contact.html

