On Wednesday, 30 April 2014 at 15:27:48 UTC, Nick Sabalausky wrote:
> That definitely is the direction things are moving right now. Granted, I don't like it, but you're right it's undoubtedly the popular direction and it's unlikely to slow or reverse anytime soon.

I'm not sure if I like it either, but I think websites are getting more usable now. For a while it was a shitty, stuttering mess of HTML and JS that made me long for an AntiWeb browser with community-maintained, per-site AI that turns the horrible HTML mess into semantic markup you can style yourself. I actually have a file called antiweb.d here… ;)

I also had high hopes for XSLT. I remember requiring student projects to serve XML from the server and transform it to HTML using XSLT in the browser, back in 2002 or so. And XSLT support was actually quite good, at least until the mobile shit hit the fan. Unfortunately, connections were still slow, so XSLT-based rendering had to wait until the whole XML document was downloaded. Today I think it might work out quite nicely, but I doubt anyone even remembers that browsers can do XSLT. Actually, I am not even sure whether they all still support it.
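In case anyone has forgotten how the mechanism worked, here is a minimal sketch (file and element names invented): the server sends plain XML, the xml-stylesheet processing instruction points at the stylesheet, and the browser runs the transform itself and renders the resulting HTML.

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="comments.xsl"?>
    <comments>
      <comment author="Nick">Nice post!</comment>
    </comments>

    <!-- comments.xsl: turns the XML above into HTML you can style yourself -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/">
        <html><body>
          <xsl:for-each select="comments/comment">
            <p><b><xsl:value-of select="@author"/>:</b>
               <xsl:value-of select="."/></p>
          </xsl:for-each>
        </body></html>
      </xsl:template>
    </xsl:stylesheet>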

The weird thing is that SEO and search-engine priorities are what keep dynamic websites from going fully dynamic, through their anti-dynamic measures (punishing non-static content), while the same search engines are also the ones pushing semantic markup such as itemscope/itemprop microdata.
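For reference, the microdata they are pushing looks roughly like this (schema.org vocabulary; the headline and author values here are of course made up):

    <article itemscope itemtype="https://schema.org/BlogPosting">
      <h1 itemprop="headline">Why XSLT deserved better</h1>
      <span itemprop="author">Some Author</span>
      <time itemprop="datePublished" datetime="2014-04-30">30 April 2014</time>
    </article>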

On the other side of the fence, the WordPress authors have a lot of power. Whatever WordPress makes easy will dominate a large portion of the web. I think that is so sad, because the WordPress codebase is… err… junk. I am not even going to use the term «a pile of junk», which would suggest that there is some sense of structure to it. I think it is more like a scattered, mutating spaghetti dinner gone terribly wrong, slowly emerging from every toilet in every household and taking over the earth… like in the classic horror movies from the 1950s.

> JS can definitely help improve the UX of form validation, no doubt about that, but it's important to remember that server-side validation is still necessary anyway, regardless of what you do on the client.

Yup. So a must-have is a good infrastructure for specifying database invariants and transactions. Ideally it should execute like a stored procedure, leaving the server logic pretty clean.
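A rough sketch of what I mean, in PostgreSQL flavour (table and column names invented): the invariant lives in the schema as a CHECK constraint, and the write runs as a stored procedure, so the whole thing either commits or rolls back without any validation logic in the server code.

    -- Invariant: a balance can never go negative.
    ALTER TABLE accounts
      ADD CONSTRAINT balance_nonnegative CHECK (balance >= 0);

    -- The whole transfer executes inside one transaction; if the CHECK
    -- fails on the first UPDATE, both updates are rolled back.
    CREATE OR REPLACE FUNCTION transfer(from_id int, to_id int, amount numeric)
    RETURNS void AS $$
    BEGIN
      UPDATE accounts SET balance = balance - amount WHERE id = from_id;
      UPDATE accounts SET balance = balance + amount WHERE id = to_id;
    END;
    $$ LANGUAGE plpgsql;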

> What I *do* love is having a canonical table defining my entire HTTP interface in one easy location. The extra typing or non-DRYness of that is a mere triviality in my experience (and I'm normally a huge DRY buff).

Yep, and it also acts as always-up-to-date documentation when you come back to the source code 6 months later trying to figure out the main structure.
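Something like this, I assume (a vibe.d-flavoured sketch; the routes and handler names are invented): every endpoint is registered in one place, so the table doubles as a map of the whole interface.

    import vibe.vibe;  // assuming vibe.d; any router-based framework looks similar

    void getArticle(HTTPServerRequest req, HTTPServerResponse res)  { /* ... */ }
    void getComments(HTTPServerRequest req, HTTPServerResponse res) { /* ... */ }
    void postComment(HTTPServerRequest req, HTTPServerResponse res) { /* ... */ }

    void main()
    {
        // The canonical table: the entire HTTP interface in one place.
        auto router = new URLRouter;
        router.get ("/articles/:id",          &getArticle);
        router.get ("/articles/:id/comments", &getComments);
        router.post("/articles/:id/comments", &postComment);

        auto settings = new HTTPServerSettings;
        settings.port = 8080;
        listenHTTP(settings, router);
        runApplication();
    }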

> So unless the page's rate of comment "submissions/edits" approaches the rate of comment "views" (unlikely...except maybe on YouTube ;) ), then it's best to re-generate upon posts/edits and then cache that. So you still get caching benefits, but with no need to make *all* the clients duplicate the exact same page-generating effort as each other upon every viewing.

Well, I would probably use JS… ;-)

But I am pragmatic. Caching and pregeneration can lead to bugs and complications. So it is usually a good idea to just do a dynamic version first and then add caching when needed.
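As a sketch of the regenerate-on-write idea you describe (plain D, names invented): the page is rendered once per edit rather than once per view, and the dynamic path stays in place as the fallback, so you can bolt the cache on afterwards.

    // Minimal regenerate-on-write cache: render on change, serve from memory.
    string[string] pageCache;   // URL path -> rendered HTML

    string renderCommentsPage(string articleId)
    {
        // The "dynamic version": pull comments from the database and render.
        // (Stubbed out here.)
        return "<html><!-- comments for " ~ articleId ~ " --></html>";
    }

    // Called on every post/edit: regenerate once, so readers get a cached page.
    void onCommentChanged(string articleId)
    {
        pageCache["/articles/" ~ articleId] = renderCommentsPage(articleId);
    }

    // Called on every view: cheap lookup, falling back to dynamic rendering.
    string serve(string path, string articleId)
    {
        if (auto cached = path in pageCache)
            return *cached;
        return renderCommentsPage(articleId);   // first hit or cache cleared
    }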

I also sometimes use a dynamic template during development, and then just save a static version for release if I know that it won't change.

> I'll take those, plus a large vanilla latte, please. :) "Thank you, come again!"

You're welcome!

I think it is realistic now for smaller sites (say 1-8 servers) where you have enough RAM to hold perhaps 10 times the information the site will ever provide. One should be able to sync 8 servers that have relatively few write operations easily. So reading the log might take some time during startup, but with an efficient format… it could probably complete quickly for 1 GB of data.
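Rough sketch of what I have in mind, assuming an append-only log of writes (the text format here is invented; a binary one would be faster): every server holds the full dataset in RAM and only replays the log at startup, so with few writes, keeping 8 machines in sync is mostly a matter of shipping log entries.

    import std.stdio, std.array;

    string[string] store;   // the whole dataset, kept in RAM

    // Replay an append-only log of "SET <key> <value>" lines at startup.
    // With an efficient format this should be quick even for ~1 GB.
    void replayLog(string path)
    {
        foreach (line; File(path).byLine)
        {
            auto parts = line.idup.split(" ");
            if (parts.length >= 3 && parts[0] == "SET")
                store[parts[1]] = parts[2 .. $].join(" ");
        }
    }

    // A write goes to the local log and is shipped to the other servers,
    // which apply the same entry to their own in-memory copy.
    void applyWrite(File log, string key, string value)
    {
        log.writefln("SET %s %s", key, value);
        store[key] = value;
    }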
