That's more or less the caching architecture used for the fat client. Great thoughts nonetheless; I wonder why it wasn't extended to the thin client.

Joe

-----Original Message----- From: John Baker Sent: Sunday, March 04, 2012 4:18 AM Newsgroups: public.remedy.arsystem.general
To: arslist@ARSLIST.ORG
Subject: Automate Cache Flush - RANT

Andrew,

I don't understand why this problem hasn't been solved either, given that it's not difficult to solve. At JSS, we have to restart AR System and Mid Tier when performing WebEx installations of SSO Plugin, and we often spend more time waiting for AR System to restart than we do installing the product. Whilst we can't fix AR System, I did ponder how best to resolve Mid Tier's caching issues, and stumbled upon a solution when I recently wrote to ARSlist suggesting workflow should be cached as flat files.

Every form has a set of dependencies and a last modified time, and therefore each form should be held locally as a JavaScript file. Consider the home page; the JavaScript on my Mid Tier is served by the following URL:

http://host:8080/arsys/forms/192.168.0.54/Home+Page/Default+Admin+View/form.js/3d2a292f.js?format=html

(The URL is bizarre: format=html, yet the content is JavaScript?)

You can log in to Mid Tier, go to the home page, then open another tab and paste in this URL to see the JS:

http://host:8080/arsys/forms/192.168.0.54/Home+Page/Default+Admin+View/form.js

I've looked at the contents of my Mid Tier installation and I cannot find this JS file, so I can only assume it's being generated on the fly, and hence the workflow is being held in memory. And that seems to describe your problem: Mid Tier is loading all of ITSM into memory, which is extremely inefficient and results in huge VM sizes.

The solution is remarkably simple: build a local cache of JavaScript files for each form/view as a user navigates ITSM. Once the JavaScript file has been written to disc, Mid Tier only needs to check the last update time on the form and its associated workflow before serving the JavaScript (in development mode), or simply serve it without checking for workflow changes (in production).
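To make the idea concrete, here is a minimal sketch of that scheme in Java. Everything here is illustrative: FormJsCache, generateJs and the formLastModified parameter are hypothetical names, not Mid Tier APIs; generateJs stands in for the existing workflow-to-JavaScript engine.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

/**
 * Sketch of a disk-backed cache for generated form JavaScript.
 * In development mode, a cached file is regenerated when the form's
 * last modified time is newer than the cached file; in production,
 * a cached file is served without any staleness check.
 */
public class FormJsCache {
    private final Path cacheDir;
    private final boolean developmentMode;

    public FormJsCache(Path cacheDir, boolean developmentMode) {
        this.cacheDir = cacheDir;
        this.developmentMode = developmentMode;
    }

    /** Serve the JS for a form/view, regenerating only when missing or stale. */
    public String serve(String form, String view, long formLastModified)
            throws IOException {
        Path cached = cacheDir.resolve(form + "." + view + ".js");
        boolean regenerate = !Files.exists(cached)
                || (developmentMode
                    && Files.getLastModifiedTime(cached).toMillis() < formLastModified);
        if (regenerate) {
            Files.createDirectories(cacheDir);
            Files.writeString(cached, generateJs(form, view));
        }
        return Files.readString(cached);
    }

    /** Placeholder for the existing workflow-to-JavaScript engine. */
    String generateJs(String form, String view) {
        return "/* generated for " + form + "/" + view + " */";
    }
}
```

The point of the sketch is that the expensive step (generateJs) runs at most once per form/view, the files survive a restart, and the production path is a single file read.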

With such a small change in design, you would see the following benefits:

* Instant Mid Tier start-up times, no "pre-caching", a low memory footprint (think 256MB VMs), much better performance, etc.

* Cached JavaScript could be moved from Mid Tier to Mid Tier, so your problem with 2+ Mid Tiers would go away.

* Transparency. You could delete a form's JavaScript file, confident that it has been removed from the cache (because Mid Tier would notice it was missing and rebuild it).

* Provision for a decent debugger. If the workflow-to-JavaScript engine were improved so that well-formed, readable JavaScript is written, you could attach Firebug or one of the many JavaScript debuggers and start debugging workflow with the same level of precision as developers on other platforms. Sure, this is possible right now, but you can't go and modify the JavaScript.

* Writing workflow without writing workflow: if you've got local JavaScript, dab hands would be able to quickly try out changes without having to open Developer Studio. When they're happy with the changes, they can enter them as normal.

The irony is that BMC are 90% of the way there, because they've got all the pieces of the puzzle: they just need to re-write the caching routine to store files locally, and none of this is terribly difficult.


John

--
SSO Plugin for BMC ITSM, ITBM, Dashboards and Analytics
http://www.javasystemsolutions.com
