Brendan and I have been exchanging some emails about a possible solution for implementing off-line web apps. This mail describes the basic concept. I'm all for any input on the easiest way to achieve this using the Mozilla code base. I'm working towards a solution that could be shipped with Firefox if all the kinks can be worked out.

----------------------------------------------
The concept is fairly simple: you run a proxy server on the local machine, and Mozilla is set up to always talk to the local proxy server. The proxy server also implements FastCGI. Domains can download sandboxed Javascript and run it as a FastCGI app off the proxy server. The proxy server will intercept requests for the domain and route them through the FCGI app first before sending them on out to the Internet. This model provides a transparent solution for offline web apps. Domains can be assigned space and given access to storage or SQLite. You could build a local version of Gmail that would have exactly the same UI as the Internet one. When reconnecting to the Internet, the app can sync up with the main servers.

The proxy server is always running; it is a separate app from the browser. This provides for long-running programs like RSS polling or a chat client. It can also be used to manage things like replication of large amounts of photos/music back to web servers. The proxy server can be designed to use very little memory when not active.

Another idea would be to split web apps along an MVC (model, view, controller) architecture. The model code stays back on the main server and the controller code is migrated into the proxy sandbox if one is available. The MVC split is much more powerful than the current AJAX system since it provides a clean way to save a lot of state info.

This design is well suited to the XULrunner / Firefox shift planned for next year. With a XULrunner-based Firefox it's not too hard to build the proxy server and FastCGI layer using the XULrunner components. The separate proxy model can equally serve IE, Mozilla, or any other browser.

What do you think? The concept is a blend of Net Nanny and Akamai's edge services. Is this something worth pursuing, or has it already been tried and I never noticed?

Brendan points out BEA's Alchemy proposal... so I go read about it. Alchemy is beating around the same bush, but it's not quite the same concept. He didn't generalize enough: he structures the data store and synchronization and assumes you want a data store and synchronization. He also talks about calling back to web services. It is not clear whether he intended for any server to download code into Alchemy or if the code has to come via BEA.

I was working from an entirely general viewpoint. The proxy talks standardized FCGI to the local FCGI app. This app is migratory: it can run on the server or on the client. I posted the code for serving FCGI via xpcshell on the js-eng list. Each domain is isolated into its own process, which makes it easy to apply cross-site scripting controls. There is no required structure for the FCGI code; it can access the safe Javascript/XPCOM API and do what it wants. There needs to be a UI in the browser for limiting how much disk space the FCGI app can consume, but the app can do what it wants with that disk space - keep logs or run a SQLite db in it. If you want to synchronize, supply your own code. You can always choose to install non-safe FCGI app versions too.

Something like xmlhttprequest can programmatically talk to the server from the FCGI app, or the app can just open a socket to the server, assuming the cross-site scripting rules are applied. Some apps may choose to run entirely on the local system. Offline apps need to have a special mode for when they can't contact the remote server.

--
Jon Smirl
[EMAIL PROTECTED]
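For concreteness, a minimal sketch of the routing decision described above, i.e. how the local proxy might decide whether a request gets handed to a domain's sandboxed FCGI app or passed straight through. The registry layout, the domain, the port, and the quota figure are all hypothetical:

// Hypothetical registry of domains that have installed a local FCGI app.
// The proxy consults it for every request it intercepts.
var localApps = {
  "mail.example.com": { fcgiPort: 9000, quotaBytes: 50 * 1024 * 1024 }
};

// Decide how to route one request. Returns a plain description object so
// the sketch stays self-contained; a real proxy would open the connection.
function routeRequest(host, path) {
  var app = localApps[host];
  if (app) {
    // The domain has a local app: hand the request to its FCGI process
    // first and let the app decide whether to go out to the Internet.
    return { via: "local-fcgi", port: app.fcgiPort, host: host, path: path };
  }
  // No local app for this domain: act as an ordinary pass-through proxy.
  return { via: "direct", host: host, path: path };
}

// routeRequest("mail.example.com", "/inbox") -> routed through port 9000
// routeRequest("www.mozilla.org", "/")       -> passed straight through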
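Since the proxy and the app speak standard FastCGI to each other, most of the plumbing is reading and writing 8-byte record headers. A small sketch in plain Javascript over arrays of byte values; the constants come from the FastCGI 1.0 spec, and the two helper names are made up:

// FastCGI record types from the FastCGI 1.0 spec.
var FCGI_BEGIN_REQUEST = 1;
var FCGI_END_REQUEST   = 3;
var FCGI_PARAMS        = 4;
var FCGI_STDIN         = 5;
var FCGI_STDOUT        = 6;

// Parse the fixed 8-byte record header starting at 'offset'. Returns null
// if fewer than 8 bytes are buffered, otherwise the header fields plus the
// total size of the record (header + content + padding) so the caller
// knows how much more to read before the next record starts.
function parseFcgiHeader(bytes, offset) {
  if (bytes.length - offset < 8)
    return null;
  var contentLength = (bytes[offset + 4] << 8) | bytes[offset + 5];
  var paddingLength = bytes[offset + 6];
  return {
    version:       bytes[offset],          // always 1 for FastCGI 1.0
    type:          bytes[offset + 1],      // e.g. FCGI_PARAMS, FCGI_STDIN
    requestId:     (bytes[offset + 2] << 8) | bytes[offset + 3],
    contentLength: contentLength,
    paddingLength: paddingLength,
    totalLength:   8 + contentLength + paddingLength
  };
}

// Build the matching header for a response record (FCGI_STDOUT etc.),
// with no padding.
function makeFcgiHeader(type, requestId, contentLength) {
  return [1, type,
          (requestId >> 8) & 0xff, requestId & 0xff,
          (contentLength >> 8) & 0xff, contentLength & 0xff,
          0, 0];
}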
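On the serving side, a rough sketch of how an FCGI app could accept connections from the proxy inside xpcshell using the stock nsIServerSocket component. This is only an outline, not the code that was posted to js-eng, and the port number is arbitrary:

var Cc = Components.classes, Ci = Components.interfaces;

var listener = {
  onSocketAccepted: function (serverSocket, transport) {
    // Wrap the raw streams; the FCGI record handling sketched above would
    // be driven from here: read records, run the app, write FCGI_STDOUT.
    var rawInput = transport.openInputStream(0, 0, 0);
    var output   = transport.openOutputStream(0, 0, 0);
    var input    = Cc["@mozilla.org/scriptableinputstream;1"]
                     .createInstance(Ci.nsIScriptableInputStream);
    input.init(rawInput);
    // ... parse FCGI records from 'input', write responses to 'output' ...
  },
  onStopListening: function (serverSocket, status) {},
  QueryInterface: function (iid) {
    if (iid.equals(Ci.nsIServerSocketListener) || iid.equals(Ci.nsISupports))
      return this;
    throw Components.results.NS_ERROR_NO_INTERFACE;
  }
};

var server = Cc["@mozilla.org/network/server-socket;1"]
               .createInstance(Ci.nsIServerSocket);
// true = loopback only, so nothing off the local machine can reach the app
// directly; -1 asks for the default connection backlog.
server.init(9000, true, -1);
server.asyncListen(listener);
// (the surrounding xpcshell script has to keep its event loop alive so the
//  async callbacks above can fire)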
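And for the "sync up with the main servers" step, the app could create the same XMLHttpRequest component the browser uses directly through XPCOM, since there is no window object in the sandbox. A sketch only; the URL and payload are placeholders, and a real app would use the asynchronous form:

var req = Components.classes["@mozilla.org/xmlextras/xmlhttprequest;1"]
                    .createInstance(Components.interfaces.nsIXMLHttpRequest);
// Synchronous for brevity; the cross-site scripting rules the proxy applies
// to the sandbox would still govern which hosts this can reach.
req.open("POST", "https://mail.example.com/sync", false);
req.send("changes-since-last-sync would go here");
if (req.status == 200) {
  // Apply the server's answer to the local SQLite store, mark items as
  // synced, schedule the next poll, etc.
}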
