Howdy!

On 2011-04-10 19:06:52 -0700, Ian Bicking said:

There's a significant danger that you'll be creating a configuration management tool at that point, not simply a web application description.

Unless you have the tooling to manage the applications, there's no point having a "standard" for them. Part of that tooling will be some form of configuration management allowing you to determine the requirements and configuration of an application /prior/ to installation. Better to have an application rejected up-front ("Hey, this needs my social insurance number? Hells no!") than after it's already been extracted and potentially littered the landscape with its children.
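To make that concrete, here's a minimal sketch of vetting an application's declared requirements before unpacking anything. The "requires" metadata key and the deny list are illustrative assumptions on my part, not part of any proposed spec:

```python
# Hypothetical pre-installation vetting: the "requires" key and the deny
# list are illustrative assumptions, not a proposed standard.
DENIED_REQUIREMENTS = {"social-insurance-number", "raw-disk-access"}

def vet_application(metadata):
    """Return (accepted, refused) for an application's declared needs."""
    requested = set(metadata.get("requires", []))
    refused = requested & DENIED_REQUIREMENTS
    return (not refused, sorted(refused))
```

The point is only that rejection happens before a single file is extracted.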

The escape valve in Silver Lining for this sort of thing is services, which can implement more or less anything, and presumably ad hoc services could be allowed for.

Generic services are useful, but not useful enough.

You create a build process as part of the deployment (and development and everything else), which I think is a bad idea.

Please elaborate. There is no requirement for you to use the "application packaging format" and associated tools (such as an application server) during development. In fact, like 2to3, that type of process would only slow things down to the point of uselessness. That's not what I'm suggesting at all.

My model does not use setup.py as the basis for the process (you could build a tool that uses setup.py, but it would be more a development methodology than a part of the packaging).

I know. And the end result is that you may have to massage .pth files yourself. If a tool requires you to hand-modify its internal files at any point during "normal operation", that tool has failed at its job. One does not go mucking about in a Git repo's .git/ folder, for example.

How do you build a release and upload it to PyPI? Upload docs to packages.python.org? setup.py commands. It's a convenient hook with convenient access to the package metadata, which would make an excellent "let's make a release!" type of command.
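As a sketch of that hook: the "release" command and its behaviour below are my invention, but the Command machinery itself is standard setuptools/distutils.

```python
from setuptools import Command

class release(Command):
    """Hypothetical 'python setup.py release' command (a sketch, not a spec)."""
    description = "build an sdist and upload it in one step"
    user_options = []  # no command-line options in this sketch

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        # self.distribution exposes the metadata declared in setup()
        self.announce("releasing %s %s" % (self.distribution.get_name(),
                                           self.distribution.get_version()))
        self.run_command("sdist")
        self.run_command("upload")
```

Wired up with setup(cmdclass={"release": release}), it runs as "python setup.py release" with full access to the declared metadata.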

Also lots of libraries don't work when zipped, and an application is typically an aggregate of many libraries, so zipping everything just adds a step that probably has to be undone later.

Of course it has to be undone later. I thought I had made that quite clear in the gist. (Core Operation, point 1, possibly others.)

If a deploy process uses zip file that's fine, but adding zipping to deployment processes that don't care for zip files is needless overhead.  A directory of files is the most general case.  It's also something a developer can manipulate, so you don't get a mismatch between developers of applications and people deploying applications -- they can use the exact same system and format.

So, how do you push the updated application around? Using a full directory tree leaves you with rsync and SFTP, possibly various SCM methods; but then you'd need a distinct repo (or rootless branch) just for releasing, and you've already mentioned your dislike for SCM-based deployment models.

Zip files are universal -- to the point that most modern operating systems treat zip files /as folders/. If you have to, consider it a transport encoding.
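The round trip is trivial with nothing but the standard library. A sketch of zip-as-transport (the paths are throwaway temp directories, the file contents a toy example):

```python
import os
import shutil
import tempfile

# Build a toy "application directory" to ship.
src = tempfile.mkdtemp()
with open(os.path.join(src, "app.py"), "w") as fh:
    fh.write("print('hello')\n")

# Transport encoding: zip the tree...
archive = shutil.make_archive(
    os.path.join(tempfile.mkdtemp(), "myapp"), "zip", src)

# ...and decode on arrival; the zip is never the working format.
dest = tempfile.mkdtemp()
shutil.unpack_archive(archive, dest, "zip")
```

The directory on the far side is byte-for-byte what the developer had, which is the whole point.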

The pattern that it implements is fairly simple, and in several models you have to lay things out somewhat manually.  I think some more convention and tool support (e.g., in pip) would be helpful.

+1

Though there are quite a few details, the result is more reliable, stable, and easier to audit than anything based on a build process (which any use of "dependencies" would require -- there are *no* dependencies in a Silver Lining package, only the files that are *part* of the package).

It might be just me (and the other people who seem to enjoy WebCore and Marrow), but it is entirely possible to do install-time dependencies in such a way that things won't break accidentally. Also, you missed Application Spec #4.

Some notes from your link:

- There seems to be both the description of a format, and a program based on that format, but it's not entirely clear where the boundary is.  I think it's useful to think in terms of a format and a reference implementation of particular tools that use that format (development management tools, like installing into the format; deployment tools; testing tools; local serving tools; etc).

Indeed; this gist was a set of really quickly hacked-together ideas.

- In Silver Lining I felt no need at all for shared libraries.  Some disk space can be saved with clever management (hard links), but only when it's entirely clear that it's just an optimization.  Adding a concept like "server-packages" adds a lot of operational complexity and room for bugs without any real advantages.

±0

- I try to avoid error conditions in the deployment, which is a big part of not having any build process involved, as build processes are a source of constant errors -- you can do a stage deployment, then five minutes later do a production deployment, and if you have a build process there is a significant chance that the two won't match.

I have never, in my life, encountered that particular problem. I may be more careful than most in defining dependencies with version number boundaries, I may be more careful in utilizing my own package repository (vs. the public PyPI), but I don't think I'm unique in having few to no issues in development/sandbox/production deployment processes.
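For what it's worth, "careful" here just means explicit lower /and/ upper bounds in the requirement specifiers. The version numbers below are purely illustrative:

```
# Illustrative pins only; the idea is explicit boundaries, not these numbers.
WebCore>=1.1,<1.2
SQLAlchemy>=0.6,<0.7
```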

Hell, I'm still able to successfully deploy a TurboGears 0.9 application without dependency issues.

However, the package format I describe in that gist does include the source for the dependencies as "snapshotted" during bundling. If your application is working in development, after snapshotting it /will/ work on sandbox or production deployments.
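A hedged sketch of what that snapshotting step could look like with today's tooling. The bundle layout and function names are mine, not the gist's; pip's --target and --no-deps flags are real:

```python
import subprocess
import sys

def snapshot_command(requirements, bundle_lib):
    """Build the pip invocation that vendors pinned deps into the bundle.

    bundle_lib is an assumed layout: a lib/ directory shipped inside
    the application bundle itself.
    """
    return [
        sys.executable, "-m", "pip", "install",
        "--target", bundle_lib,  # install into the bundle, not site-packages
        "--no-deps",             # the requirements file is already fully pinned
        "-r", requirements,
    ]

def snapshot(requirements, bundle_lib):
    """Run the vendoring step at bundle time; deployment then needs no index."""
    subprocess.check_call(snapshot_command(requirements, bundle_lib))
```

Once the sources are inside the bundle, the deployment host never touches an index, which is why a working development snapshot keeps working in sandbox and production.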

        — Alice.


_______________________________________________
Web-SIG mailing list
Web-SIG@python.org
Web SIG: http://www.python.org/sigs/web-sig
