Robert Atkey wrote:
Links itself, running in another process, wouldn't interfere with itself, as long as it followed the policy of hashing the complete pathname. Other software shouldn't really touch the same files, since the filenames would be "unguessable": they'd carry (say) a prefix unique to Links, or even unique to the Links version. Also, each user (each human user, as opposed to each Unix user) could point Links at their own caching directory if they were worried about collisions.
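A minimal sketch of that naming policy, for the sake of discussion: hash the complete source pathname and prefix the digest with a version-specific string. All names here (the prefix, the cache directory, the function) are my own assumptions, not anything Links actually does.

```python
# Hypothetical sketch of the cache-naming policy described above:
# hash the complete (absolute) pathname of the source file and prefix
# the digest with a string unique to this Links version, so different
# versions and different source files never collide on a cache name.
import hashlib
import os

LINKS_VERSION_PREFIX = "links-cache-0.4-"  # assumed version-unique prefix

def cache_filename(source_path, cache_dir="/tmp/links-cache"):
    """Map a Links source file's full pathname to its cache filename."""
    digest = hashlib.sha1(os.path.abspath(source_path).encode()).hexdigest()
    return os.path.join(cache_dir, LINKS_VERSION_PREFIX + digest)
```

The point is just that the name is deterministic per (version, pathname) pair, so two Links processes compiling the same file agree on the cache entry, while other software has no reason to guess it.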

> I'm not sure your scheme would make it unguessable:

I meant "unguessable" in the sense that the name would contain some "random" string, making it hard to hit without knowing what it was; "unguessable" is probably the wrong word for this.

I see what you mean about it being easy to write a CGI program that does all the peeking and poking for you under Apache's cover. I'm not really concerned with protecting against that.

> OTOH there are probably more security problems with a group of mutually
> untrusting users all sharing a single Unix user than just overwriting
> each other's cache files.

Yes.

> Maybe another solution is to have the user do the caching by hand: a
> command-line Links compiler that outputs the cached versions, and have
> the user give those to the webserver rather than the source code? They
> would then just have to make sure the compiled version was not
> world- or webserver-writable.
I thought of this too, and I like it. The downside is that it adds a step to the edit-run cycle, so I'd like to make sure there's a way to skip it.
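The safety property that compile-ahead-of-time scheme relies on can be checked mechanically: the precompiled file handed to the webserver must not be writable by group or others (i.e. not by the webserver's Unix user). A sketch of that check, with the function name my own invention:

```python
# Sketch of the permission check for a precompiled cache file:
# it is only safe to serve if nobody but its owner can write it,
# so the webserver user cannot substitute a different compiled version.
import os
import stat

def safe_to_serve(compiled_path):
    """Return True iff the file is writable only by its owner."""
    mode = os.stat(compiled_path).st_mode
    return not (mode & (stat.S_IWGRP | stat.S_IWOTH))
```

One could imagine the hypothetical command-line compiler refusing to emit (or Links refusing to load) a cache file that fails this test, which would catch the misconfiguration before it becomes exploitable.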

Thanks everyone,
Ezra



--
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.

_______________________________________________
links-users mailing list
[email protected]
http://lists.inf.ed.ac.uk/mailman/listinfo/links-users