On Mon, Jun 10, 2013 at 11:05 AM, Jürgen Herrmann <shadow...@me.com> wrote:
> Hmm...
>
> First of all we should assume that Linux users won't usually run blender as
> root or in a sudo environment. So a "rm / -rf" attack shouldn't work by
> default.

Not from a full-OS-attack perspective, but it would wipe out any file/directory writable by the running user. For personal blender users, this would just make them hate blender and wish they had made backups.. but for users in shared environments (maybe with mounted drives that are group writable) a single blender user might wipe out significant parts of their entire team's work. Having it download and install botnet clients would be more likely (given that it seems to be the fad nowadays).

> We should start small and look at the PGP signing possibility first. This
> could also incorporate a third party signing process. For instance some
> signing authority sort of thing.

Something like adding a "SIG1" data block at the [near] end of the .blend file that signs all data before it (any blocks following it, except END, would either be ignored or cause the signature to be invalid). It would be ignored by older versions of blender, like any other unknown block, so it stays backward compatible. Possibly something based on a subset of S/MIME [RFC3851] or OpenPGP [RFC4880] (so an existing standard can be followed).

The biggest [legal] problem might be the exportability of blender with any bundled crypto libs (you know those pesky governments/regulations). While not as clean, having it hand off the data and signature parts to an external program, if available (one that can be downloaded and installed alongside blender, keeping blender itself untainted), is possible. Can PGP [the implementing application(s)] be used this way, including blender-specific/driven key management? And would CAs be needed, or are bare entity public keys (I get the impression PGP doesn't use CAs) enough?
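To make the trailer idea concrete, here is a minimal sketch of what appending and checking a "SIG1" block could look like. The block layout (magic + length + signature bytes) is my own assumption, not a spec, and HMAC-SHA256 stands in for the real public-key signature an OpenPGP/S-MIME-based scheme would use:

```python
import hmac
import hashlib
import struct

MAGIC = b"SIG1"  # hypothetical block code from the proposal above

def append_sig1(blend_bytes: bytes, key: bytes) -> bytes:
    """Append a SIG1 trailer covering all bytes before it.
    HMAC-SHA256 is a stand-in for a real public-key signature."""
    sig = hmac.new(key, blend_bytes, hashlib.sha256).digest()
    block = MAGIC + struct.pack("<I", len(sig)) + sig
    return blend_bytes + block

def verify_sig1(data: bytes, key: bytes) -> bool:
    """Locate the SIG1 trailer and verify it signs everything before it.
    (A real implementation would walk the .blend block headers instead
    of searching for the magic bytes.)"""
    idx = data.rfind(MAGIC)
    if idx < 0:
        return False  # unsigned: file still loads in old blenders, just untrusted
    (siglen,) = struct.unpack_from("<I", data, idx + 4)
    sig = data[idx + 8 : idx + 8 + siglen]
    expected = hmac.new(key, data[:idx], hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)
```

Since an unknown block code is skipped by older readers, an unsigned or old file simply fails verification rather than failing to open, which matches the backward-compatibility point above.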
Also, assuming a self-key is automatically generated by blender for each user (so they can implicitly sign anything they've created), settings for whether to auto-sign (and which key to use), whether to auto-trust their own key or not (i.e. they open something untrusted, make a small change, and re-save it with their signature), and options to export/import their (or really any) key(s) from one blender install to another would need to be accounted for.

Oh, and if the original comment just meant to create a second signature file (e.g. .blend.sig), that might be simple for a "quick implementation", but it then forces users to always upload/download two files (when signed). Also, if you rename/move the .blend, you have to do the same for any signature file. Hassles like that might cause many to opt out (i.e. just ignore the .sig), which puts things back where they started (only with users complaining more when legit scripted blender files don't work right because the validation was skipped).

While a Blender Foundation public key could be bundled with blender easily.. how would any .blend or addons (if signing for them is done [at some point]) be signed? Since not just any developer would have access to the BF private signature key, presumably only a _very_ few would do any "official" signing.. but if signing had to be done somewhere between compiling and building the distro file, how would this work for each developer that produces official builds for their platform(s)? If per-addon signature data was just kept in a static file (e.g. manifest.sig) for each file in the addon (think signed-jars style), then I suppose the "holder of the keys" would just do a batch signing of all addons just before a release (for any out of date, at least).

Speaking of that.. it might be nice to have a standalone cmdline program for batch .blend signing.

Also.. in any case, any content being validated _must_ be the same data read by blender itself.
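One way to honor that "same data" constraint, sketched under assumptions (a bare SHA-256 digest stands in for the real signature check; the function name is hypothetical): read the file exactly once, verify the in-memory bytes, and hand that same buffer to the loader, never the path.

```python
import hashlib

def load_verified(path: str, expected_sha256: str) -> bytes:
    """Single-read load: the bytes that are verified are the exact
    bytes that get parsed, so nothing can swap the file between the
    check and the use (the classic time-of-check/time-of-use race)."""
    with open(path, "rb") as f:
        data = f.read()  # one read; never re-open the path afterwards
    if hashlib.sha256(data).hexdigest() != expected_sha256:
        raise ValueError("digest mismatch; refusing to load")
    return data  # the .blend parser must consume this buffer, not the path
```

An external verifier that opens the path, says "OK", and then lets blender open the same path a second time is exactly the racy pattern warned about next.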
Having another program re-read the "same" blender file directly to validate it is not secure (such race conditions are simple enough to write attacks for). And while it seems like anything that can manipulate files to exploit such race conditions has _already_ compromised that system, it could be attempting to exploit a different user or even a different machine (via a shared filesystem with an infected machine).

So.. there are some thoughts and warnings in case signing gets added. Everyone can wake up now after reading all that boring, technical mumbo jumbo.

-Chad

_______________________________________________
Bf-committers mailing list
Bf-committers@blender.org
http://lists.blender.org/mailman/listinfo/bf-committers