On Thu, Apr 11, 2024 at 05:52:47PM +0200, Christoph Anton Mitterer wrote:
> Package: apt
> Version: 2.7.14
> Severity: normal
> Tags: security
> 
> Hey.
> 
> I noted the following behaviour - which may or may not be regarded as
> security relevant.
> So this is rather a heads-up, and in case you think it's fine as it is,
> just close it.
> 
> I always remembered that apt-get source was supposed to verify hashes of
> the downloaded files (i.e. secure APT, as signed by the repo).
> 
> Likewise, I thought I remembered that, at least at one point in time,
> downloads of binary packages (via e.g. apt-get download or aptitude
> download) were NOT verified.
> Because of that I never trusted these, which was quite unhandy.
> 
> So I thought I'd simply test that (using a local package repo and simply
> changing one byte of the files to download), and it turns out that
> apt-get download DOES also verify the binary packages and exits with
> non-zero status if they don't match.
> Nice.
> 
> 
> So just to be sure, I did the same with the source package files.
> And here I noted some things:
> - It does check freshly downloaded files and exits with non-zero in
>   case their hashes mismatch.
> - But it does so as well with *already* downloaded files, and if they
>   don't match, it silently downloads (also verified) fresh files.
>   => First, I'm not sure whether this is the right behaviour, as the
>   "original/modified" file seems to get removed, but it - being a
>   local file - may actually be something of value to the user.
>   So maybe it should just move the file to foo.FAILED and error out
>   with non-zero exit status?
> 
> 
> Then I made some particular tries:
> a) On a previously (validly) downloaded source package, I modified a
>    byte in the local .dsc and a byte in the remote .orig.tar.xz.
>    apt again notices the valid local .orig.tar.xz and does nothing,
>    and notices the invalid local .dsc and re-downloads it (which
>    succeeds, as I haven't mangled the remote .dsc).
> 
>    In principle I'd say this is fine, and there's no direct security
>    issue ... and probably not even an indirect one.
>    What does however happen - due to the skipped download of the
>    already present+valid files - is that the remote corruption of the
>    .orig.tar.xz isn't noticed.
> 
>    I'd say not an issue, but I nevertheless wanted to give a heads-up.
> 
> 
> b) What may now be the “super minor security issue” is the following:
>    apt *does* check already downloaded files for validity and exits
>    with zero if they match, right?
> 
>    So conceptually one could have gone two ways:
>    - anything local is already trusted, because it was verified before,
>      or the user somehow manually brought it to the system and should
>      know what he's doing
>    - `apt-get source` acts also like a checker, and if the exit status
>      is zero one can assume that the files present are valid
> 
>    It seems to be the 2nd, given that it verifies the local files.
> 
>    It does however NOT verify an already unpacked tree again.
> 
>    So in some super obscure scenarios, a user could come to assume
>    that exit status zero means that all the stuff is verified, while
>    in fact only the non-unpacked files are.
> 
>    Again of course, for an attack, there would need to be some way to
>    introduce a modified unpacked tree, where one could say that if an
>    attacker can do that, it's anyway already too late.
> 
>    But simply from that conceptual PoV, it seems to me as if that
>    behaviour is unfortunate.
> 
>    I do however have no idea for a better behaviour.
>    Checking would anyway mean that we need to unpack it - therefore
>    wasting further resources.
>    And the tree may differ simply because of user modifications; what
>    then? Move the dir to xx.NON-PRISTINE?
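For reference, the byte-flip experiment described above can be sketched roughly as follows. This is a standalone illustration only (the file name and contents are made up); apt performs the equivalent comparison against the hashes recorded in the signed Packages/Sources indices rather than against a locally saved checksum:

```shell
#!/bin/sh
# Sketch: record a file's hash, flip one byte, verify the mismatch is caught.
set -e
tmp=$(mktemp -d)

# Stand-in for a downloaded archive file (hypothetical contents).
printf 'pretend this is a .deb payload' > "$tmp/pkg.deb"
expected=$(sha256sum "$tmp/pkg.deb" | cut -d' ' -f1)

# Flip the first byte in place, as in the "changing one byte" test.
printf 'X' | dd of="$tmp/pkg.deb" bs=1 count=1 conv=notrunc 2>/dev/null

actual=$(sha256sum "$tmp/pkg.deb" | cut -d' ' -f1)
if [ "$actual" != "$expected" ]; then
    echo "hash mismatch detected"
fi
rm -rf "$tmp"
```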
I think I'm fine just exiting 1 if the directory already exists, after
doing the download dance.
-- 
debian developer - deb.li/jak | jak-linux.org - free software dev
ubuntu core developer                              i speak de, en