Control: severity -1 wishlist
Control: tags -1 + moreinfo
Hi Tim,

2006-05-28 15:00 Tim Connors:
> Package: aptitude
> Version: 0.4.1-1
> Severity: important
>
> I read that apt-get is to be deprecated in favour of aptitude, but
> unfortunately, if that is to happen, aptitude *must* be made much more
> memory lean. Debian is meant to be able to be installed on a small
> machine with 32MB of RAM, yet just performing an aptitude upgrade on a
> machine with 256MB of RAM was taking an inordinate amount of time just
> to read the database. The reason is that it wants to use:
>
>  7800 root  18  0  184m 103m  97m D  3.3 41.3  0:26.73 aptitude
>
> in particular during the "Building tag database" phase (although it was
> hogging memory what I would call "excessively" before then too). That is
> seriously unusable. If I want to add a single package to the system
> using aptitude, I can expect to wait half an hour for it to install on a
> machine that is only a few years old -- 256MB is *not* a small amount of
> RAM.
>
> Compare that to the same phase (asking me whether I want to continue) in
> apt-get:
>
>  9239 root  17  0  171m  17m  15m S  0.0  6.8  0:02.91 apt-get
>
> Same virtual RAM, but it's not actually doing anything with what's
> mapped, so it doesn't get read in. Much more acceptable, no?
>
> The number of packages I have installed (it is a desktop system) appears
> to be 1580, and /var/lib/dpkg/info disk usage is 47MB (running
> unstable). Anything else of relevance?
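(As a side note on reproducing these figures: the VIRT/RES/SHR columns quoted above come from top. A rough equivalent, sketched here assuming a Linux system with procps, is to ask ps for the virtual and resident sizes of the process; the example inspects the shell itself, and aptitude's PID would be substituted to reproduce the measurements.)

```shell
# Sketch: one-off reading of virtual size (vsz) and resident set size
# (rss) for a given PID, in KiB on Linux -- roughly top's VIRT and RES.
# $$ is this shell's own PID; substitute aptitude's PID instead.
ps -o vsz=,rss= -p $$
```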
(replying to this and to other messages in the bug along the same lines)

Today, with stable + unstable + experimental *and* multi-arch enabled, starting aptitude in curses mode uses a similar amount of memory to the one reported 9 years ago, 182 MB. In command line mode, simple commands use similar amounts at peak. ~10 processes, including Xorg, a few chromium processes, konqueror and iceweasel (the latter with more than 1.5 GB by itself), outrank aptitude as memory hogs by far.

In principle, most of that memory use (especially the peak while reading and parsing the data) is something that the apt library needs to do anyway, so we have limited influence on it, and using lots of repositories at the same time doesn't help. There have been optimisations over the years, like moving the descriptions to other files (for the Debian repositories at least), but there is a limit to what can be done when the stable release alone already has ~40k or more binary packages, and aptitude cannot do much if many repositories are added -- even if some bits are the same across versions, long fields like dependencies, checksums or descriptions are almost guaranteed to be different.

Additionally, aptitude in curses mode needs to have all of this information in memory, for example to use in the default view; the alternative would be to read package information multiple times and free it as soon as the user is not looking at some bits -- which may be a tradeoff worth considering, but on old machines it is not going to make users very happy either (the machines reported were already not-the-latest by the time the bug was filed).

It is curious that all of these reports happened within a few months of each other in 2006, with no "secondings" since. Maybe there was a spike at the time, coupled with the fact that machines with very little RAM were still in use by then, which made this especially noticeable.
Nowadays, desktop machines with browsers and other applications, servers with JVMs and the like, and even small development boards (trailing behind a bit, but costing very small amounts of money) ship with much more than 256 MB, and less than that is not useful for many purposes (other than extremely minimalistic servers or router-like devices).

Improving the performance itself has quite a lot of value, for example getting start-up times down to fractions of a second. But I think that things like not using debtags in command line mode will mostly benefit start-up times, not memory usage.

So, taking all of this into account, it doesn't seem to me that the bug can still be classified as "important", and given some of the bigger problems that aptitude accumulated during years of decay, this looks neither high-priority nor trivial enough to merit spending the time needed to address it in the near future (at which point memory usage will probably be even less relevant). So I am leaving this open for now for further consideration, but I don't think that this is going to be addressed soon.

Cheers.

--
Manuel A. Fernandez Montecelo <manuel.montez...@gmail.com>