On Thu, 26 Jan 2012 20:54:03 +0100, [email protected] (Michał Masłowski) wrote:

> > We could implement both options: Checking while building AND checking
> > while updating repos.
>
> How will we check while building? I don't see an effective way to do
> this without knowing which libraries were in the previous version of
> the package.
>
> > What changes would be needed to implement the algorithm on Issue224?
>
> It's not clear from the algorithm's description how to implement it,
> I'll think about it after the weekend.
>
> I could try implementing a script to check a single package, using a
> database shared between multiple runs. Would using Python and SQLite
> be a problem for this?
>
> Checking while updating would be done by running the script on each new
> upload and sending its output to a list. I don't know exactly which
> scripts do this, maybe db-update or something used in it.
>
> Checking while building could use the script with maybe a different
> database.
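(A minimal sketch of what that per-package check could look like, assuming a single SQLite table shared between runs and comparing a new upload against the libraries recorded for the previous version of the same package; the table, column and file names below are made up for illustration, not an agreed interface:)

#!/usr/bin/env python
# Rough sketch only: compare the shared libraries shipped by a new package
# upload against those recorded for the previous version of the same
# package, using an SQLite database shared between runs.
import os
import sqlite3
import subprocess
import sys

def shared_libs(pkg_path):
    """Return the shared library file names shipped in the package."""
    listing = subprocess.check_output(["bsdtar", "tf", pkg_path]).decode()
    return set(os.path.basename(f) for f in listing.splitlines()
               if ".so" in os.path.basename(f))

def check_package(db_path, pkgname, pkg_path):
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS provided (pkgname TEXT, soname TEXT)")
    new = shared_libs(pkg_path)
    old = set(row[0] for row in db.execute(
        "SELECT soname FROM provided WHERE pkgname = ?", (pkgname,)))
    # Libraries the previous version shipped but the new one dropped:
    # anything still linked against them is now broken.
    for soname in sorted(old - new):
        print("%s no longer provides %s" % (pkgname, soname))
    # Remember the new state for the next run.
    db.execute("DELETE FROM provided WHERE pkgname = ?", (pkgname,))
    db.executemany("INSERT INTO provided VALUES (?, ?)",
                   ((pkgname, s) for s in sorted(new)))
    db.commit()
    db.close()

if __name__ == "__main__":
    # e.g.: check.py libs.sqlite mplayer mplayer-1.0-1-x86_64.pkg.tar.xz
    check_package(sys.argv[1], sys.argv[2], sys.argv[3])

Running something like that from whatever handles new uploads would answer the "which libraries were in the previous version" question, since the database remembers them between runs.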
Obtaining file lists from packages is trivial: just bsdtar tf all
packages... and we should do this anyway to have the $repo.files.tar.gz
databases. But IIRC those aren't easily parsable (too many files).

You can also temporarily extract the files under binary directories and
make a database of linked libraries, then compare. When something's
missing, a binary just broke.

Example: mplayer-libre links to x264:

$ ldd mplayer
        /usr/lib/libx264.so.8
        ^ store that

Look for /usr/lib/libx264.so.8 in the file database. If it's missing,
mplayer is broken.

This can be done in a relational database fashion to get immediate
results instead of checking each binary individually.

There's an Arch script that checks the health of the repositories; maybe
it does this already? It's in dbscripts.git, and if you're subscribed to
arch-dev-public you'll see the reports.
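(A sketch of that relational approach, assuming two SQLite tables, one filled from bsdtar tf over every package and one from ldd over the temporarily extracted binaries; the table names and query are illustrative and not what dbscripts actually does:)

#!/usr/bin/env python
# Sketch of the relational cross-check described above: "files" holds every
# path shipped by any package (from "bsdtar tf"), "needed" holds every
# library an extracted binary links against (from "ldd").  One query then
# lists needed libraries that no package provides.
import sqlite3
import subprocess

def ensure_schema(db):
    db.execute("CREATE TABLE IF NOT EXISTS files (pkg TEXT, path TEXT)")
    db.execute("CREATE TABLE IF NOT EXISTS needed "
               "(pkg TEXT, binary TEXT, lib TEXT)")

def add_package_files(db, pkg_path):
    # bsdtar lists paths without a leading slash ("usr/lib/libx264.so.8"),
    # while ldd reports absolute paths, so normalize here.
    listing = subprocess.check_output(["bsdtar", "tf", pkg_path]).decode()
    db.executemany("INSERT INTO files VALUES (?, ?)",
                   ((pkg_path, "/" + line) for line in listing.splitlines()))

def add_needed_libs(db, pkg_path, binary_path):
    # Record the libraries one temporarily extracted binary links against.
    out = subprocess.check_output(["ldd", binary_path]).decode()
    for line in out.splitlines():
        # Typical line: "libx264.so.8 => /usr/lib/libx264.so.8 (0x...)"
        if "=>" not in line:
            continue
        fields = line.split("=>", 1)[1].split()
        if fields and fields[0].startswith("/"):
            db.execute("INSERT INTO needed VALUES (?, ?, ?)",
                       (pkg_path, binary_path, fields[0]))

def missing_libraries(db):
    # Needed libraries that no package in the files table provides.
    return db.execute("""SELECT DISTINCT needed.pkg, needed.lib
                         FROM needed LEFT JOIN files
                           ON files.path = needed.lib
                         WHERE files.path IS NULL""").fetchall()

Extracting the binaries into a scratch directory to feed ldd (and cleaning it up afterwards) is left out, and a real version would have to cope with ldd failing on binaries built for other architectures.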
