Because I have a large number of boxen to support behind a firewall, I find it preferable to maintain my own yum repository inside it: I wget from the Internet master repo to the local repo, then run updates against the local repo, which saves bandwidth.

Now this all works fine: if wget finds an update out there that it doesn't have, it downloads it; if it already has the file (I assume HTTP date-changed logic is happening here), it moves on to the next one.
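
For what it's worth, my reading is that wget is doing an HTTP conditional GET with an If-Modified-Since header. A minimal sketch of that logic in Python, if that helps anyone picture it (the mirror URL is a made-up example, and I've left out wget's trick of setting the local mtime from the server's Last-Modified):

    import os
    import urllib.request, urllib.error
    from email.utils import formatdate

    url = "http://mirror.example.com/updates/this-file-1.2.10.i386.rpm"
    local = os.path.basename(url)

    req = urllib.request.Request(url)
    if os.path.exists(local):
        # Ask the server to send the file only if it is newer than our copy
        req.add_header("If-Modified-Since",
                       formatdate(os.path.getmtime(local), usegmt=True))
    try:
        with urllib.request.urlopen(req) as resp:
            with open(local, "wb") as out:
                out.write(resp.read())
        print("fetched", local)
    except urllib.error.HTTPError as e:
        if e.code != 304:           # 304 Not Modified: our copy is current
            raise
        print("already current", local)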

My problem is that my local repo is full of old updates that I would like to cull, but the naming conventions of the various files do not appear to be consistent, which makes automated culling difficult, e.g.:

this-file-1.2.9.i386.rpm is obviously, to the human eye, earlier than this-file-1.2.10.i386.rpm (which also has the later creation date), but sort-wise it orders the other way (yes, I know sort has the -n option, but the varying part of the name is not always numeric).

The added problem is that the version detail is not in a consistent position in the file name either, so awk or cut or anything else doesn't reliably pull out the right field.
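
To make concrete what I'm after, here is a rough sketch in Python of the cull I have in mind. It assumes names follow the name-version.arch.rpm pattern of the example above (real packages also carry a release field, and rpm's own version comparison has more rules than this), and it only prints what it would remove:

    import os, re
    from itertools import groupby

    def parse(fname):
        # Split from the right, since the package name itself may
        # contain hyphens but the version field should not.
        m = re.match(r'(.+)-([^-]+)\.([^.]+)\.rpm$', fname)
        return m.groups() if m else None        # (name, version, arch)

    def verkey(version):
        # Compare runs of digits numerically and everything else as text,
        # so 1.2.9 orders before 1.2.10; the (1,..)/(0,..) tags keep ints
        # and strings from being compared directly, with numeric segments
        # ranking as newer (roughly what rpm does).
        return [(1, int(s)) if s.isdigit() else (0, s)
                for s in re.findall(r'\d+|[^\d.]+', version)]

    rpms = [parse(f) for f in os.listdir('.') if f.endswith('.rpm')]
    rpms = sorted((r for r in rpms if r), key=lambda t: (t[0], t[2]))
    for (name, arch), grp in groupby(rpms, key=lambda t: (t[0], t[2])):
        versions = sorted(grp, key=lambda t: verkey(t[1]))
        for n, v, a in versions[:-1]:       # everything but the newest
            print("would cull %s-%s.%s.rpm" % (n, v, a))

Parsing from the right end is the point: the name can contain any number of hyphens, but the last hyphen-separated field before the arch should be the version. It still doesn't cope with release fields or epochs, though, which is why I'm asking.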

Has anyone resolved this problem on their own networks?

--
Howard.
LANNet Computing Associates - Your Linux people <http://lannet.com.au>
--
When you just want a system that works, you choose Linux;
When you want a system that works, just, you choose Microsoft.
--
Flatter government, not fatter government;
Get rid of the Australian states.

--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
