As Andrew did, I also CC'd.. :)
On 12/25/23, 이 강우 <cools...@hotmail.com> wrote:
> how to clone apt repository to newest only?
> Fedora/Red Hat will organize the repository by copying only the most recent
> packages from that distribution if you give it the "reposync --newest-only"
> option, but Debian doesn't seem to be able to do that.
>
> What can I do?

Hi.. This is Draft Email #2 for me for this thread. The first email was very long, so I chopped off all of the tips and am only focusing on the following questions for now.

Am starting this time with an apt query:

$ apt-cache search reposync

Got a potential hit! The package is called dnf-plugins-core, and it looks interesting (to me). Its description is:

Description-en: Core plugins for DNF, the Dandified Yum package manager
 This package enhances DNF with builddep, config-manager, copr, debug,
 debuginfo-install, download, needs-restarting, groups-manager, repoclosure,
 repograph, repomanage, reposync, changelog and repodiff commands.

It's the only package that references reposync. I'll be downloading and poking at it as a personal Debian development learning adventure, comparing it to wget and rsync as referenced further below.

If dnf-plugins-core does not work for some reason, here are some questions that might help Debian Users help you...

What are you actually trying to do? That might also be asked as: what were you doing in the past? What exact command(s) were you using?

Internet searching on "reposync" alone makes it look like you're trying to do what I have found wget does. It worked today for an LFS (Linux From Scratch) short webpage of only links. Wget also worked on a Debian repository related webpage that included child directories.

Running "man rsync" references "URL" a few times, too, but I've not been successful with it in the past. This thread is a reminder of that feature, so I'll be playing with it again later. It's always good to know more than one way to accomplish Linux tasks.
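For what it's worth, here's a hedged sketch of the sort of wget invocation I mean. The --reject pattern is my own assumption, taken from wget's manual, to skip the auto-generated index pages; the URL is just the LFS download page I tested. It prints the command rather than fetching, so you can review the flags before running it for real:

```shell
#!/bin/sh
# A sketch, not a recommendation: print the recursive-fetch command so the
# flags can be reviewed first. --no-parent keeps wget from climbing above
# this directory; --reject "index.html*" (my assumption) drops the
# auto-generated directory listings.
url="https://www.linuxfromscratch.org/lfs/downloads/development/"
echo wget -c --recursive --no-parent --reject "index.html*" "$url"
```

Drop the leading echo once the printed command looks right to you.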
:)

My other questions that will help Debian Users help you are...

Which Debian directory are you asking about? Or is it even tied to debian(dot)org? If you're [pinging] a webpage that is not Debian and it's not too personal, what webpage are you trying to sync?

A #1 question I have is... Where are the files you are trying to duplicate (the source files), and where are you duplicating them to (the intended target directory/destination)?

Another way to ask that is: Are you duplicating from one personal computer to another, or are you trying to pull from an online Internet server's repository down onto a personal computer? Or are you maybe even trying to do something else that is not mentioned above? Your answer(s) might make a difference in the command and flags you could use.

As an example, that massively long Draft Email #1 I wrote earlier included this useful tip I just learned for my own usage today:

$ wget -c --recursive --no-parent https://www.linuxfromscratch.org/lfs/downloads/development/

That lead came from StackOverflow:

https://stackoverflow.com/questions/273743/using-wget-to-recursively-fetch-a-directory-with-arbitrary-files-in-it

Just test drove it, and it did work as hoped. That "--no-parent" tells wget to focus only on the current directory (e.g., for me, the LFS "development" download webpage) along with any child directories found there. Be aware that some extra junk can still come in, depending on what webpage is being tapped. The more text content and less HTML, the better.

Wget does work as expected and does keep digging into child directories, too; I just tested a Debian repository related webpage under /debian/dists.

That's all I have for now. Just let us know...

An aside to the wget and rsync developers: Thank you for your work! Between your two packages, it's a multi-times-daily thing going on between us.

Cindy :)

--
Talking Rock, Pickens County, Georgia, USA
* runs with a jingle-jingle *