Re: [linux] question about automating download/install of google's "repo" tool
On Mon, 30 Mar 2020, Rick Leir wrote:

> Robert
> But wait, your vim .. it's out of date. And your compiler .. it's
> out of date.
>
> How frequently does repo get updated? More than vim no doubt. But
> your dnf update happens frequently, or is it apt-get. And it checks
> for compatibility with other packages. And you are less likely to
> stumble at the bleeding edge (remember openssl and heartbleed? The
> older version was safe.)
>
> But it is more fun to use the latest of everything, I get you there!
> Excuse me for arguing here. Thanks for mentioning repo, I should be
> using it.

just to be clear about repo, the issue here is that the "repo" command actually serves two purposes.

first, if you're using it to *initialize* a new repository, it acts as the "launcher tool", whose job is to set up the new repo checkout, then *fetch* the current version of the "repo" command to be used in all subsequent operations.

and second, once that repo is initialized, it's that newer version (now embedded in the repository) that will be used.

the point here is that the launcher tool is really a simple piece of software; it doesn't need to do much more than fetch the newer, more stable version. so downloading and version controlling the launcher tool is kind of way overkill, that's all.

i think i'm going to write a wiki page on this, as others i've chatted with don't really understand how it works.

rday

To unsubscribe send a blank message to linux+unsubscr...@linux-ottawa.org
To get help send a blank message to linux+h...@linux-ottawa.org
To visit the archives: https://lists.linux-ottawa.org
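(the launcher's "which role am i playing" decision above can be sketched roughly as follows; this is illustrative only, with a made-up `find_embedded_repo` helper; the real launcher is a python script whose logic is more involved:)

```shell
# sketch: walk up from the current directory looking for an embedded
# .repo/repo, the way the launcher decides between its two roles.
# (illustrative shell, not the launcher's actual code)

find_embedded_repo() {
  dir="$(pwd)"
  while [ "$dir" != "/" ]; do
    if [ -d "$dir/.repo/repo" ]; then
      # existing checkout: this embedded copy does the real work
      printf '%s\n' "$dir/.repo/repo"
      return 0
    fi
    dir="$(dirname "$dir")"
  done
  return 1
}

if embedded="$(find_embedded_repo)"; then
  echo "existing checkout: hand off to $embedded"
else
  echo "no checkout here: launcher would bootstrap one (repo init)"
fi
```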
Re: [linux] question about automating download/install of google's "repo" tool
Robert

But wait, your vim .. it's out of date. And your compiler .. it's out of date.

How frequently does repo get updated? More than vim no doubt. But your dnf update happens frequently, or is it apt-get. And it checks for compatibility with other packages. And you are less likely to stumble at the bleeding edge (remember openssl and heartbleed? The older version was safe.)

But it is more fun to use the latest of everything, I get you there! Excuse me for arguing here. Thanks for mentioning repo, I should be using it.

Rick

On March 30, 2020 1:59:44 PM EDT, "Robert P. J. Day" wrote:
>
>  currently having a discussion with a couple of colleagues regarding
> automating the use of google's repo tool, and part of that involves
> the proposal of downloading and storing (or version controlling)
> locally the repo launcher tool itself.
>
>  as people familiar with repo will know, repo effectively comes as
> two distinct tools (even if bundled in the same script):
>
>  1) the "launcher tool", which is run when you initialize a new
>     repo checkout with "repo init", and
>
>  2) the "main" repo tool, which is installed in every initialized
>     repo, and which is the one called to do actual repo operations
>     on an existing repo
>
> as explained here:
>
>   https://gerrit.googlesource.com/git-repo
>
> to prepare to use repo, the easiest strategy is to copy the launcher
> tool to, say, your personal bin directory (and make sure it's in your
> search path, of course):
>
> $ mkdir -p ~/.bin
> $ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
> $ chmod a+rx ~/.bin/repo
>
>  IOW, every time you run "repo", it will run your personal launcher
> tool, which will either:
>
>  1) create a new repo checkout, and install the main repo command in
>     that checkout, or
>
>  2) recognize you already have a checkout, and invoke the main
>     "repo" command that is there
>
>  now, in aid of automating as much of this as possible, one of the
> proposals is to download the launcher tool ahead of time, store it
> locally (possibly version controlled), whereupon the automation
> script will, for new checkouts, check out that stored version and
> take it from there.
>
>  personally, i prefer the simpler approach of just running these
> commands every single time at the top of the automation script to
> make sure every developer has their own copy of the launcher tool:
>
> $ mkdir -p ~/.bin
> $ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
> $ chmod a+rx ~/.bin/repo
>
> my attitude is, so what if it downloads a 1000-line script each time?
> doing it this way guarantees that one will always have the latest
> version from google. i think trying to avoid this by cleverly storing
> a local copy is way overkill, and introduces the possibility of a
> local copy getting out of date.
>
>  anyway, thoughts?
>
> rday

--
Sorry for being brief. Alternate email is rickleir at yahoo dot com
[linux] question about automating download/install of google's "repo" tool
currently having a discussion with a couple of colleagues regarding automating the use of google's repo tool, and part of that involves the proposal of downloading and storing (or version controlling) locally the repo launcher tool itself.

as people familiar with repo will know, repo effectively comes as two distinct tools (even if bundled in the same script):

 1) the "launcher tool", which is run when you initialize a new repo
    checkout with "repo init", and

 2) the "main" repo tool, which is installed in every initialized
    repo, and which is the one called to do actual repo operations
    on an existing repo

as explained here:

  https://gerrit.googlesource.com/git-repo

to prepare to use repo, the easiest strategy is to copy the launcher tool to, say, your personal bin directory (and make sure it's in your search path, of course):

$ mkdir -p ~/.bin
$ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
$ chmod a+rx ~/.bin/repo

IOW, every time you run "repo", it will run your personal launcher tool, which will either:

 1) create a new repo checkout, and install the main repo command in
    that checkout, or

 2) recognize you already have a checkout, and invoke the main "repo"
    command that is there

now, in aid of automating as much of this as possible, one of the proposals is to download the launcher tool ahead of time, store it locally (possibly version controlled), whereupon the automation script will, for new checkouts, check out that stored version and take it from there.

personally, i prefer the simpler approach of just running these commands every single time at the top of the automation script to make sure every developer has their own copy of the launcher tool:

$ mkdir -p ~/.bin
$ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
$ chmod a+rx ~/.bin/repo

my attitude is, so what if it downloads a 1000-line script each time? doing it this way guarantees that one will always have the latest version from google. i think trying to avoid this by cleverly storing a local copy is way overkill, and introduces the possibility of a local copy getting out of date.

anyway, thoughts?

rday
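(fwiw, the "just download it every time" preamble described above could look something like this; a sketch only, where the `install_launcher` function name and the `FETCH_CMD` indirection are my own inventions, not anything repo provides; the URL is the launcher download URL from the commands above:)

```shell
# sketch of an automation-script preamble: always (re)fetch the launcher
# so every developer runs the current version. the fetch command is kept
# in a variable so it can be swapped out; by default it is the curl
# command from the message above.

FETCH_CMD="${FETCH_CMD:-curl -fsSL https://storage.googleapis.com/git-repo-downloads/repo -o}"

install_launcher() {
  bindir="$1"
  mkdir -p "$bindir"            # -p: don't fail if the directory exists
  $FETCH_CMD "$bindir/repo"     # unquoted on purpose: word-split into cmd + args
  chmod a+rx "$bindir/repo"
  case ":$PATH:" in             # prepend to PATH only if not already there
    *":$bindir:"*) ;;
    *) PATH="$bindir:$PATH" ;;
  esac
}

# usage (needs network access):
#   install_launcher "$HOME/.bin"
```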