On Mon, 2020-06-22 at 19:03 +0200, Miguel Ojeda wrote:
> On Mon, Jun 22, 2020 at 3:06 PM Jonathan Corbet <cor...@lwn.net> wrote:
> > As has been noted elsewhere, checkpatch.pl seems like the appropriate
> > place to make this check.  As for "the entire tree"...if this job gets
> > completed, "git grep" should be a fine way to do that.
> 
> `checkpatch` is not really enforced in many subsystems, no? Further,
> some existing and future HTTP links may support HTTPS later on.
> 
> As for `git grep`, agreed if we reach near 100%. Otherwise, no. In the
> general case, including the code for a task that has some likelihood
> of needing repetition is a safe bet, which is why I suggested it. The
> same script could be also used to check for broken links and related
> maintenance.

scripts/get_maintainer.pl --self-test=links has a reachability test
using wget.

Perhaps a script like that could be used to check which http:// links
are also reachable via https://

        ## Link reachability
        } elsif (($type eq "W" || $type eq "Q" || $type eq "B") &&
                 $value =~ /^https?:/ &&
                 ($self_test eq "" || $self_test =~ /\blinks\b/)) {
            next if (grep(m@^\Q$value\E$@, @good_links));
            my $isbad = 0;
            if (grep(m@^\Q$value\E$@, @bad_links)) {
                $isbad = 1;
            } else {
                my $output = `wget --spider -q --no-check-certificate --timeout 10 --tries 1 $value`;
                if ($? == 0) {
                    push(@good_links, $value);
                } else {
                    push(@bad_links, $value);
                    $isbad = 1;
                }
            }
            if ($isbad) {
                print("$x->{file}:$x->{linenr}: warning: possible bad link\t$x->{line}\n");
            }
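As a rough sketch of the http:// vs https:// idea (not a drop-in
patch; the to_https/check_https names are made up here), one could
derive the https:// form of each URL and probe it with the same wget
flags the --self-test=links code above uses:

```shell
#!/bin/sh
# Sketch: given an http:// URL, derive the https:// equivalent and
# probe it.  Flags mirror the get_maintainer.pl wget invocation above.

to_https() {
	# Rewrite only the scheme; host and path are left untouched.
	printf 'https://%s\n' "${1#http://}"
}

check_https() {
	https_url=$(to_https "$1")
	# Same spider options as the link-reachability self-test.
	if wget --spider -q --no-check-certificate --timeout 10 --tries 1 "$https_url"; then
		echo "$1 -> $https_url (https reachable)"
	else
		echo "$1 (https not reachable)"
	fi
}
```

Wiring that into the W:/Q:/B: walk above (or into a git grep over the
tree) would then let the same script both flag convertible links and
catch dead ones.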

