[ https://issues.apache.org/jira/browse/SPARK-26554?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-26554.
-----------------------------------
    Resolution: Fixed
      Assignee: Dongjoon Hyun
Fix Version/s: 3.0.0
               2.4.1

This is resolved via https://github.com/apache/spark/pull/23476

> Update `release-util.sh` to avoid GitBox fake 200 headers
> ---------------------------------------------------------
>
>                 Key: SPARK-26554
>                 URL: https://issues.apache.org/jira/browse/SPARK-26554
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 2.4.1, 3.0.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Blocker
>             Fix For: 2.4.1, 3.0.0
>
>
> Unlike the previous Apache Git repository, the new GitBox returns a fake HTTP `200 OK` header instead of a `404 Not Found` header, which breaks the release scripts. This issue aims to fix them to check the response content instead of the fake HTTP headers. This is a release blocker.
> {code}
> $ curl -s --head --fail "https://gitbox.apache.org/repos/asf?p=spark.git;a=commit;h=v3.0.0"
> HTTP/1.1 200 OK
> Date: Sun, 06 Jan 2019 22:42:39 GMT
> Server: Apache/2.4.18 (Ubuntu)
> Vary: Accept-Encoding
> Access-Control-Allow-Origin: *
> Access-Control-Allow-Methods: POST, GET, OPTIONS
> Access-Control-Allow-Headers: X-PINGOTHER
> Access-Control-Max-Age: 1728000
> Content-Type: text/html; charset=utf-8
> {code}
> *BEFORE*
> {code}
> $ ./do-release-docker.sh -d /tmp/test -n
> Branch [branch-2.4]:
> Current branch version is 2.4.1-SNAPSHOT.
> Release [2.4.1]:
> RC # [1]:
> v2.4.1-rc1 already exists. Continue anyway [y/n]?
> {code}
> *AFTER*
> {code}
> $ ./do-release-docker.sh -d /tmp/test -n
> Branch [branch-2.4]:
> Current branch version is 2.4.1-SNAPSHOT.
> Release [2.4.1]:
> RC # [1]:
> This is a dry run. Please confirm the ref that will be built for testing.
> Ref [v2.4.1-rc1]:
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)