In addition, FYI, I was the most recent release manager, for Apache Spark 3.4.3 (2024-04-15 vote).
According to my work log, I uploaded the following binaries to SVN from EC2 (us-west-2) without any issues.

-rw-r--r--. 1 centos centos 311384003 Apr 15 01:29 pyspark-3.4.3.tar.gz
-rw-r--r--. 1 centos centos 397870995 Apr 15 00:44 spark-3.4.3-bin-hadoop3-scala2.13.tgz
-rw-r--r--. 1 centos centos 388930980 Apr 15 01:29 spark-3.4.3-bin-hadoop3.tgz
-rw-r--r--. 1 centos centos 300786123 Apr 15 01:04 spark-3.4.3-bin-without-hadoop.tgz
-rw-r--r--. 1 centos centos  32219044 Apr 15 00:23 spark-3.4.3.tgz
-rw-r--r--. 1 centos centos    356749 Apr 15 01:29 SparkR_3.4.3.tar.gz

Since Apache Spark 4.0.0-preview doesn't have a Scala 2.12 combination, the total size should be smaller than the 3.4.3 binaries. Given that, if there was any INFRA change, it must have happened after 4/15.

Dongjoon.

On Thu, May 9, 2024 at 7:57 AM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:

> Could you file an INFRA JIRA issue with the error message and context
> first, Wenchen?
>
> As you know, if we see something, we had better file a JIRA issue, because
> it could be not only an Apache Spark project issue but also an issue for
> all ASF projects.
>
> Dongjoon.
>
>
> On Thu, May 9, 2024 at 12:28 AM Wenchen Fan <cloud0...@gmail.com> wrote:
>
>> UPDATE:
>>
>> After resolving a few issues in the release scripts, I can finally build
>> the release packages. However, I can't upload them to the staging SVN repo
>> due to a transmission error, and it seems to be a limitation on the server
>> side. I tried on both my local laptop and a remote AWS instance, but
>> neither works. These package binaries are around 300-400 MB each, and we
>> just did a release last month. I'm not sure if this is a new limitation
>> due to cost saving.
>>
>> While I'm looking for help to get unblocked, I'm wondering if we can
>> upload release packages to a public git repo instead, under the Apache
>> account?
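For context, the staging upload being discussed above is normally just an SVN working-copy commit against the dist.apache.org dev area. A minimal sketch of that step follows; the repository URL, directory layout, and RC name are assumptions for illustration (the real release scripts handle this), and the commands are echoed rather than executed since a real run needs ASF committer credentials:

```shell
#!/bin/sh
# Sketch of the release-artifact upload step to the SVN staging area.
# Dry run: each command is echoed via run(), not executed.
SVN_DEV_URL="https://dist.apache.org/repos/dist/dev/spark"  # assumed staging URL
RC_DIR="v3.4.3-rc1-bin"                                     # assumed RC directory name

# Echo the command instead of running it; swap 'echo' for '"$@"' to execute.
run() { echo "+ $*"; }

run svn checkout --depth=empty "$SVN_DEV_URL" svn-spark
run mkdir -p "svn-spark/$RC_DIR"
run cp spark-3.4.3-bin-hadoop3.tgz "svn-spark/$RC_DIR/"
run svn add "svn-spark/$RC_DIR"
run svn commit -m "Add $RC_DIR artifacts" svn-spark
```

A transfer failure at the `svn commit` step with artifacts of this size (300-400 MB) would point at a server-side limit rather than the client, which is why an INFRA JIRA is the right next step.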