This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 6a5649410d83 [SPARK-47098][INFRA] Migrate from AppVeyor to GitHub Actions for SparkR tests on Windows
6a5649410d83 is described below

commit 6a5649410d83610777bd3d67c7a6f567215118ae
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Tue Feb 20 08:02:30 2024 -0800

    [SPARK-47098][INFRA] Migrate from AppVeyor to GitHub Actions for SparkR tests on Windows
    
    ### What changes were proposed in this pull request?
    
    This PR proposes to migrate from AppVeyor to GitHub Actions for SparkR tests on Windows.
    
    ### Why are the changes needed?
    
    Reduce the number of tools we use, for easier maintenance.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, dev-only.
    
    ### How was this patch tested?
    
    - [x] Tested in my fork
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #45175 from HyukjinKwon/SPARK-47098.
    
    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 .github/labeler.yml                       |   1 -
 .github/workflows/build_sparkr_window.yml |  81 +++++++++++++
 README.md                                 |   1 -
 appveyor.yml                              |  75 ------------
 dev/appveyor-guide.md                     | 186 ------------------------------
 dev/appveyor-install-dependencies.ps1     | 153 ------------------------
 dev/sparktestsupport/utils.py             |   7 +-
 project/build.properties                  |   1 -
 8 files changed, 83 insertions(+), 422 deletions(-)

diff --git a/.github/labeler.yml b/.github/labeler.yml
index 20b5c936941c..7d24390f2968 100644
--- a/.github/labeler.yml
+++ b/.github/labeler.yml
@@ -21,7 +21,6 @@ INFRA:
   - changed-files:
     - any-glob-to-any-file: [
      '.github/**/*',
-     'appveyor.yml',
      'tools/**/*',
      'dev/create-release/**/*',
      '.asf.yaml',
diff --git a/.github/workflows/build_sparkr_window.yml b/.github/workflows/build_sparkr_window.yml
new file mode 100644
index 000000000000..07f4ebe91ad2
--- /dev/null
+++ b/.github/workflows/build_sparkr_window.yml
@@ -0,0 +1,81 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+name: "Build / SparkR-only (master, 4.3.2, windows-2019)"
+
+on:
+  schedule:
+    - cron: '0 17 * * *'
+
+jobs:
+  build:
+    name: "Build module: sparkr"
+    runs-on: windows-2019
+    timeout-minutes: 300
+    steps:
+    - name: Download winutils Hadoop binary
+      uses: actions/checkout@v4
+      with:
+        repository: cdarlint/winutils
+    - name: Move Hadoop winutil into home directory
+      run: |
+        Move-Item -Path hadoop-3.3.5 -Destination ~\
+    - name: Checkout Spark repository
+      uses: actions/checkout@v4
+    - name: Cache Maven local repository
+      uses: actions/cache@v4
+      with:
+        path: ~/.m2/repository
+        key: build-sparkr-maven-${{ hashFiles('**/pom.xml') }}
+        restore-keys: |
+          build-sparkr-windows-maven-
+    - name: Install Java 17
+      uses: actions/setup-java@v4
+      with:
+        distribution: zulu
+        java-version: 17
+    - name: Install R 4.3.2
+      uses: r-lib/actions/setup-r@v2
+      with:
+        r-version: 4.3.2
+    - name: Install R dependencies
+      run: |
+        Rscript -e "install.packages(c('knitr', 'rmarkdown', 'testthat', 
'e1071', 'survival', 'arrow', 'xml2'), repos='https://cloud.r-project.org/')"
+        Rscript -e "pkg_list <- as.data.frame(installed.packages()[,c(1, 
3:4)]); pkg_list[is.na(pkg_list$Priority), 1:2, drop = FALSE]"
+      shell: cmd
+    - name: Build Spark
+      run: |
+        rem 1. '-Djna.nosys=true' is required to avoid kernel32.dll load failure.
+        rem   See SPARK-28759.
+        rem 2. Ideally we should check the tests related to Hive in SparkR as well (SPARK-31745).
+        rem 3. setup-java installs Maven 3.8.7 but does not allow changing its version, so overwrite
+        rem   Maven version as a workaround.
+        mvn -DskipTests -Psparkr -Djna.nosys=true package -Dmaven.version=3.8.7
+      shell: cmd
+    - name: Run SparkR tests
+      run: |
+        set HADOOP_HOME=%USERPROFILE%\hadoop-3.3.5
+        set PATH=%HADOOP_HOME%\bin;%PATH%
+        .\bin\spark-submit2.cmd --driver-java-options "-Dlog4j.configuration=file:///%CD:\=/%/R/log4j2.properties" --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
+      shell: cmd
+      env:
+        NOT_CRAN: true
+        # See SPARK-27848. Currently installing some dependent packages causes
+        # "(converted from warning) unable to identify current timezone 'C':" 
for an unknown reason.
+        # This environment variable works around to test SparkR against a 
higher version.
+        R_REMOTES_NO_ERRORS_FROM_WARNINGS: true
diff --git a/README.md b/README.md
index ec29a1c49b41..b9a20075f6a1 100644
--- a/README.md
+++ b/README.md
@@ -10,7 +10,6 @@ and Structured Streaming for stream processing.
 <https://spark.apache.org/>
 
 [![GitHub Actions Build](https://github.com/apache/spark/actions/workflows/build_main.yml/badge.svg)](https://github.com/apache/spark/actions/workflows/build_main.yml)
-[![AppVeyor Build](https://img.shields.io/appveyor/ci/ApacheSoftwareFoundation/spark/master.svg?style=plastic&logo=appveyor)](https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark)
 [![PySpark Coverage](https://codecov.io/gh/apache/spark/branch/master/graph/badge.svg)](https://codecov.io/gh/apache/spark)
 [![PyPI Downloads](https://static.pepy.tech/personalized-badge/pyspark?period=month&units=international_system&left_color=black&right_color=orange&left_text=PyPI%20downloads)](https://pypi.org/project/pyspark/)
 
diff --git a/appveyor.yml b/appveyor.yml
deleted file mode 100644
index 762e4cf55b9f..000000000000
--- a/appveyor.yml
+++ /dev/null
@@ -1,75 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-version: "{build}-{branch}"
-
-shallow_clone: true
-
-platform: x64
-configuration: Debug
-
-branches:
-  only:
-    - master
-
-only_commits:
-  files:
-    - appveyor.yml
-    - dev/appveyor-install-dependencies.ps1
-    - build/spark-build-info.ps1
-    - R/
-    - sql/core/src/main/scala/org/apache/spark/sql/api/r/
-    - core/src/main/scala/org/apache/spark/api/r/
-    - mllib/src/main/scala/org/apache/spark/ml/r/
-    - core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
-    - bin/*.cmd
-
-cache:
-  - C:\Users\appveyor\.m2
-
-install:
-  # Install SBT and dependencies
-  - ps: .\dev\appveyor-install-dependencies.ps1
-  # Required package for R unit tests. xml2 is required to use jUnit reporter in testthat.
-  - cmd: Rscript -e "install.packages(c('knitr', 'rmarkdown', 'testthat', 'e1071', 'survival', 'arrow', 'xml2'), repos='https://cloud.r-project.org/')"
-  - cmd: Rscript -e "pkg_list <- as.data.frame(installed.packages()[,c(1, 3:4)]); pkg_list[is.na(pkg_list$Priority), 1:2, drop = FALSE]"
-
-build_script:
-  # '-Djna.nosys=true' is required to avoid kernel32.dll load failure.
-  # See SPARK-28759.
-  # Ideally we should check the tests related to Hive in SparkR as well (SPARK-31745).
-  - cmd: set SBT_MAVEN_PROFILES=-Psparkr
-  - cmd: set SBT_OPTS=-Djna.nosys=true -Dfile.encoding=UTF-8 -XX:ReservedCodeCacheSize=128m
-  - cmd: set JAVA_OPTS=-Xms4096m -Xmx4096m
-  - cmd: sbt package
-  - cmd: set SBT_MAVEN_PROFILES=
-  - cmd: set SBT_OPTS=
-  - cmd: set JAVA_OPTS=
-
-environment:
-  NOT_CRAN: true
-  # See SPARK-27848. Currently installing some dependent packages causes
-  # "(converted from warning) unable to identify current timezone 'C':" for an 
unknown reason.
-  # This environment variable works around to test SparkR against a higher 
version.
-  R_REMOTES_NO_ERRORS_FROM_WARNINGS: true
-
-test_script:
-  - cmd: .\bin\spark-submit2.cmd --driver-java-options "-Dlog4j.configuration=file:///%CD:\=/%/R/log4j2.properties" --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
-
-notifications:
-  - provider: Email
-    on_build_success: false
-    on_build_failure: false
-    on_build_status_changed: false
diff --git a/dev/appveyor-guide.md b/dev/appveyor-guide.md
deleted file mode 100644
index c68b5de9e61d..000000000000
--- a/dev/appveyor-guide.md
+++ /dev/null
@@ -1,186 +0,0 @@
----
-license: |
-  Licensed to the Apache Software Foundation (ASF) under one or more
-  contributor license agreements.  See the NOTICE file distributed with
-  this work for additional information regarding copyright ownership.
-  The ASF licenses this file to You under the Apache License, Version 2.0
-  (the "License"); you may not use this file except in compliance with
-  the License.  You may obtain a copy of the License at
- 
-     http://www.apache.org/licenses/LICENSE-2.0
- 
-  Unless required by applicable law or agreed to in writing, software
-  distributed under the License is distributed on an "AS IS" BASIS,
-  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-  See the License for the specific language governing permissions and
-  limitations under the License.
----
-
-# AppVeyor Guides
-
-Currently, SparkR on Windows is being tested with [AppVeyor](https://ci.appveyor.com). This page describes how to set up AppVeyor with Spark, how to run the build, check the status and stop the build via this tool. There is the documentation for AppVeyor [here](https://www.appveyor.com/docs). Please refer this for full details.
-
-
-### Setting up AppVeyor
-
-#### Sign up AppVeyor.
-
-- Go to https://ci.appveyor.com, and then click "SIGN UP FOR FREE".
-    
-  <img width="196" alt="2016-09-04 11 07 48" 
src="https://cloud.githubusercontent.com/assets/6477701/18228809/2c923aa4-7299-11e6-91b4-f39eff5727ba.png";>
-
-- As Apache Spark is one of open source projects, click "FREE - for 
open-source projects".
-    
-  <img width="379" alt="2016-09-04 11 07 58" 
src="https://cloud.githubusercontent.com/assets/6477701/18228810/2f674e5e-7299-11e6-929d-5c2dff269ddc.png";>
-
-- Click "GitHub".
-
-  <img width="360" alt="2016-09-04 11 08 10" 
src="https://cloud.githubusercontent.com/assets/6477701/18228811/344263a0-7299-11e6-90b7-9b1c7b6b8b01.png";>
-
-
-#### After signing up, go to profile to link GitHub and AppVeyor.
-
-- Click your account and then click "Profile".
-
-  <img width="204" alt="2016-09-04 11 09 43" 
src="https://cloud.githubusercontent.com/assets/6477701/18228803/12a4b810-7299-11e6-9140-5cfc277297b1.png";>
-
-- Enable the link with GitHub via clicking "Link GitHub account".
-
-  <img width="256" alt="2016-09-04 11 09 52" 
src="https://cloud.githubusercontent.com/assets/6477701/18228808/23861584-7299-11e6-9352-640a9c747c83.png";>
-
-- Click "Authorize application" in GitHub site.
-
-<img width="491" alt="2016-09-04 11 10 05" 
src="https://cloud.githubusercontent.com/assets/6477701/18228814/5cc239e0-7299-11e6-8aeb-71305e22d930.png";>
-
-
-#### Add a project, Spark to enable the builds.
-
-- Go to the PROJECTS menu.
-
-  <img width="97" alt="2016-08-30 12 16 31" 
src="https://cloud.githubusercontent.com/assets/6477701/18075017/2e572ffc-6eac-11e6-8e72-1531c81717a0.png";>
-
-- Click "NEW PROJECT" to add Spark.
-  
-  <img width="144" alt="2016-08-30 12 16 35" 
src="https://cloud.githubusercontent.com/assets/6477701/18075026/3ee57bc6-6eac-11e6-826e-5dd09aeb0e7c.png";>
-
-- Since we will use GitHub here, click the "GITHUB" button and then click 
"Authorize GitHub" so that AppVeyor can access the GitHub logs (e.g. commits).
-    
-  <img width="517" alt="2016-09-04 11 10 22" 
src="https://cloud.githubusercontent.com/assets/6477701/18228819/9a4d5722-7299-11e6-900c-c5ff6b0450b1.png";>
-
-- Click "Authorize application" from GitHub (the above step will pop up this 
page).
-
-  <img width="484" alt="2016-09-04 11 10 27" 
src="https://cloud.githubusercontent.com/assets/6477701/18228820/a7cfce02-7299-11e6-8ec0-1dd7807eecb7.png";>
-
-- Come back to https://ci.appveyor.com/projects/new and then adds "spark".
-
-  <img width="738" alt="2016-09-04 11 10 36" 
src="https://cloud.githubusercontent.com/assets/6477701/18228821/b4b35918-7299-11e6-968d-233f18bc2cc7.png";>
-
-
-#### Check if any event supposed to run the build actually triggers the build. 
-
-- Click "PROJECTS" menu.
-
-  <img width="97" alt="2016-08-30 12 16 31" 
src="https://cloud.githubusercontent.com/assets/6477701/18075017/2e572ffc-6eac-11e6-8e72-1531c81717a0.png";>
-
-- Click Spark project.
-
-  <img width="707" alt="2016-09-04 11 22 37" 
src="https://cloud.githubusercontent.com/assets/6477701/18228828/5174cad4-729a-11e6-8737-bb7b9e0703c8.png";>
-
-
-### Checking the status, restarting and stopping the build 
-
-- Click "PROJECTS" menu.
-
-  <img width="97" alt="2016-08-30 12 16 31" 
src="https://cloud.githubusercontent.com/assets/6477701/18075017/2e572ffc-6eac-11e6-8e72-1531c81717a0.png";>
-
-- Locate "spark" and click it.
-
-  <img width="707" alt="2016-09-04 11 22 37" 
src="https://cloud.githubusercontent.com/assets/6477701/18228828/5174cad4-729a-11e6-8737-bb7b9e0703c8.png";>
-
-- Here, we can check the status of current build. Also, "HISTORY" shows the 
past build history.
-
-  <img width="709" alt="2016-09-04 11 23 24" 
src="https://cloud.githubusercontent.com/assets/6477701/18228825/01b4763e-729a-11e6-8486-1429a88d2bdd.png";>
-
-- If the build is stopped, "RE-BUILD COMMIT" button appears. Click this button 
to restart the build.
-
-  <img width="176" alt="2016-08-30 12 29 41" 
src="https://cloud.githubusercontent.com/assets/6477701/18075336/de618b52-6eae-11e6-8f01-e4ce48963087.png";>
-
-- If the build is running, "CANCEL BUILD" button appears. Click this button to 
cancel the current build.
-
-  <img width="158" alt="2016-08-30 1 11 13" 
src="https://cloud.githubusercontent.com/assets/6477701/18075806/4de68564-6eb3-11e6-855b-ee22918767f9.png";>
-
-
-### Specifying the branch for building and setting the build schedule
-
-Note: It seems the configurations in UI and `appveyor.yml` are mutually exclusive according to the [documentation](https://www.appveyor.com/docs/build-configuration/#configuring-build).
-
-
-- Click the settings button on the right.
-
-  <img width="1010" alt="2016-08-30 1 19 12" src="https://cloud.githubusercontent.com/assets/6477701/18075954/65d1aefa-6eb4-11e6-9a45-b9a9295f5085.png">
-
-- Set the default branch to build as above.
-
-  <img width="422" alt="2016-08-30 12 42 25" src="https://cloud.githubusercontent.com/assets/6477701/18075416/8fac36c8-6eaf-11e6-9262-797a2a66fec4.png">
-
-- Specify the branch in order to exclude the builds in other branches.
-
-  <img width="358" alt="2016-08-30 12 42 33" src="https://cloud.githubusercontent.com/assets/6477701/18075421/97b17734-6eaf-11e6-8b19-bc1dca840c96.png">
-
-- Set the Crontab expression to regularly start the build. AppVeyor uses Crontab expression, [atifaziz/NCrontab](https://github.com/atifaziz/NCrontab/wiki/Crontab-Expression). Please refer the examples [here](https://github.com/atifaziz/NCrontab/wiki/Crontab-Examples).
-
-
-  <img width="471" alt="2016-08-30 12 42 43" src="https://cloud.githubusercontent.com/assets/6477701/18075450/d4ef256a-6eaf-11e6-8e41-74e38dac8ca0.png">
-
-
-### Filtering commits and Pull Requests
-
-Currently, AppVeyor is only used for SparkR. So, the build is only triggered when R codes are changed.
-
-This is specified in `.appveyor.yml` as below:
-
-```
-only_commits:
-  files:
-    - R/
-```
-
-Please refer https://www.appveyor.com/docs/how-to/filtering-commits for more details.
-
-
-### Checking the full log of the build
-
-Currently, the console in AppVeyor does not print full details. This can be manually checked. For example, AppVeyor shows the failed tests as below in console
-
-```
-Failed -------------------------------------------------------------------------
-1. Error: union on two RDDs (@test_binary_function.R#38) -----------------------
-1: textFile(sc, fileName) at C:/projects/spark/R/lib/SparkR/tests/testthat/test_binary_function.R:38
-2: callJMethod(sc, "textFile", path, getMinPartitions(sc, minPartitions))
-3: invokeJava(isStatic = FALSE, objId$id, methodName, ...)
-4: stop(readString(conn))
-```
-
-After downloading the log by clicking the log button as below:
-
-![2016-09-08 11 37 17](https://cloud.githubusercontent.com/assets/6477701/18335227/b07d0782-75b8-11e6-94da-1b88cd2a2402.png)
-
-the details can be checked as below (e.g. exceptions)
-
-```
-Failed -------------------------------------------------------------------------
-1. Error: spark.lda with text input (@test_mllib.R#655) ------------------------
- org.apache.spark.sql.AnalysisException: Path does not exist: file:/C:/projects/spark/R/lib/SparkR/tests/testthat/data/mllib/sample_lda_data.txt;
-    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$12.apply(DataSource.scala:376)
-    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$12.apply(DataSource.scala:365)
-    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
-    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
-    ...
-
- 1: read.text("data/mllib/sample_lda_data.txt") at C:/projects/spark/R/lib/SparkR/tests/testthat/test_mllib.R:655
- 2: dispatchFunc("read.text(path)", x, ...)
- 3: f(x, ...)
- 4: callJMethod(read, "text", paths)
- 5: invokeJava(isStatic = FALSE, objId$id, methodName, ...)
- 6: stop(readString(conn))
-```
diff --git a/dev/appveyor-install-dependencies.ps1 b/dev/appveyor-install-dependencies.ps1
deleted file mode 100644
index b37f1ee45f30..000000000000
--- a/dev/appveyor-install-dependencies.ps1
+++ /dev/null
@@ -1,153 +0,0 @@
-<#
-Licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements.  See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership.  The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License.  You may obtain a copy of the License at
-  http://www.apache.org/licenses/LICENSE-2.0
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied.  See the License for the
-specific language governing permissions and limitations
-under the License.
-#>
-
-$CRAN = "https://cloud.r-project.org";
-
-Function InstallR {
-  if ( -not(Test-Path Env:\R_ARCH) ) {
-    $arch = "x64"
-  }
-  Else {
-    $arch = $env:R_ARCH
-  }
-
-  $urlPath = ""
-  $latestVer = $(ConvertFrom-JSON $(Invoke-WebRequest https://rversions.r-pkg.org/r-release-win).Content).version
-  If ($rVer -ne $latestVer) {
-    $urlPath = ("old/" + $rVer + "/")
-  }
-
-  $rurl = $CRAN + "/bin/windows/base/" + $urlPath + "R-" + $rVer + "-win.exe"
-
-  # Downloading R
-  Start-FileDownload $rurl "R-win.exe"
-
-  # Running R installer
-  Start-Process -FilePath .\R-win.exe -ArgumentList "/VERYSILENT /DIR=C:\R" -NoNewWindow -Wait
-
-  $RDrive = "C:"
-  echo "R is now available on drive $RDrive"
-
-  $env:PATH = $RDrive + '\R\bin\' + $arch + ';' + 'C:\MinGW\msys\1.0\bin;' + $env:PATH
-
-  # Testing R installation
-  Rscript -e "sessionInfo()"
-}
-
-Function InstallRtools {
-  $rtoolsver = $rToolsVer.Split('.')[0..1] -Join ''
-  $rtoolsurl = $CRAN + "/bin/windows/Rtools/rtools$rtoolsver-x86_64.exe"
-
-  # Downloading Rtools
-  Start-FileDownload $rtoolsurl "Rtools-current.exe"
-
-  # Running Rtools installer
-  Start-Process -FilePath .\Rtools-current.exe -ArgumentList /VERYSILENT -NoNewWindow -Wait
-
-  $RtoolsDrive = "C:"
-  echo "Rtools is now available on drive $RtoolsDrive"
-
-  if ( -not(Test-Path Env:\GCC_PATH) ) {
-    $gccPath = "gcc-4.6.3"
-  }
-  Else {
-    $gccPath = $env:GCC_PATH
-  }
-  $env:PATH = $RtoolsDrive + '\Rtools40\bin;' + $RtoolsDrive + '\Rtools40\mingw64\bin;' + $RtoolsDrive + '\Rtools40\' + $gccPath + '\bin;' + $env:PATH
-  $env:BINPREF=$RtoolsDrive + '/Rtools40/mingw$(WIN)/bin/'
-}
-
-# create tools directory outside of Spark directory
-$up = (Get-Item -Path ".." -Verbose).FullName
-$tools = "$up\tools"
-if (!(Test-Path $tools)) {
-    New-Item -ItemType Directory -Force -Path $tools | Out-Null
-}
-
-# ========================== Maven
-# Push-Location $tools
-#
-# $mavenVer = "3.9.6"
-# Start-FileDownload "https://archive.apache.org/dist/maven/maven-3/$mavenVer/binaries/apache-maven-$mavenVer-bin.zip" "maven.zip"
-#
-# # extract
-# Invoke-Expression "7z.exe x maven.zip"
-#
-# # add maven to environment variables
-# $env:PATH = "$tools\apache-maven-$mavenVer\bin;" + $env:PATH
-# $env:M2_HOME = "$tools\apache-maven-$mavenVer"
-# $env:MAVEN_OPTS = "-Xmx2g -XX:ReservedCodeCacheSize=1g"
-#
-# Pop-Location
-
-Push-Location $tools
-
-# ========================== Java 17
-$zuluFileName="zulu17.44.53-ca-jdk17.0.8.1-win_x64"
-Start-FileDownload "https://cdn.azul.com/zulu/bin/$zuluFileName.zip"; "zulu.zip"
-
-# extract
-Invoke-Expression "7z.exe x zulu.zip"
-
-#add java 17 to environment variables
-$env:JAVA_HOME = "$tools\$zuluFileName"
-$env:PATH = "$JAVA_HOME\bin;" + $env:PATH
-
-# ========================== SBT
-$sbtVer = "1.9.3"
-Start-FileDownload "https://github.com/sbt/sbt/releases/download/v$sbtVer/sbt-$sbtVer.zip" "sbt.zip"
-
-# extract
-Invoke-Expression "7z.exe x sbt.zip"
-
-# add sbt to environment variables
-$env:PATH = "$tools\sbt\bin;" + $env:PATH
-
-Pop-Location
-
-# ========================== Hadoop bin package
-# This must match the version at https://github.com/cdarlint/winutils/tree/master/hadoop-3.3.5
-$hadoopVer = "3.3.5"
-$hadoopPath = "$tools\hadoop"
-if (!(Test-Path $hadoopPath)) {
-    New-Item -ItemType Directory -Force -Path $hadoopPath | Out-Null
-}
-Push-Location $hadoopPath
-
-Start-FileDownload "https://codeload.github.com/cdarlint/winutils/zip/master"; 
"winutils-master.zip"
-
-# extract
-Invoke-Expression "7z.exe x winutils-master.zip"
-
-# add hadoop bin to environment variables
-$env:HADOOP_HOME = "$hadoopPath\winutils-master\hadoop-$hadoopVer"
-$env:PATH = "$env:HADOOP_HOME\bin;" + $env:PATH
-
-Pop-Location
-
-# ========================== R
-$rVer = "4.3.2"
-$rToolsVer = "4.0.2"
-
-InstallR
-InstallRtools
-
-$env:R_LIBS_USER = 'c:\RLibrary'
-if ( -not(Test-Path $env:R_LIBS_USER) ) {
-  mkdir $env:R_LIBS_USER
-}
-
diff --git a/dev/sparktestsupport/utils.py b/dev/sparktestsupport/utils.py
index 441ae3cf8be5..8215628c1942 100755
--- a/dev/sparktestsupport/utils.py
+++ b/dev/sparktestsupport/utils.py
@@ -34,20 +34,17 @@ def determine_modules_for_files(filenames):
     Given a list of filenames, return the set of modules that contain those files.
     If a file is not associated with a more specific submodule, then this method will consider that
     file to belong to the 'root' module. `.github` directory is counted only in GitHub Actions,
-    and `appveyor.yml` is always ignored because this file is dedicated only to AppVeyor builds,
-    and `README.md` is always ignored too.
+    and `README.md` is always ignored.
 
     >>> sorted(x.name for x in determine_modules_for_files(["python/pyspark/a.py", "sql/core/foo"]))
     ['pyspark-core', 'pyspark-errors', 'sql']
     >>> [x.name for x in determine_modules_for_files(["file_not_matched_by_any_subproject"])]
     ['root']
-    >>> [x.name for x in determine_modules_for_files(["appveyor.yml", "sql/README.md"])]
+    >>> [x.name for x in determine_modules_for_files(["sql/README.md"])]
     []
     """
     changed_modules = set()
     for filename in filenames:
-        if filename in ("appveyor.yml",):
-            continue
         if filename.endswith("README.md"):
             continue
         if ("GITHUB_ACTIONS" not in os.environ) and 
filename.startswith(".github"):
diff --git a/project/build.properties b/project/build.properties
index e883bb7bdf3f..41f6be168792 100644
--- a/project/build.properties
+++ b/project/build.properties
@@ -14,5 +14,4 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
-# Please update the version in appveyor-install-dependencies.ps1 together.
 sbt.version=1.9.3

