This is an automated email from the ASF dual-hosted git repository.
jiayu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-sedona.git
The following commit(s) were added to refs/heads/master by this push:
new f71c37c Update docs
f71c37c is described below
commit f71c37c36d4b85e9ac95335e957612d2c95cf0f4
Author: Jia Yu <[email protected]>
AuthorDate: Thu Jan 7 00:42:58 2021 -0800
Update docs
---
docs/download/compile.md | 62 +++++++++++++-----------------------------------
1 file changed, 17 insertions(+), 45 deletions(-)
diff --git a/docs/download/compile.md b/docs/download/compile.md
index 1ac83b9..7907a94 100644
--- a/docs/download/compile.md
+++ b/docs/download/compile.md
@@ -6,7 +6,7 @@
## Compile Scala and Java source code
Sedona Scala/Java code is a project with four modules: core, sql, viz and python adapter. Each module is a Scala/Java mixed project managed by Apache Maven 3.
-* Make sure your machine has Java 1.8 and Apache Maven 3.
+* Make sure your Linux/Mac machine has Java 1.8, Apache Maven 3.3.1+, and Python3. The compilation of Sedona is not tested on Windows machines.
To compile all modules, please make sure you are in the root folder of the project. Then enter the following command in the terminal:
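The compile command itself falls outside this diff hunk; as a sketch (the exact goal is an assumption based on the release commands later in this doc), compiling all modules from the root folder typically looks like:

```shell
# Sketch (assumption): compile all modules from the project root, skipping tests.
# The command is printed rather than executed so this sketch runs even where
# Maven is absent; drop the echo and quotes to run it for real.
echo 'mvn clean install -DskipTests'
```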
@@ -138,45 +138,33 @@ The detailed requirement is on [ASF Infra website](https://infra.apache.org/publ
#### Prepare for Spark 3.0 and Scala 2.12
-1. Convert source code to Spark 3 format
-```
-python3 spark-version-converter.py spark3
-```
-2. Prepare the SNAPSHOTs
+1. Prepare the SNAPSHOTs
```
mvn clean -Darguments="-DskipTests" release:prepare -DdryRun=true -DautoVersionSubmodules=true -Dresume=false
```
-3. Deploy the SNAPSHOTs
+2. Deploy the SNAPSHOTs
```
mvn deploy -DskipTests
```
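Step 1's `-DdryRun=true` only simulates the release; as a sanity check (a sketch; the artifact names are the ones maven-release-plugin documents for dry runs), you can look for the simulation files before deploying:

```shell
# A dry-run release:prepare leaves simulation artifacts (release.properties,
# pom.xml.tag, pom.xml.next) instead of making commits; list them to confirm
# the dry run happened, or print a fallback message if none are present.
ls release.properties pom.xml.tag pom.xml.next 2>/dev/null \
  || echo "no dry-run artifacts found"
```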
#### Prepare for Spark 2.4 and Scala 2.11
-1. Convert source code to Spark 2 format
-```
-python3 spark-version-converter.py spark2
-```
-2. Prepare the SNAPSHOTs
+1. Prepare the SNAPSHOTs
```
mvn clean release:prepare -DdryRun=true -DautoVersionSubmodules=true -Dresume=false -DcheckModificationExcludeList=sql/src/main/scala/org/apache/sedona/sql/UDF/UdfRegistrator.scala,sql/src/main/scala/org/apache/spark/sql/sedona_sql/strategy/join/JoinQueryDetector.scala,sql/src/main/scala/org/apache/spark/sql/sedona_sql/strategy/join/TraitJoinQueryExec.scala -Darguments="-DskipTests -Dscala=2.11 -Dspark=2.4"
```
-3. Deploy the SNAPSHOTs
+2. Deploy the SNAPSHOTs
```
mvn deploy -DskipTests -Dscala=2.11 -Dspark=2.4
```
#### Prepare for Spark 2.4 and Scala 2.12
-1. Convert source code to Spark 2 format
-```
-python3 spark-version-converter.py spark2
-```
-2. Prepare the SNAPSHOTs
+1. Prepare the SNAPSHOTs
```
mvn clean release:prepare -DdryRun=true -DautoVersionSubmodules=true -Dresume=false -DcheckModificationExcludeList=sql/src/main/scala/org/apache/sedona/sql/UDF/UdfRegistrator.scala,sql/src/main/scala/org/apache/spark/sql/sedona_sql/strategy/join/JoinQueryDetector.scala,sql/src/main/scala/org/apache/spark/sql/sedona_sql/strategy/join/TraitJoinQueryExec.scala -Darguments="-DskipTests -Dscala=2.12 -Dspark=2.4"
```
-3. Deploy the SNAPSHOTs
+2. Deploy the SNAPSHOTs
```
mvn deploy -DskipTests -Dscala=2.12 -Dspark=2.4
```
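The three SNAPSHOT targets above differ only in the `-Dscala`/`-Dspark` pair handed to Maven; as a sketch, the deploy matrix can be expressed as a loop (the flag handling mirrors the commands above; nothing here actually invokes Maven):

```shell
# Print the deploy command for each Spark/Scala target covered above.
for target in "3.0:2.12" "2.4:2.11" "2.4:2.12"; do
  spark=${target%:*}; scala=${target#*:}
  if [ "$spark" = "3.0" ]; then
    # The Spark 3.0 build above passes no -Dscala/-Dspark flags.
    echo "mvn deploy -DskipTests"
  else
    echo "mvn deploy -DskipTests -Dscala=$scala -Dspark=$spark"
  fi
done
```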
@@ -189,60 +177,44 @@ mvn deploy -DskipTests -Dscala=2.12 -Dspark=2.4
#### For Spark 3.0 and Scala 2.12
-1. Convert source code to Spark 3 format
-```bash
-python3 spark-version-converter.py spark3
-```
-2. Prepare a release. Manually enter the following variables in the terminal:
release id: ==1.0.0-incubator==, scm tag id: ==sedona-1.0.0-incubator== (this
is just an example. Please use the correct version number). You also need to
provide GitHub username and password three times.
+1. Prepare a release. Manually enter the following variables in the terminal: release id: ==1.0.0-incubator==, scm tag id: ==sedona-1.0.0-incubator== (this is just an example; please use the correct version number). You also need to provide your GitHub username and password three times.
```bash
mvn clean release:prepare -DautoVersionSubmodules=true -Dresume=false -Darguments="-DskipTests"
```
-3. Stage a release
+2. Stage a release
```bash
mvn clean release:perform -DautoVersionSubmodules=true -Dresume=false -Darguments="-DskipTests"
```
-4. Now the releases are staged. A tag and two commits will be created on
Sedona GitHub repo.
-5. Delete the scm tag on GitHub and we will only keep the tag created by the
last compilation target.
+3. Now the releases are staged. A tag and two commits will be created on the Sedona GitHub repo.
+4. Delete the scm tag on GitHub; we will keep only the tag created by the last compilation target.
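Deleting the scm tag (step 4) can also be done from the command line; a sketch, assuming the example tag name used above (substitute the real version):

```shell
# Remove the scm tag that release:prepare pushed to GitHub
# (tag name is the example from the text; use your actual tag).
git push --delete origin sedona-1.0.0-incubator
# Drop the local copy as well so the next target can recreate it cleanly.
git tag -d sedona-1.0.0-incubator
```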
Now let's repeat the process for the other compilation targets.
#### For Spark 2.4 and Scala 2.11
-1. Convert source code to Spark 2 format
-```bash
-python3 spark-version-converter.py spark2
-```
-2. Manuallly commit the changes of the three scala files to GitHub
-3. Prepare a release. Note that: release id: ==1.0.0-incubator==, scm tag id:
==sedona-1.0.0-incubator== (this is just an example. Please use the correct
version number)
+1. Prepare a release. Note: release id: ==1.0.0-incubator==, scm tag id: ==sedona-1.0.0-incubator== (this is just an example; please use the correct version number).
```bash
mvn clean release:prepare -DautoVersionSubmodules=true -Dresume=false -DcheckModificationExcludeList=sql/src/main/scala/org/apache/sedona/sql/UDF/UdfRegistrator.scala,sql/src/main/scala/org/apache/spark/sql/sedona_sql/strategy/join/JoinQueryDetector.scala,sql/src/main/scala/org/apache/spark/sql/sedona_sql/strategy/join/TraitJoinQueryExec.scala -Darguments="-DskipTests -Dscala=2.11 -Dspark=2.4"
```
-4. Stage a release
+2. Stage a release
```bash
mvn clean release:perform -DautoVersionSubmodules=true -Dresume=false -Darguments="-DskipTests -Dscala=2.11 -Dspark=2.4"
```
-5. Delete the scm tag on GitHub and we will only keep the tag created by the
last compilation target.
+3. Delete the scm tag on GitHub; we will keep only the tag created by the last compilation target.
#### For Spark 2.4 and Scala 2.12
-Step 1 and 2 are only needed if you didn't run the previous step before
-
-1. Convert source code to Spark 2 format
-```bash
-python3 spark-version-converter.py spark2
-```
-2. ==Manuallly commit the changes of the three scala files to GitHub==
-3. Prepare a release: release id: ==1.0.0-incubator==, scm tag id:
==sedona-1.0.0-incubator== (this is just an example. Please use the correct
version number)
+1. Prepare a release: release id: ==1.0.0-incubator==, scm tag id: ==sedona-1.0.0-incubator== (this is just an example; please use the correct version number).
```bash
mvn clean release:prepare -DautoVersionSubmodules=true -Dresume=false -DcheckModificationExcludeList=sql/src/main/scala/org/apache/sedona/sql/UDF/UdfRegistrator.scala,sql/src/main/scala/org/apache/spark/sql/sedona_sql/strategy/join/JoinQueryDetector.scala,sql/src/main/scala/org/apache/spark/sql/sedona_sql/strategy/join/TraitJoinQueryExec.scala -Darguments="-DskipTests -Dscala=2.12 -Dspark=2.4"
```
-4. Stage a release
+2. Stage a release
```bash
mvn clean release:perform -DautoVersionSubmodules=true -Dresume=false -Darguments="-DskipTests -Dscala=2.12 -Dspark=2.4"
```
!!!warning
- After staged the three releases, you need to manually revert the
commited three scala files. You will see 6 [maven-release-plugin] commits and 1
more tag in Sedona GitHub repo.
+ After staging the three releases, you will see 6 [maven-release-plugin] commits and 1 more tag in the Sedona GitHub repo.
### Close the staging repo
1. Check the status of the staging repo: [Locate and Examine Your Staging
Repository