This is an automated email from the ASF dual-hosted git repository.

maxgekk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 7eec8be37a87 [MINOR][DOCS] Adjust formatting of "Building Spark" docs
7eec8be37a87 is described below

commit 7eec8be37a871cf62b2cb0b22be0734bb1a31041
Author: Nicholas Chammas <nicholas.cham...@gmail.com>
AuthorDate: Sun Jan 21 23:44:36 2024 +0300

    [MINOR][DOCS] Adjust formatting of "Building Spark" docs
    
    ### What changes were proposed in this pull request?
    
    Update the "Building Spark" docs as follows:
    - Add mention of Scaleway to the Jenkins section so developers understand its relevance (since we retired the project's own Jenkins infra some time back).
    - Add syntax highlighting to some of the code blocks.
    - Adjust the headings of some sections that were set too deep.
    
    ### Why are the changes needed?
    
    This makes the guide a bit more up-to-date and easier to read.
    
    ### Does this PR introduce _any_ user-facing change?
    
    Yes, it changes the public documentation on building Spark.
    
    ### How was this patch tested?
    
    I manually built the docs and reviewed the results.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #44804 from nchammas/building-spark-docs.
    
    Authored-by: Nicholas Chammas <nicholas.cham...@gmail.com>
    Signed-off-by: Max Gekk <max.g...@gmail.com>
---
 docs/building-spark.md | 38 +++++++++++++++++++++++++-------------
 1 file changed, 25 insertions(+), 13 deletions(-)

diff --git a/docs/building-spark.md b/docs/building-spark.md
index 9fea22426978..3d12b521c024 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -34,7 +34,9 @@ Spark requires Scala 2.13; support for Scala 2.12 was removed in Spark 4.0.0.
 
 You'll need to configure Maven to use more memory than usual by setting `MAVEN_OPTS`:
 
-    export MAVEN_OPTS="-Xss64m -Xmx2g -XX:ReservedCodeCacheSize=1g"
+```sh
+export MAVEN_OPTS="-Xss64m -Xmx2g -XX:ReservedCodeCacheSize=1g"
+```
 
 (The `ReservedCodeCacheSize` setting is optional but recommended.)
 If you don't add these parameters to `MAVEN_OPTS`, you may see errors and warnings like the following:
@@ -171,12 +173,16 @@ assembly JAR (for developers who build with SBT).  For more information about ho
 
 When building on an encrypted filesystem (if your home directory is encrypted, for example), then the Spark build might fail with a "Filename too long" error. As a workaround, add the following in the configuration args of the `scala-maven-plugin` in the project `pom.xml`:
 
-    <arg>-Xmax-classfile-name</arg>
-    <arg>128</arg>
+```xml
+<arg>-Xmax-classfile-name</arg>
+<arg>128</arg>
+```
 
 and in `project/SparkBuild.scala` add:
 
-    scalacOptions in Compile ++= Seq("-Xmax-classfile-name", "128"),
+```scala
+scalacOptions in Compile ++= Seq("-Xmax-classfile-name", "128"),
+```
 
 to the `sharedSettings` val. See also [this PR](https://github.com/apache/spark/pull/2883/files) if you are unsure of where to add these lines.
 
@@ -214,7 +220,7 @@ If you are building Spark for use in a Python environment and you wish to pip in
 
 **Note:** Due to packaging requirements you can not directly pip install from the Python directory, rather you must first build the sdist package as described above.
 
-Alternatively, you can also run make-distribution with the --pip option.
+Alternatively, you can also run `make-distribution.sh` with the `--pip` option.
 
 ## PySpark Tests with Maven or SBT
 
@@ -276,11 +282,15 @@ Enable the profile (e.g. 2.13):
 
 ## Running Jenkins tests with GitHub Enterprise
 
+While the Spark project does not maintain its own Jenkins infrastructure, [community members like Scaleway][scaleway] do.
+
+[scaleway]: https://spark.apache.org/developer-tools.html#scaleway
+
 To run tests with Jenkins:
 
     ./dev/run-tests-jenkins
 
-If use an individual repository or a repository on GitHub Enterprise, export below environment variables before running above command.
+If you use an individual repository or a repository on GitHub Enterprise, export the environment variables below before running the above command.
 
 ### Related environment variables
 
@@ -302,7 +312,7 @@ If use an individual repository or a repository on GitHub Enterprise, export bel
 </tr>
 </table>
 
-### Building and testing on IPv6-only environment
+# Building and testing on an IPv6-only environment
 
 Use Apache Spark GitBox URL because GitHub doesn't support IPv6 yet.
 
@@ -310,13 +320,15 @@ Use Apache Spark GitBox URL because GitHub doesn't support IPv6 yet.
 
 To build and run tests on IPv6-only environment, the following configurations are required.
 
-    export SPARK_LOCAL_HOSTNAME="your-IPv6-address" # e.g. '[2600:1700:232e:3de0:...]'
-    export DEFAULT_ARTIFACT_REPOSITORY=https://ipv6.repo1.maven.org/maven2/
-    export MAVEN_OPTS="-Djava.net.preferIPv6Addresses=true"
-    export SBT_OPTS="-Djava.net.preferIPv6Addresses=true"
-    export SERIAL_SBT_TESTS=1
+```sh
+export SPARK_LOCAL_HOSTNAME="your-IPv6-address" # e.g. '[2600:1700:232e:3de0:...]'
+export DEFAULT_ARTIFACT_REPOSITORY=https://ipv6.repo1.maven.org/maven2/
+export MAVEN_OPTS="-Djava.net.preferIPv6Addresses=true"
+export SBT_OPTS="-Djava.net.preferIPv6Addresses=true"
+export SERIAL_SBT_TESTS=1
+```
 
-### Building with user-defined `protoc`
+# Building with a user-defined `protoc`
 
 When the user cannot use the official `protoc` binary files to build the `core` module in the compilation environment, for example, compiling `core` module on CentOS 6 or CentOS 7 which the default `glibc` version is less than 2.14, we can try to compile and test by specifying the user-defined `protoc` binary files as follows:
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
