nchammas commented on code in PR #51305:
URL: https://github.com/apache/spark/pull/51305#discussion_r2173405412


##########
docs/spark-connect-setup.md:
##########
@@ -0,0 +1,326 @@
+---
+layout: global
+title: Setting up Spark Connect
+license: |
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements.  See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+---
+
+Spark Connect supports PySpark and Scala applications. We will walk through how to run an
+Apache Spark server with Spark Connect and connect to it from a client application using the
+Spark Connect client library.
+
+* This will become a table of contents (this text will be scraped).
+{:toc}
+
+## Download and start Spark server with Spark Connect
+
+First, download Spark from the
+[Download Apache Spark](https://spark.apache.org/downloads.html) page. Choose the
+latest release from the release drop-down at the top of the page. Then choose your
+package type, typically “Pre-built for Apache Hadoop 3.3 and later”, and click the
+link to download.
+
+Now extract the Spark package you just downloaded, for example:
+
+```bash
+tar -xvf spark-{{site.SPARK_VERSION_SHORT}}-bin-hadoop3.tgz
+```
+
+In a terminal window, go to the `spark` folder in the location where you extracted
+Spark and run the `start-connect-server.sh` script to start the Spark server with
+Spark Connect. If you already have Spark installed and `SPARK_HOME` defined, you
+can use that instead.
+
+```bash
+cd spark/
+./sbin/start-connect-server.sh
+
+# alternatively, if SPARK_HOME is defined
+"$SPARK_HOME/sbin/start-connect-server.sh"
+```
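+
+Once the server starts, it listens for client connections (on port 15002 by
+default). As a quick sanity check, you can open a PySpark shell against it by
+passing a `sc://` connection string to the `--remote` option, for example:
+
+```bash
+./bin/pyspark --remote "sc://localhost:15002"
+```
+
+If the connection succeeds, the shell banner will indicate it is running in
+Spark Connect mode rather than against a local Spark session.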

Review Comment:
   I believe the general installation instructions are on [this page](https://spark.apache.org/docs/latest/index.html).



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

