gh-yzou commented on code in PR #1991:
URL: https://github.com/apache/polaris/pull/1991#discussion_r2180963410


##########
plugins/spark/README.md:
##########
@@ -66,23 +63,31 @@ bin/spark-shell \
 --conf spark.sql.sources.useV1SourceList=''
 ```
 
-Assume the path to the built Spark client jar is `/polaris/plugins/spark/v3.5/spark/build/2.12/libs/polaris-spark-3.5_2.12-0.11.0-beta-incubating-SNAPSHOT-bundle.jar`
-and the name of the catalog is `polaris`. The cli command will look like following:
+The polaris version can be found in versions.txt in the Polaris root project dir.
+
+# Build and run with Polaris spark bundle JAR
+The polaris-spark-bundle project is used to build the Polaris Spark bundle JAR. The resulting JAR will follow this naming format:
+polaris-spark-bundle-<spark_version>_<scala_version>-<polaris_version>.jar
+For example:
+polaris-spark-bundle-3.5_2.12-1.1.0-incubating-SNAPSHOT.jar
+
+Run `./gradlew assemble` to build the entire Polaris project without running tests. After the build completes,
+the bundle JAR can be found under: plugins/spark/v3.5/spark-bundle/build/<scala_version>/libs/.
+To start Spark using the bundle JAR, specify it with the `--jars` option as shown below:
 
 ```shell
 bin/spark-shell \
---jars /polaris/plugins/spark/v3.5/spark/build/2.12/libs/polaris-spark-3.5_2.12-0.11.0-beta-incubating-SNAPSHOT-bundle.jar \
+--jars <path-to-spark-client-jar> \
 --packages org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1 \
 --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension \
 --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
---conf spark.sql.catalog.polaris.warehouse=<catalog-name> \
---conf spark.sql.catalog.polaris.header.X-Iceberg-Access-Delegation=vended-credentials \
---conf spark.sql.catalog.polaris=org.apache.polaris.spark.SparkCatalog \
---conf spark.sql.catalog.polaris.uri=http://localhost:8181/api/catalog \
---conf spark.sql.catalog.polaris.credential="root:secret" \
---conf spark.sql.catalog.polaris.scope='PRINCIPAL_ROLE:ALL' \
---conf spark.sql.catalog.polaris.token-refresh-enabled=true \
+--conf spark.sql.catalog.<catalog-name>.warehouse=<catalog-name> \

Review Comment:
   This is the shell command without the actual values filled in; I added an example to the README with specific values filled in.
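
   For reference, a minimal filled-in sketch, assuming the catalog is named `polaris`, the bundle JAR was built for Scala 2.12 at version 1.1.0-incubating-SNAPSHOT, and the `root:secret` credential and localhost URI from the existing example still apply; adjust the JAR path and values for your environment:

   ```shell
   # Hypothetical filled-in example; JAR path, URI, and credentials depend on your build and deployment.
   bin/spark-shell \
   --jars plugins/spark/v3.5/spark-bundle/build/2.12/libs/polaris-spark-bundle-3.5_2.12-1.1.0-incubating-SNAPSHOT.jar \
   --packages org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1 \
   --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension \
   --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
   --conf spark.sql.catalog.polaris=org.apache.polaris.spark.SparkCatalog \
   --conf spark.sql.catalog.polaris.uri=http://localhost:8181/api/catalog \
   --conf spark.sql.catalog.polaris.warehouse=polaris \
   --conf spark.sql.catalog.polaris.header.X-Iceberg-Access-Delegation=vended-credentials \
   --conf spark.sql.catalog.polaris.credential="root:secret" \
   --conf spark.sql.catalog.polaris.scope='PRINCIPAL_ROLE:ALL' \
   --conf spark.sql.catalog.polaris.token-refresh-enabled=true \
   --conf spark.sql.sources.useV1SourceList=''
   ```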


