hudi-agent commented on code in PR #18687:
URL: https://github.com/apache/hudi/pull/18687#discussion_r3181105171
##########
docker/README.md:
##########
@@ -43,6 +43,24 @@ To build all docker images locally, you can run the script:
./build_local_docker_images.sh
```
+To build the Docker demo images directly with the script, run it from under `<HUDI_REPO_DIR>/docker`:
+
+```shell
+./build_docker_images.sh
+```
+
+You can override the Hadoop, Spark, and Hive versions from the command line. Use one of the supported version combinations under `docker/compose` when doing so.
+
+```shell
+# For example, to build the image set used by
+# docker-compose_hadoop340_hive313_spark401_{amd64,arm64}.yml
+./build_docker_images.sh --hadoop-version 3.4.0 --spark-version 4.0.1 --hive-version 3.1.3
+```
+
+By default, the script builds images for the current machine architecture and derives the version tag from the root
Review Comment:
🤖 It might help to call out the specific combination needed for the Hudi
demo flow. `setup_demo.sh` hardcodes
`docker-compose_hadoop334_hive313_spark353_{amd64,arm64}.yml`, which expects
image tags like `apachehudi/hudi-hadoop_3.3.4-...:latest`. As written, a reader
who runs `./build_docker_images.sh` with no flags gets `hudi-hadoop_2.8.4-...`
images (script defaults), and the example here builds `hudi-hadoop_3.4.0-...`
images — neither matches what `setup_demo.sh` will pull. Consider adding an
example like `./build_docker_images.sh --hadoop-version 3.3.4 --hive-version
3.1.3 --spark-version 3.5.3` and noting it's the combo aligned with
`setup_demo.sh`, so users following the demo path know which flags to pass.
<sub><i>- AI-generated; verify before applying. React 👍/👎 to flag
quality.</i></sub>
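The suggested demo-aligned invocation, and the image-tag prefix it should produce, can be sketched as follows. This is a hedged illustration: the exact tag suffix (elided as `...` in the comment above) and repository naming must be verified against `build_docker_images.sh` and `docker-compose_hadoop334_hive313_spark353_{amd64,arm64}.yml`.

```shell
# Versions the Hudi demo flow (setup_demo.sh) expects, per the review above.
HADOOP_VERSION=3.3.4
HIVE_VERSION=3.1.3
SPARK_VERSION=3.5.3

# The build command aligned with setup_demo.sh (shown, not executed here).
echo "./build_docker_images.sh --hadoop-version ${HADOOP_VERSION} --hive-version ${HIVE_VERSION} --spark-version ${SPARK_VERSION}"

# Tag prefix the demo compose file is said to pull; suffix elided as in the comment.
echo "apachehudi/hudi-hadoop_${HADOOP_VERSION}-...:latest"
```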
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]