unknowntpo commented on code in PR #9175:
URL: https://github.com/apache/ozone/pull/9175#discussion_r2458571445
########## hadoop-hdds/docs/content/interface/HttpFS.md:
##########
@@ -45,24 +45,102 @@ HttpFS has built-in security supporting Hadoop pseudo authentication and Kerbero
 HttpFS service itself is a Jetty based web-application that uses the Hadoop FileSystem API to talk to the cluster, it is a separate service which provides access to Ozone via a REST APIs. It should be started in addition to other regular Ozone components.
 
-To try it out, you can start a Docker Compose dev cluster that has an HttpFS gateway.
+To try it out, follow the instruction from the link below and start the Ozone cluster with Docker Compose.
 
-Extract the release tarball, go to the `compose/ozone` directory and start the cluster:
+https://ozone.apache.org/docs/edge/start/startfromdockerhub.html

Review Comment:

@dombizita I have a different opinion. This doc lives on the official `apache/ozone` website, and I believe it should be beginner friendly. People who want to try out Ozone can easily follow https://ozone.apache.org/docs/edge/start/startfromdockerhub.html to download the Docker Compose file, and they are good to go.

If we point to https://ozone.apache.org/docs/edge/start/runningviadocker.html here instead, it means they need to:

1. Clone the entire `apache/ozone` repository.
2. Build the distribution.
3. Go to `hadoop-ozone/dist/src/main/compose/ozone/docker-compose.yaml` and execute `docker compose up -d`.

That is not very beginner friendly.
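For concreteness, that workflow looks roughly like the sketch below. The clone target, compose directory, and `docker compose up -d` come straight from the steps above; the Maven invocation is an assumption on my part, not copied from the Ozone docs, so the exact build command may differ.

```bash
# Rough sketch of the workflow implied by runningviadocker.html (steps 1-3 above).
# NOTE: the Maven command is an assumed/typical build invocation; consult the
# linked page for the exact instructions.

# 1. Clone the entire apache/ozone repository.
git clone https://github.com/apache/ozone.git
cd ozone

# 2. Build the distribution (assumed standard Maven build, tests skipped).
mvn clean install -DskipTests

# 3. Go to the compose directory and bring the cluster up.
cd hadoop-ozone/dist/src/main/compose/ozone
docker compose up -d
```

By contrast, the `startfromdockerhub.html` path only asks the reader to download the published Docker Compose file and run `docker compose up -d`.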
