dombizita commented on code in PR #9175: URL: https://github.com/apache/ozone/pull/9175#discussion_r2447777226
##########
hadoop-hdds/docs/content/interface/HttpFS.md:
##########
@@ -45,24 +45,102 @@ HttpFS has built-in security supporting Hadoop pseudo authentication and Kerbero
 
 HttpFS service itself is a Jetty based web-application that uses the Hadoop FileSystem API to talk to the cluster, it is a separate service which provides access to Ozone via a REST APIs. It should be started in addition to other regular Ozone components.
 
-To try it out, you can start a Docker Compose dev cluster that has an HttpFS gateway.
+To try it out, follow the instruction from the link below and start the Ozone cluster with Docker Compose.

Review Comment:
   nit
   ```suggestion
   To try it out, follow the instructions from the link below to start the Ozone cluster with Docker Compose.
   ```



##########
hadoop-hdds/docs/content/interface/HttpFS.md:
##########
@@ -45,24 +45,102 @@ HttpFS has built-in security supporting Hadoop pseudo authentication and Kerbero
 
 HttpFS service itself is a Jetty based web-application that uses the Hadoop FileSystem API to talk to the cluster, it is a separate service which provides access to Ozone via a REST APIs. It should be started in addition to other regular Ozone components.
 
-To try it out, you can start a Docker Compose dev cluster that has an HttpFS gateway.
+To try it out, follow the instruction from the link below and start the Ozone cluster with Docker Compose.
 
-Extract the release tarball, go to the `compose/ozone` directory and start the cluster:
+https://ozone.apache.org/docs/edge/start/startfromdockerhub.html
 
 ```bash
-docker-compose up -d --scale datanode=3
+docker compose up -d --scale datanode=3
 ```
 
-You can/should find now the HttpFS gateway in docker with the name `ozone_httpfs`.
-HttpFS HTTP web-service API calls are HTTP REST calls that map to an Ozone file system operation. For example, using the `curl` Unix command.
+You can/should find now the HttpFS gateway in docker with the name like `ozone_httpfs`,
+and it can be accessed through `localhost:14000`.

Review Comment:
   Maybe we can mention that 14000 is the default port and that it can be changed with the `httpfs.http.port` config.



##########
hadoop-hdds/docs/content/interface/HttpFS.md:
##########
@@ -45,24 +45,102 @@ HttpFS has built-in security supporting Hadoop pseudo authentication and Kerbero
 
 HttpFS service itself is a Jetty based web-application that uses the Hadoop FileSystem API to talk to the cluster, it is a separate service which provides access to Ozone via a REST APIs. It should be started in addition to other regular Ozone components.
 
-To try it out, you can start a Docker Compose dev cluster that has an HttpFS gateway.
+To try it out, follow the instruction from the link below and start the Ozone cluster with Docker Compose.
 
-Extract the release tarball, go to the `compose/ozone` directory and start the cluster:
+https://ozone.apache.org/docs/edge/start/startfromdockerhub.html

Review Comment:
   I'd suggest linking this here instead: https://ozone.apache.org/docs/edge/start/runningviadocker.html
   What do you think?


--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
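
As context for the review comment about the gateway's port, here is a minimal sketch of how one might locate the HttpFS container started by the quoted `docker compose up` command and probe it over HTTP. The container-name filter, the `user.name` value, and the assumption that the default port 14000 is in effect (rather than an overridden `httpfs.http.port`) are illustrative choices, not taken from the PR.

```bash
# Sketch, not from the PR: find the HttpFS gateway container and its published ports.
docker ps --filter "name=httpfs" --format '{{.Names}}\t{{.Ports}}'

# A LISTSTATUS call against the root path should return HTTP 200 when the gateway is up.
# Port 14000 is assumed to be the default; "hadoop" is an arbitrary user.name for pseudo auth.
curl -s -o /dev/null -w '%{http_code}\n' \
  "http://localhost:14000/webhdfs/v1/?op=LISTSTATUS&user.name=hadoop"
```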

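The quoted documentation also states that HttpFS REST calls map to Ozone file system operations invoked with `curl`. A few hedged examples of what such WebHDFS-style calls look like, assuming the gateway at `localhost:14000`, pseudo authentication with an arbitrary `user.name`, and a pre-existing `/vol1/bucket1` path; none of these specifics come from the PR.

```bash
# Hedged examples only; the port, user.name and paths are assumptions.

# Create a directory (maps to a mkdir on the Ozone file system).
curl -X PUT "http://localhost:14000/webhdfs/v1/vol1/bucket1/dir1?op=MKDIRS&user.name=hadoop"

# Fetch the status of the new directory (maps to getFileStatus).
curl "http://localhost:14000/webhdfs/v1/vol1/bucket1/dir1?op=GETFILESTATUS&user.name=hadoop"

# List the bucket's contents (maps to listStatus).
curl "http://localhost:14000/webhdfs/v1/vol1/bucket1?op=LISTSTATUS&user.name=hadoop"
```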