This is an automated email from the ASF dual-hosted git repository.

danhaywood pushed a commit to branch release-2.0.0-RC3-RC1
in repository https://gitbox.apache.org/repos/asf/causeway.git
commit 2ed85a58b6e96d757e70ad188ee55d05fc077ff6
Author: danhaywood <[email protected]>
AuthorDate: Tue Oct 10 06:45:48 2023 +0100

    CAUSEWAY-3528 : further minor doc improvements
---
 .../ROOT/partials/publish-and-index-website.adoc | 20 ++++++++++++++------
 .../partials/module-nav/demos-and-tutorials.adoc |  3 +--
 2 files changed, 15 insertions(+), 8 deletions(-)

diff --git a/antora/components/comguide/modules/ROOT/partials/publish-and-index-website.adoc b/antora/components/comguide/modules/ROOT/partials/publish-and-index-website.adoc
index a022ba0bdc..fb806e82bb 100644
--- a/antora/components/comguide/modules/ROOT/partials/publish-and-index-website.adoc
+++ b/antora/components/comguide/modules/ROOT/partials/publish-and-index-website.adoc
@@ -58,25 +58,33 @@ git push origin asf-site
 
 [#update-the-algolia-search-index]
 == Update the Algolia search index
 
-Create a `algolia.env` file holding the `APP_ID` and the admin `API_KEY`, in the root of `causeway-site`:
+We use link:https://docsearch.algolia.com[Algolia] to build our search index.
+* If required, create an `algolia.env` file holding the `APP_ID` and the admin `API_KEY`, in the root of `causeway-site`:
++
 [source,ini]
 .algolia.env
 ----
 APPLICATION_ID=...
 API_KEY=...
 ----
-
++
 CAUTION: This file should not be checked into the repo, because the API_KEY allows the index to be modified or deleted.
 
-We use the Algolia-provided link:https://hub.docker.com/r/algolia/docsearch-scraper[docker image] for the crawler to perform the search (as per the link:as per https://docsearch.algolia.com/docs/run-your-own/#run-the-crawl-from-the-docker-image[docs]):
+* If required, update the `algolia-config.json` file.
++
+For example, update the `stop_urls` property with any paths that should not be crawled.
 
+* Use the Algolia-provided link:https://hub.docker.com/r/algolia/docsearch-scraper[docker image] to crawl the web pages and create the search index:
++
 [source,bash]
 ----
-cd content
+pushd content
 docker run -it --env-file=../algolia.env -e "CONFIG=$(cat ../algolia-config.json | jq -r tostring)" algolia/docsearch-scraper:v1.16.0
+popd
 ----
-
++
 This posts the index up to the link:https://algolia.com[Algolia] site.
++
+NOTE: Further documentation on the crawler can be found link:https://docsearch.algolia.com/docs/run-your-own/#run-the-crawl-from-the-docker-image[here]; additional config options for the crawler can be found link:https://www.algolia.com/doc/api-reference/crawler/[here].
 
-NOTE: Additional config options for the crawler can be found link:https://www.algolia.com/doc/api-reference/crawler/[here].

diff --git a/antora/components/docs/modules/ROOT/partials/module-nav/demos-and-tutorials.adoc b/antora/components/docs/modules/ROOT/partials/module-nav/demos-and-tutorials.adoc
index 73ebb2e1d6..1a6eab9849 100644
--- a/antora/components/docs/modules/ROOT/partials/module-nav/demos-and-tutorials.adoc
+++ b/antora/components/docs/modules/ROOT/partials/module-nav/demos-and-tutorials.adoc
@@ -5,8 +5,7 @@
 
 * Learning & Tutorials
 
 include::referenceapp:partial$module-nav.adoc[]
-
-** xref:tutorials:petclinic:about.adoc[Petclinic]
+include::tutorials:petclinic:partial$module-nav.adoc[]
