Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405240579

## dev/requirements.txt:

```diff
@@ -31,12 +31,12 @@ pandas-stubs<1.2.0.54
 mkdocs
 # Documentation (Python)
-pydata_sphinx_theme
+pydata_sphinx_theme==0.13
```

Review Comment:
   I followed [the version used in Pandas](https://github.com/pandas-dev/pandas/blob/main/requirements-dev.txt#L64), and after testing several versions I believe this one renders the documentation in the most optimal form.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405240335

## dev/requirements.txt:

```diff
@@ -31,12 +31,12 @@ pandas-stubs<1.2.0.54
 mkdocs
 # Documentation (Python)
-pydata_sphinx_theme
+pydata_sphinx_theme==0.13
```

Review Comment:
   I followed [the version used in Pandas](https://github.com/pandas-dev/pandas/blob/main/requirements-dev.txt#L64), and I actually tested several versions; this one renders the document in the most attractive form.
Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405239729

## dev/requirements.txt:

```diff
@@ -31,12 +31,12 @@ pandas-stubs<1.2.0.54
 mkdocs
 # Documentation (Python)
-pydata_sphinx_theme
+pydata_sphinx_theme==0.13
 ipython
 nbsphinx
 numpydoc
-jinja2<3.0.0
-sphinx<3.1.0
+jinja2
+sphinx==4.2.0
```

Review Comment:
   I found that other Sphinx versions do not generate the documentation properly for some reason. I tested as many combinations with Jinja2 and pydata_sphinx_theme as possible, and confirmed that Sphinx 4.2.0 currently renders the documents in the most optimal form. I will investigate further in the future to support the latest Sphinx if necessary.
Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405238199

## python/docs/source/reference/pyspark.pandas/frame.rst:

```diff
@@ -319,8 +320,8 @@ specific plotting methods of the form ``DataFrame.plot.<kind>``.
 .. autosummary::
    :toctree: api/
+   :template: autosummary/accessor_method.rst
-   DataFrame.plot
```

Review Comment:
   In newer versions of Sphinx, the build fails because `DataFrame.plot` and `Series.plot` are treated as duplicates of the accessor methods listed below them, such as `DataFrame.plot.area`, `DataFrame.plot.barh`, `DataFrame.plot.bar`, etc. This behavior actually seems reasonable, since `.plot` is simply an accessor keyword and not a function, so I believe we can simply leave it out of the documentation.
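For illustration, the autosummary block then lists only the concrete accessor methods, each rendered through the accessor template (the method names here are taken from the comment above; the full list in frame.rst is longer):

```rst
.. autosummary::
   :toctree: api/
   :template: autosummary/accessor_method.rst

   DataFrame.plot.area
   DataFrame.plot.bar
   DataFrame.plot.barh
```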
Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405236858

## python/docs/source/_templates/autosummary/class_with_docs.rst:

```diff
@@ -47,7 +47,9 @@
 .. autosummary::
 {% for item in attributes %}
-   ~{{ name }}.{{ item }}
+{% if not (item == 'uid') %}
```

Review Comment:
   We should manually exclude `uid` from the documentation because it is an internal property. We don't include it in our current documentation either, but for some reason newer Sphinx versions unexpectedly try to generate documentation for this internal property.
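To illustrate the comment above, a minimal sketch of how the full conditional might look in the template — only the `{% if %}` line appears in the diff, so the attribute line and the `{% endif %}`/`{% endfor %}` closers shown here are my assumption about the surrounding template:

```rst
.. autosummary::
{% for item in attributes %}
{% if not (item == 'uid') %}
   ~{{ name }}.{{ item }}
{% endif %}
{% endfor %}
```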
Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405236561

## python/docs/source/_templates/autosummary/class_with_docs.rst:

```diff
@@ -47,7 +47,9 @@
 .. autosummary::
 {% for item in attributes %}
-   ~{{ name }}.{{ item }}
+{% if not (item == 'uid') %}
```

Review Comment:
   We should manually exclude `uid` from the documentation because it is an internal property. We don't include it in our current documentation either, but for some reason newer Sphinx versions unexpectedly try to generate documentation for this internal property.
Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405234673

## python/docs/source/conf.py:

```diff
@@ -194,7 +194,11 @@
 # further. For a list of options available for each theme, see the
 # documentation.
 html_theme_options = {
-    "navbar_end": ["version-switcher"]
+    "navbar_end": ["version-switcher", "theme-switcher"],
+    "logo": {
+        "image_light": "_static/spark-logo-light.png",
+        "image_dark": "_static/spark-logo-dark.png",
```

Review Comment:
   FYI: The default mode for light/dark is `auto`, which chooses a theme based on the user's system settings, but we can manually specify either `dark` or `light` as the default.
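As a sketch of how that default could be pinned — assuming pydata-sphinx-theme's documented `html_context["default_mode"]` option, which is not part of this PR's diff:

```python
# python/docs/source/conf.py (sketch, not the full file)
html_theme_options = {
    "navbar_end": ["version-switcher", "theme-switcher"],
    "logo": {
        "image_light": "_static/spark-logo-light.png",
        "image_dark": "_static/spark-logo-dark.png",
    },
}

# "auto" follows the user's system preference; "light" or "dark"
# pins a default explicitly (pydata-sphinx-theme option).
html_context = {"default_mode": "auto"}
```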
Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405233405

## python/docs/source/reference/pyspark.pandas/frame.rst:

```diff
@@ -299,6 +299,7 @@ in Spark. These can be accessed by ``DataFrame.spark.``.
 .. autosummary::
    :toctree: api/
+   :template: autosummary/accessor_method.rst
```

Review Comment:
   For example, previously the rst file was generated as follows:

   ```rst
   pyspark.sql.SparkSession.builder.appName

   .. currentmodule:: pyspark.sql.SparkSession

   .. automethod:: builder.appName
   ```

   However, in newer Sphinx versions it is generated like this:

   ```rst
   pyspark.sql.SparkSession.builder.appName

   .. currentmodule:: pyspark.sql

   .. automethod:: SparkSession.builder.appName
   ```

   For functions exposed through internal classes or accessors like this, the module paths generated in the new way cause the Sphinx build to fail. That's why we should use the customized template to correct the module path. See also https://github.com/sphinx-doc/sphinx/issues/7551.
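For reference, pandas' `accessor_method.rst` template corrects the module/object split roughly like this — a sketch from memory of the pandas template, so the exact contents may differ:

```rst
{{ fullname }}
{{ underline }}

.. currentmodule:: {{ module.split('.')[:-1] | join('.') }}

.. automethod:: {{ [module.split('.')[-1], objname] | join('.') }}
```

The idea is to move the last component of `module` back onto the object side, so `automethod` resolves against the class rather than a non-importable dotted module path.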
Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405232095

## python/docs/source/reference/pyspark.pandas/frame.rst:

```diff
@@ -299,6 +299,7 @@ in Spark. These can be accessed by ``DataFrame.spark.``.
 .. autosummary::
    :toctree: api/
+   :template: autosummary/accessor_method.rst
```

Review Comment:
   For example, previously the rst file was generated as follows:

   ```rst
   pyspark.sql.SparkSession.builder.appName

   .. currentmodule:: pyspark.sql.SparkSession

   .. automethod:: builder.appName
   ```

   However, in newer Sphinx versions it is generated like this:

   ```rst
   pyspark.sql.SparkSession.builder.appName

   .. currentmodule:: pyspark.sql

   .. automethod:: SparkSession.builder.appName
   ```

   For functions exposed through internal classes or accessors like this, the module paths generated in the new way cause the Sphinx build to fail. Therefore, we should use a customized template, as Pandas does. See also https://github.com/sphinx-doc/sphinx/issues/7551.
Re: [PR] [SPARK-46103][PYTHON][BUILD][DOCS] Enhancing PySpark documentation [spark]
itholic commented on code in PR #44012
URL: https://github.com/apache/spark/pull/44012#discussion_r1405231062

## python/docs/source/reference/pyspark.pandas/frame.rst:

```diff
@@ -299,6 +299,7 @@ in Spark. These can be accessed by ``DataFrame.spark.``.
 .. autosummary::
    :toctree: api/
+   :template: autosummary/accessor_method.rst
```

Review Comment:
   With the new version of Sphinx, the rules for generating package names in the `rst` files that are automatically created during the documentation build have changed, so we must manually adjust the package path using these templates. This behavior is [used in the same way in Pandas](https://github.com/pandas-dev/pandas/tree/main/doc/_templates/autosummary), which I referred to.