[ https://issues.apache.org/jira/browse/SPARK-39421?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-39421.
----------------------------------
    Fix Version/s: 3.3.0
                   3.2.2
                   3.4.0
         Assignee: Hyukjin Kwon
       Resolution: Fixed

Fixed in https://github.com/apache/spark/pull/36813

> Sphinx build fails with "node class 'meta' is already registered, its visitors will be overridden"
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-39421
>                 URL: https://issues.apache.org/jira/browse/SPARK-39421
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation
>    Affects Versions: 3.0.3, 3.1.2, 3.2.1, 3.3.0, 3.4.0
>            Reporter: Hyukjin Kwon
>            Assignee: Hyukjin Kwon
>            Priority: Major
>             Fix For: 3.3.0, 3.2.2, 3.4.0
>
> {code}
> Moving to python/docs directory and building sphinx.
> Running Sphinx v3.0.4
> WARNING:root:'PYARROW_IGNORE_TIMEZONE' environment variable was not set. It is required to set this environment variable to '1' in both driver and executor sides if you use pyarrow>=2.0.0. pandas-on-Spark will set it for you but it does not work if there is a Spark context already launched.
> /__w/spark/spark/python/pyspark/pandas/supported_api_gen.py:101: UserWarning: Warning: Latest version of pandas(>=1.4.0) is required to generate the documentation; however, your version was 1.3.5
>   warnings.warn(
> Warning, treated as error:
> node class 'meta' is already registered, its visitors will be overridden
> make: *** [Makefile:35: html] Error 2
> ------------------------------------------------
>       Jekyll 4.2.1   Please append `--trace` to the `build` command
>                      for any additional information or backtrace.
> ------------------------------------------------
> {code}
> The Sphinx build apparently fails with the latest docutils (see also https://issues.apache.org/jira/browse/FLINK-24662). We should pin the docutils version.
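For reference, the pin could look like the sketch below. This is a minimal sketch, not the exact change from the linked PR: the file path and the `<0.18` upper bound are assumptions, based on docutils 0.18 itself registering the 'meta' node class that Sphinx 3.x also registers, which produces the "already registered" error above.

{code}
# dev/requirements.txt (illustrative; the actual file and version bound
# used in the linked PR may differ).
# docutils 0.18 registers the 'meta' node class itself, which collides
# with Sphinx 3.x registering the same node and turns into a build error
# when warnings are treated as errors.
sphinx==3.0.4
docutils<0.18
{code}

An equivalent one-off workaround in a CI script would be running pip install 'docutils<0.18' before invoking the Sphinx build.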