HyukjinKwon opened a new pull request, #36813:
URL: https://github.com/apache/spark/pull/36813

   ### What changes were proposed in this pull request?
   
   This PR fixes the Sphinx build failure below (see https://github.com/singhpk234/spark/runs/6799026458?check_suite_focus=true):
   
   ```
   Moving to python/docs directory and building sphinx.
   Running Sphinx v3.0.4
   WARNING:root:'PYARROW_IGNORE_TIMEZONE' environment variable was not set. It is required to set this environment variable to '1' in both driver and executor sides if you use pyarrow>=2.0.0. pandas-on-Spark will set it for you but it does not work if there is a Spark context already launched.
   /__w/spark/spark/python/pyspark/pandas/supported_api_gen.py:101: UserWarning: Warning: Latest version of pandas(>=1.4.0) is required to generate the documentation; however, your version was 1.3.5
     warnings.warn(
   Warning, treated as error:
   node class 'meta' is already registered, its visitors will be overridden
   make: *** [Makefile:35: html] Error 2
                       ------------------------------------------------
         Jekyll 4.2.1   Please append `--trace` to the `build` command 
                        for any additional information or backtrace. 
                       ------------------------------------------------
   ```
   
   The Sphinx build apparently fails with the latest docutils (see also https://issues.apache.org/jira/browse/FLINK-24662), so we should pin the docutils version.
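
   A minimal sketch of what the pin might look like, assuming the docs build installs its Python dependencies with pip; the exact install step and the upper bound below are assumptions, based on FLINK-24662 working around the same error by keeping docutils below 0.18:
   
   ```bash
   # Hypothetical pin for the Sphinx docs dependencies; the pip step and the
   # '<0.18.0' bound are assumptions based on the FLINK-24662 workaround.
   pip install 'docutils<0.18.0'
   ```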
   
   ### Why are the changes needed?
   
   To recover the CI.
   
   ### Does this PR introduce _any_ user-facing change?
   
   No, dev-only.
   
   ### How was this patch tested?
   
   CI in this PR should test it out.

