zhengruifeng opened a new pull request, #52008:
URL: https://github.com/apache/spark/pull/52008

   revert https://github.com/apache/spark/pull/52004
   
   since it breaks the doc generation
   
   ```
   Warning, treated as error:
   /__w/spark/spark/python/docs/source/tutorial/sql/arrow_pandas.rst:385:Error with CSV data in "csv-table" directive:
   ',' expected after '"'

   .. csv-table::
      :header: "SQL Type", "None", "True", "1", "a", "date", "datetime", "1.0", "array", "[1]", "(1,)", "bytearray", "Decimal", "dict"
      :widths: 12, 6, 6, 6, 6, 10, 12, 6, 8, 6, 6, 10, 8, 8

      "boolean", "None", "True", "True", "X", "X", "X", "True", "X", "X", "X", "X", "X", "X"
      "tinyint", "None", "1", "1", "X", "X", "X", "1", "X", "X", "X", "X", "1", "X"
      "smallint", "None", "1", "1", "X", "X", "X", "1", "X", "X", "X", "X", "1", "X"
      "int", "None", "1", "1", "X", "0", "X", "1", "X", "X", "X", "X", "1", "X"
      "bigint", "None", "1", "1", "X", "X", "0", "1", "X", "X", "X", "X", "1", "X"
      "string", "None", "'True'", "'1'", "'a'", "'1970-01-01'", "'1970-01-01 00:00...'", "'1.0'", "\"array('i', [1])\"", "'[1]'", "'(1,)'", "\"bytearray(b'ABC')\"", "'1'", "\"{'a': 1}\""
      "date", "None", "X", "X", "X", "datetime.date(197...)", "datetime.date(197...)", "X", "X", "X", "X", "X", "datetime.date(197...)", "X"
      "timestamp", "None", "X", "datetime.datetime...", "X", "X", "datetime.datetime...", "X", "X", "X", "X", "X", "datetime.datetime...", "X"
      "float", "None", "1.0", "1.0", "X", "X", "X", "1.0", "X", "X", "X", "X", "1.0", "X"
      "double", "None", "1.0", "1.0", "X", "X", "X", "1.0", "X", "X", "X", "X", "1.0", "X"
      "binary", "None", "bytearray(b'\\x00')", "bytearray(b'\\x00')", "X", "X", "X", "X", "bytearray(b'\\x01\\...", "bytearray(b'\\x01')", "bytearray(b'\\x01')", "bytearray(b'ABC')", "X", "X"
      "decimal(10,0)", "None", "X", "X", "X", "X", "X", "Decimal('1')", "X", "X", "X", "X", "Decimal('1')", "X"
   make: *** [Makefile:36: html] Error 2
                       ------------------------------------------------
         Jekyll 4.4.1   Please append `--trace` to the `build` command
                        for any additional information or backtrace.
                       ------------------------------------------------
   /__w/spark/spark/docs/_plugins/build_api_docs.rb:162:in `build_python_docs': Python doc generation failed (RuntimeError)
        from /__w/spark/spark/docs/_plugins/build_api_docs.rb:227:in `<top (required)>'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/external.rb:57:in `require'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/external.rb:57:in `block in require_with_graceful_fail'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/external.rb:55:in `each'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/external.rb:55:in `require_with_graceful_fail'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/plugin_manager.rb:96:in `block in require_plugin_files'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/plugin_manager.rb:94:in `each'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/plugin_manager.rb:94:in `require_plugin_files'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/plugin_manager.rb:21:in `conscientious_require'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/site.rb:131:in `setup'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/site.rb:36:in `initialize'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/commands/build.rb:30:in `new'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/commands/build.rb:30:in `process'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/command.rb:91:in `block in process_with_graceful_fail'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/command.rb:91:in `each'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/command.rb:91:in `process_with_graceful_fail'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/lib/jekyll/commands/build.rb:18:in `block (2 levels) in init_with_program'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/mercenary-0.4.0/lib/mercenary/command.rb:221:in `block in execute'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/mercenary-0.4.0/lib/mercenary/command.rb:221:in `each'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/mercenary-0.4.0/lib/mercenary/command.rb:221:in `execute'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/mercenary-0.4.0/lib/mercenary/program.rb:44:in `go'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/mercenary-0.4.0/lib/mercenary.rb:21:in `program'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/gems/jekyll-4.4.1/exe/jekyll:15:in `<top (required)>'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/bin/jekyll:25:in `load'
        from /__w/spark/spark/docs/.local_ruby_bundle/ruby/3.0.0/bin/jekyll:25:in `<main>'
   Error: Process completed with exit code 1.
   ```
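   For context, the `csv-table` directive parses its rows with Python's `csv` module, where an embedded quote inside a quoted field must be doubled (`""`); the failing row instead uses backslash-escaped quotes like `"\"array('i', [1])\""`, which is not valid CSV. Below is a minimal sketch reproducing the parse error; the strict dialect is an assumption approximating docutils' behavior (consistent with the `',' expected after '"'` message in the log, which is the `csv` module's strict-mode error):

   ```python
   import csv
   import io

   # Strict CSV dialect approximating what docutils' csv-table directive uses
   # (assumption: strict=True, matching the error message seen in the log).
   class StrictDialect(csv.Dialect):
       delimiter = ","
       quotechar = '"'
       doublequote = True        # embedded quotes are written as ""
       skipinitialspace = True
       lineterminator = "\n"
       quoting = csv.QUOTE_MINIMAL
       strict = True             # raise on malformed quoting instead of guessing

   # Backslash-escaped quotes, as in the failing row -- not valid CSV:
   bad_row = '"string", "\\"array(\'i\', [1])\\""'
   try:
       list(csv.reader(io.StringIO(bad_row), dialect=StrictDialect))
   except csv.Error as exc:
       print("parse error:", exc)

   # Doubling the embedded quotes instead parses cleanly; the second field
   # comes back as: "array('i', [1])"  (with the literal quotes preserved)
   good_row = '"string", """array(\'i\', [1])"""'
   print(list(csv.reader(io.StringIO(good_row), dialect=StrictDialect)))
   ```

   So a fix in #52004, rather than a revert, would be to emit doubled quotes (or drop the quotes) in the generated table cells.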


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

