DavidToneian commented on a change in pull request #31401:
URL: https://github.com/apache/spark/pull/31401#discussion_r567408146



##########
File path: python/pyspark/sql/avro/functions.py
##########
@@ -37,7 +37,7 @@ def from_avro(data, jsonFormatSchema, options=None):
 
     Parameters
     ----------
-    data : :class:`Column` or str
+    data : :class:`pyspark.sql.Column` or str

Review comment:
       In both cases you referenced, the docstring is part of a class in the 
same module as `Column`, i.e. `pyspark.sql`, which is why the name lookup 
works. Here, though, we're in child modules, `pyspark.sql.avro.functions` and 
`pyspark.sql.functions`.
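
   For illustration, here is a minimal sketch (hypothetical class name and placeholder docstrings, not the actual sources) of why the short form resolves in one location but not the other:

```python
# Hypothetical docstring documented under pyspark.sql: Sphinx first tries
# "<current module>.Column", i.e. pyspark.sql.Column, so the short reference
# resolves and links correctly.
class SomethingInPysparkSql:
    def method(self, col):
        """
        Parameters
        ----------
        col : :class:`Column` or str
        """

# The same short form inside pyspark.sql.avro.functions is looked up as
# pyspark.sql.avro.functions.Column, which is not documented, so the
# cross-reference does not resolve into a link.
def from_avro(data, jsonFormatSchema, options=None):
    """
    Parameters
    ----------
    data : :class:`Column` or str
    """
```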
   
   I agree that having to change references to `Column` is a nuisance that I 
too would like to avoid, but last time I checked, I found no way to make Sphinx 
resolve and render these references as intended by setting some configuration option.
   
   There are two variations, though, that could be used (both are sketched at the end of this comment):
   
   - One could write ``:class:`.Column` `` (note the dot before `Column`) to make 
Sphinx search all modules for a matching `Column`. This still requires changes in 
numerous places in the docstrings, and it can be ambiguous if the referenced name 
is not unique.
   - Alternatively, one can use the `~` prefix, as in ``:class:`~pyspark.sql.Column` ``, 
to make the rendered text read `Column` rather than `pyspark.sql.Column` while 
still linking to the correct place. Which output is preferable is a matter of 
taste, I guess, and I'm indifferent.
   
   If someone knows a better way to fix this, I'd be very happy to hear about 
it!
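
   For concreteness, a sketch of how the two variants would look in the docstring from the diff above (descriptions are placeholders, not the real docstring text):

```python
def from_avro(data, jsonFormatSchema, options=None):
    """
    Parameters
    ----------
    data : :class:`~pyspark.sql.Column` or str
        Placeholder description. The ``~`` prefix keeps the link pointing at
        pyspark.sql.Column while rendering only "Column".
    """
    # Alternative spelling for the same line, using the leading-dot lookup:
    #
    #     data : :class:`.Column` or str
    #
    # This also renders as "Column", but relies on the name "Column" being
    # unique across all documented modules.
```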



