Andreas Maier created SPARK-22369:
-------------------------------------

             Summary: PySpark: Document methods of spark.catalog interface
                 Key: SPARK-22369
                 URL: https://issues.apache.org/jira/browse/SPARK-22369
             Project: Spark
          Issue Type: Documentation
          Components: PySpark
    Affects Versions: 2.2.0
            Reporter: Andreas Maier


The following methods from the {{spark.catalog}} interface are not documented.

{code:python}
$ pyspark
>>> dir(spark.catalog)
['__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', 
'__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', 
'__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', 
'__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', 
'__str__', '__subclasshook__', '__weakref__', '_jcatalog', '_jsparkSession', 
'_reset', '_sparkSession', 'cacheTable', 'clearCache', 'createExternalTable', 
'createTable', 'currentDatabase', 'dropGlobalTempView', 'dropTempView', 
'isCached', 'listColumns', 'listDatabases', 'listFunctions', 'listTables', 
'recoverPartitions', 'refreshByPath', 'refreshTable', 'registerFunction', 
'setCurrentDatabase', 'uncacheTable']
{code}
As a user, I would like these methods to be documented on 
http://spark.apache.org/docs/latest/api/python/pyspark.sql.html . The documentation 
for the old context-level methods (e.g. {{pyspark.sql.SQLContext.cacheTable()}} vs. 
{{pyspark.sql.SparkSession.catalog.cacheTable()}}, or 
{{pyspark.sql.HiveContext.refreshTable()}} vs. 
{{pyspark.sql.SparkSession.catalog.refreshTable()}}) should point to the new 
catalog methods.
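
For illustration, here is a minimal sketch of how the catalog equivalents of the 
old context-level calls are used (table name and app name are placeholders, not 
from the issue):

{code:python}
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("catalog-demo").getOrCreate()

# Create a temporary view so the catalog calls below have something to act on.
spark.range(10).createOrReplaceTempView("example_table")

# Catalog API replacing pyspark.sql.SQLContext.cacheTable()
spark.catalog.cacheTable("example_table")
print(spark.catalog.isCached("example_table"))          # True

# Inspect the catalog
print(spark.catalog.currentDatabase())
print([t.name for t in spark.catalog.listTables()])

# Catalog API replacing pyspark.sql.HiveContext.refreshTable()
spark.catalog.refreshTable("example_table")

spark.catalog.uncacheTable("example_table")
{code}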


