[ https://issues.apache.org/jira/browse/SPARK-20456?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Patterson updated SPARK-20456:
--------------------------------------
    Description: 
Document sql.functions.py:

1. Add examples for the common string functions (upper, lower, and reverse); a short sketch of the intended style follows this list.
2. Rename columns in datetime examples to be more informative (e.g. from 'd' to 'date').
3. Add examples for unix_timestamp, from_unixtime, rand, randn, collect_list, collect_set, and lit.
4. Add a note to all trigonometry functions that units are radians.
5. Add links between related functions (e.g. add a link to radians from toRadians).
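A minimal sketch of the kind of doctest-style examples intended here, assuming a local SparkSession; the sample data, column names, and app name are illustrative assumptions, not the actual patch:

{code:python}
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative local session; the builder options are assumptions for a runnable sketch.
spark = (SparkSession.builder
         .master("local[1]")
         .appName("functions-doc-sketch")
         .getOrCreate())

# 1. Common string functions (upper, lower, reverse).
words = spark.createDataFrame([("Spark",)], ["word"])
words.select(F.upper("word"), F.lower("word"), F.reverse("word")).show()

# 2./3. Datetime examples with an informative column name ('date' rather than 'd').
dates = spark.createDataFrame([("2017-04-25 10:30:00",)], ["date"])
dates.select(F.unix_timestamp("date").alias("epoch")).show()    # seconds since the epoch, session time zone
dates.select(F.from_unixtime(F.unix_timestamp("date")).alias("timestamp")).show()

# 4./5. Trigonometry functions take radians; toRadians converts from degrees.
angles = spark.createDataFrame([(180.0,)], ["degrees"])
angles.select(F.toRadians("degrees").alias("radians"),
              F.sin(F.toRadians("degrees")).alias("sine")).show()

spark.stop()
{code}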

  was:
Document `sql.functions.py`:

1. Add examples for the common aggregate functions (`min`, `max`, `mean`, `count`, `collect_set`, `collect_list`, `stddev`, `variance`); see the sketch after this list.
2. Rename columns in datetime examples.
3. Add examples for `unix_timestamp` and `from_unixtime`.
4. Add note to all trigonometry functions that units are radians.
5. Add example for `lit`.
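A minimal sketch of the aggregate and `lit` examples mentioned above, again assuming a local SparkSession with made-up sample data rather than the merged documentation:

{code:python}
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative local session and sample data; these names are assumptions for a runnable sketch.
spark = (SparkSession.builder
         .master("local[1]")
         .appName("aggregate-doc-sketch")
         .getOrCreate())

scores = spark.createDataFrame([("a", 1), ("a", 2), ("b", 3)], ["key", "value"])

# Common aggregate functions applied per group.
scores.groupBy("key").agg(
    F.min("value"), F.max("value"), F.mean("value"), F.count("value"),
    F.collect_list("value"), F.collect_set("value"),
    F.stddev("value"), F.variance("value"),
).show(truncate=False)

# lit wraps a Python literal as a Column.
scores.select("key", F.lit(5).alias("five")).show()

spark.stop()
{code}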


> Add examples for functions collection for pyspark
> -------------------------------------------------
>
>                 Key: SPARK-20456
>                 URL: https://issues.apache.org/jira/browse/SPARK-20456
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation, PySpark
>    Affects Versions: 2.1.0
>            Reporter: Michael Patterson
>            Priority: Minor
>
> Document sql.functions.py:
> 1. Add examples for the common string functions (upper, lower, and reverse)
> 2. Rename columns in datetime examples to be more informative (e.g. from 'd' to 'date')
> 3. Add examples for unix_timestamp, from_unixtime, rand, randn, collect_list, collect_set, and lit
> 4. Add note to all trigonometry functions that units are radians.
> 5. Add links between related functions (e.g. add a link to radians from toRadians)


