On 18/02/2016 11:47, Mich Talebzadeh wrote: 

> It is available in Hive as well 
> 
> You can of course write your own standard deviation function 
> 
> For example stddev for column amount_sold can be expressed as 
> 
> SQRT( (SUM(POWER(amount_sold,2)) - COUNT(1) * POWER(AVG(amount_sold),2)) / (COUNT(1) - 1) )
>  
> 
> COUNT(1) is the total number of records in the sample/table. 
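A quick sanity check of the identity above, in plain Python rather than SQL (the column name and sample values are made up for illustration): the expression sqrt((sum(x^2) - n * mean^2) / (n - 1)) should agree with the ordinary sample standard deviation.

```python
import math
import statistics

# Hypothetical sample values standing in for an amount_sold column
amount_sold = [10.0, 12.5, 9.75, 14.0, 11.25]

n = len(amount_sold)                       # plays the role of COUNT(1)
sum_sq = sum(x * x for x in amount_sold)   # SUM(POWER(amount_sold, 2))
mean = sum(amount_sold) / n                # AVG(amount_sold)

# The SQL identity from the message above
manual = math.sqrt((sum_sq - n * mean * mean) / (n - 1))

# statistics.stdev computes the sample standard deviation directly
assert math.isclose(manual, statistics.stdev(amount_sold))
```

The identity works because sum((x - mean)^2) expands algebraically to sum(x^2) - n * mean^2, so a single pass over sums and counts suffices.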
> 
> HTH 
> 
> On 18/02/2016 11:03, rok wrote: 
> There has been a stddev function since Spark 1.6: 
> http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.functions.stddev
> 
> If you are using spark < 1.6 you can write your own more or less easily. 
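To make "write your own" concrete, here is a minimal plain-Python sketch (hypothetical product_id/amount_sold data) of what stddev over a window partition computes: the sample standard deviation within each partition key, which is what stddev(...).over(Window.partitionBy(...)) would attach to each row in Spark 1.6+.

```python
import math
from collections import defaultdict

# Hypothetical (product_id, amount_sold) rows
rows = [
    ("a", 10.0), ("a", 12.0), ("a", 11.0),
    ("b", 5.0), ("b", 9.0),
]

# Group rows by the partition key, as Window.partitionBy would
groups = defaultdict(list)
for product_id, amount_sold in rows:
    groups[product_id].append(amount_sold)

def sample_stddev(xs):
    n = len(xs)
    mean = sum(xs) / n
    # Same identity as the SQL expression quoted earlier in the thread
    return math.sqrt((sum(x * x for x in xs) - n * mean * mean) / (n - 1))

stddev_by_product = {k: sample_stddev(v) for k, v in groups.items()}
```

In Spark itself the per-partition sums and counts would be computed distributedly; this sketch only illustrates the per-group arithmetic.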
> 
> On Wed, Feb 17, 2016 at 5:06 PM, mayx [via Apache Spark User List] <[hidden 
> email]> wrote:
> I'd like to use standard deviation over window partitions on a Spark 
> DataFrame, but it didn't work. Is it not supported? It looks like many 
> aggregation functions, such as mean and min, are supported. How can I file 
> a feature request for this? 
> 

-- 

Dr Mich Talebzadeh

LinkedIn
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

NOTE: The information in this email is proprietary and confidential.
This message is for the designated recipient only, if you are not the
intended recipient, you should destroy it immediately. Any information
in this message shall not be understood as given or endorsed by Cloud
Technology Partners Ltd, its subsidiaries or their employees, unless
expressly so stated. It is the responsibility of the recipient to ensure
that this email is virus free, therefore neither Cloud Technology
partners Ltd, its subsidiaries nor their employees accept any
responsibility.


