Michael – Thanks for the response – that's right, I hadn't noticed that the Spark Shell instantiates sqlContext as a HiveContext rather than a plain Spark SQLContext… I've seen the PR to add STDDEV to DataFrames. Can I expect this to land in Spark SQL in Spark 1.4, or is it still uncertain? It would be really helpful to know, so I can decide whether I have to change existing code to use HiveContext instead of SQLContext (which would be undesirable)… Thanks!
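For context, a minimal sketch of what that change would look like in a standalone app on the Spark 1.3-era API (assumes the app is built against spark-hive; the object, table, and data values below are illustrative, not from this thread):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    // Hypothetical standalone app: swapping SQLContext for HiveContext so that
    // Hive's built-in aggregate functions (e.g. STDDEV) resolve before Spark 1.4.
    object StddevApp {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("stddev-example"))

        // HiveContext extends SQLContext, so existing SQLContext code keeps working.
        val sqlContext = new HiveContext(sc)
        import sqlContext.implicits._

        // Illustrative data; "col1" mirrors the column name from the original question.
        val df = sc.parallelize(Seq(1.0, 2.5, 3.5, 4.0).map(Tuple1.apply)).toDF("col1")
        df.registerTempTable("measurements")

        // With a HiveContext this resolves to Hive's STDDEV UDAF; with a plain
        // SQLContext on Spark < 1.4 it fails with "key not found: STDDEV".
        sqlContext.sql("SELECT STDDEV(col1) FROM measurements").show()

        sc.stop()
      }
    }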
From: Michael Armbrust [mailto:mich...@databricks.com]
Sent: Saturday, May 09, 2015 11:32 AM
To: Oleg Shirokikh
Cc: user
Subject: Re: Spark SQL: STDDEV working in Spark Shell but not in a standalone app

Are you perhaps using a HiveContext in the shell but a SQLContext in your app? I don't think we natively implement stddev until 1.4.0.

On Fri, May 8, 2015 at 4:44 PM, barmaley <o...@solver.com> wrote:

Given a registered table from a data frame, I'm able to execute queries like sqlContext.sql("SELECT STDDEV(col1) FROM table") from the Spark Shell just fine. However, when I run exactly the same code in a standalone app on a cluster, it throws an exception: "java.util.NoSuchElementException: key not found: STDDEV"... Is STDDEV among the default functions in Spark SQL? I'd appreciate it if you could comment on what's going on with the above. Thanks
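For anyone who would rather not switch to HiveContext on Spark < 1.4, one workaround is to compute the statistic through the core RDD API instead of SQL. A minimal sketch (the helper name is illustrative; runnable as a snippet in spark-shell or inside an app object):

    import org.apache.spark.sql.{DataFrame, Row}

    // Illustrative helper: standard deviation of a numeric column without Hive,
    // using DoubleRDDFunctions from the core RDD API (available well before 1.4).
    def stddevOf(df: DataFrame, column: String): Double =
      df.select(column)
        .rdd
        .map { case Row(v: Number) => v.doubleValue() }
        .stdev() // population std dev; use sampleStdev() for the sample version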