Hi, I am using Spark 1.6.0.
I want to find the standard deviation of a list of columns that is determined dynamically:

    val stdDevOnAll = columnNames.map { x => stddev(x) }
    causalDf.groupBy(causalDf("A"), causalDf("B"), causalDf("C"))
      .agg(stdDevOnAll: _*) // error line

But it gives me the compilation error below:

    overloaded method value agg with alternatives:
      (expr: org.apache.spark.sql.Column, exprs: org.apache.spark.sql.Column*)org.apache.spark.sql.DataFrame <and>
      (exprs: java.util.Map[String,String])org.apache.spark.sql.DataFrame <and>
      (exprs: scala.collection.immutable.Map[String,String])org.apache.spark.sql.DataFrame <and>
      (aggExpr: (String, String), aggExprs: (String, String)*)org.apache.spark.sql.DataFrame
    cannot be applied to (org.apache.spark.sql.Column)

Naveen
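[Editor's note] The error comes from Scala overload resolution rather than from `stddev` itself: the `Column`-based overload of `agg` is `(expr: Column, exprs: Column*)`, i.e. one required first argument plus varargs, so a whole `Seq[Column]` cannot be splatted into it in one go. Here is a minimal sketch of the pattern with a plain varargs method (mock names, no Spark dependency):

```scala
// Mock of agg's Column overload shape: one required argument plus varargs.
def agg(expr: String, exprs: String*): Seq[String] = expr +: exprs

val cols = Seq("stddev(a)", "stddev(b)", "stddev(c)")

// agg(cols: _*)  // would not compile: there is no overload taking a single Seq
val ok = agg(cols.head, cols.tail: _*)  // pass head, then splat the tail
```

Applied to the snippet above (assuming `columnNames` is non-empty), the equivalent call would be `causalDf.groupBy(causalDf("A"), causalDf("B"), causalDf("C")).agg(stdDevOnAll.head, stdDevOnAll.tail: _*)`.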