[ https://issues.apache.org/jira/browse/SPARK-17458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Herman van Hovell resolved SPARK-17458.
---------------------------------------
    Resolution: Fixed
 Fix Version/s: 2.1.0

I cannot find the JIRA username of the assignee (Andrew Ray). It would be great if someone could provide this.

> Alias specified for aggregates in a pivot are not honored
> ---------------------------------------------------------
>
>                 Key: SPARK-17458
>                 URL: https://issues.apache.org/jira/browse/SPARK-17458
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Ravi Somepalli
>             Fix For: 2.1.0
>
>
> When using pivot with multiple aggregations, we need to alias the aggregates to avoid special characters in the output column names, but the aliases are not honored:
> {code}
> df.groupBy("C").pivot("A").agg(avg("D").as("COLD"), max("B").as("COLB")).show
> {code}
> ||    C||bar_avg(`D`) AS `COLD`||bar_max(`B`) AS `COLB`||foo_avg(`D`) AS `COLD`||foo_max(`B`) AS `COLB`||
> |small|                   5.5|                   two|    2.3333333333333335|                   two|
> |large|                   5.5|                   two|                   2.0|                   one|
>
> Expected output:
> ||    C||bar_COLD||bar_COLB||foo_COLD||foo_COLB||
> |small|     5.5|     two|2.3333333333333335|     two|
> |large|     5.5|     two|               2.0|     one|
>
> One way to fix this issue is to change
> sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala,
> altering the outputName method inside
> {code}
> object ResolvePivot extends Rule[LogicalPlan] {
>   def apply(plan: LogicalPlan): LogicalPlan = plan transform {
> {code}
> from the 2.0.0 version
> {code}
> def outputName(value: Literal, aggregate: Expression): String = {
>   if (singleAgg) value.toString else value + "_" + aggregate.sql
> }
> {code}
> to
> {code}
> def outputName(value: Literal, aggregate: Expression): String = {
>   val suffix = aggregate match {
>     case n: NamedExpression => n.name
>     case _ => aggregate.sql
>   }
>   if (singleAgg) value.toString else value + "_" + suffix
> }
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
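The proposed change boils down to a simple naming rule: when an aggregate expression carries an alias (a NamedExpression in Catalyst), use that alias as the per-pivot-value column suffix instead of the aggregate's full SQL text. A minimal Python sketch of that rule, assuming illustrative function and parameter names (this is not Spark's API, just the logic of the patch):

```python
def output_name(pivot_value, agg_alias, agg_sql, single_agg=False):
    """Sketch of the proposed ResolvePivot.outputName logic: prefer the
    aggregate's alias over its raw SQL text when building column names.
    Names here are illustrative, not part of Spark's API."""
    # With an alias present, use it; otherwise fall back to the SQL text,
    # which is the Spark 2.0.0 behavior.
    suffix = agg_alias if agg_alias is not None else agg_sql
    # A single aggregation keeps just the pivot value as the column name.
    return str(pivot_value) if single_agg else f"{pivot_value}_{suffix}"

# Aliased aggregate: the alias wins, yielding the expected "bar_COLD"
# instead of the leaked SQL "bar_avg(`D`) AS `COLD`".
assert output_name("bar", "COLD", "avg(`D`) AS `COLD`") == "bar_COLD"

# Unaliased aggregate: behavior is unchanged from 2.0.0.
assert output_name("foo", None, "max(`B`)") == "foo_max(`B`)"
```

This keeps backward compatibility for unaliased aggregates while making explicit `.as(...)` aliases take effect, which is why only the suffix computation changes in the patch.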