[jira] [Updated] (SPARK-8007) Support resolving virtual columns in DataFrames
[ https://issues.apache.org/jira/browse/SPARK-8007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin updated SPARK-8007:
-------------------------------
    Assignee: Joseph Batchik

> Support resolving virtual columns in DataFrames
> -----------------------------------------------
>
>                 Key: SPARK-8007
>                 URL: https://issues.apache.org/jira/browse/SPARK-8007
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Reynold Xin
>            Assignee: Joseph Batchik
>
> Create the infrastructure so we can resolve df("SPARK__PARTITION__ID") to
> SparkPartitionID expression.
> A cool use case is to understand physical data skew:
> {code}
> df.groupBy("SPARK__PARTITION__ID").count()
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-8007) Support resolving virtual columns in DataFrames
[ https://issues.apache.org/jira/browse/SPARK-8007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin updated SPARK-8007:
-------------------------------
    Description:
Create the infrastructure so we can resolve df("SPARK__PARTITION__ID") to SparkPartitionID expression.

A cool use case is to understand physical data skew:
{code}
df.groupBy("SPARK__PARTITION__ID").count()
{code}

  was: Create the infrastructure so we can resolve df("SPARK_PARTITION__ID") to SparkPartitionID expression.

> Support resolving virtual columns in DataFrames
> -----------------------------------------------
>
>                 Key: SPARK-8007
>                 URL: https://issues.apache.org/jira/browse/SPARK-8007
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Reynold Xin
>
> Create the infrastructure so we can resolve df("SPARK__PARTITION__ID") to
> SparkPartitionID expression.
> A cool use case is to understand physical data skew:
> {code}
> df.groupBy("SPARK__PARTITION__ID").count()
> {code}
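To see why grouping by a partition-id virtual column exposes physical data skew, here is a minimal plain-Python sketch (not Spark, and not the implementation this ticket proposes): rows are hash-partitioned by key, a hot key funnels most rows into one partition, and counting rows per partition id makes the imbalance obvious. The partitioning scheme, row data, and `NUM_PARTITIONS` are all illustrative assumptions.

```python
from collections import Counter
from zlib import crc32

# Illustrative setup: 100 (key, value) rows hash-partitioned by key into
# 4 partitions. One "hot" key accounts for 97 rows, so whichever
# partition it hashes to will hold the vast majority of the data.
NUM_PARTITIONS = 4
rows = [("hot", i) for i in range(97)] + [("a", 1), ("b", 2), ("c", 3)]

def partition_id(row):
    # Deterministic hash partitioner (crc32 avoids Python's per-run
    # string-hash randomization); stands in for Spark's physical layout.
    return crc32(row[0].encode()) % NUM_PARTITIONS

# Rough analogue of df.groupBy("SPARK__PARTITION__ID").count():
# tally how many rows each partition holds.
counts = Counter(partition_id(r) for r in rows)
print(sorted(counts.items()))
```

In real Spark the same diagnostic is available today via the `spark_partition_id()` SQL function; this ticket is about letting the column resolver surface it as a virtual column name instead.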