[ https://issues.apache.org/jira/browse/SPARK-16041?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-16041:
------------------------------------

    Assignee: (was: Apache Spark)

> Disallow Duplicate Columns in `partitionBy`, `bucketBy` and `sortBy` in
> DataFrameWriter
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-16041
>                 URL: https://issues.apache.org/jira/browse/SPARK-16041
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Xiao Li
>
> Duplicate columns should not be allowed in `partitionBy`, `bucketBy`, or
> `sortBy` in DataFrameWriter. Duplicate columns can cause unpredictable
> results, for example, resolution failures.
> We should detect the duplicates and throw exceptions with clear messages.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
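The check the ticket asks for could be sketched as below. This is an illustrative standalone sketch, not Spark's actual internal implementation; the object and method names (`DuplicateColumnCheck`, `checkDuplicates`) are hypothetical, and the case-insensitive comparison assumes Spark's default case-insensitive column resolution.

```scala
// Hypothetical sketch of the duplicate-column validation requested in
// SPARK-16041. Not Spark's real code: names here are illustrative.
object DuplicateColumnCheck {

  /** Throws IllegalArgumentException if `columns` contains duplicates.
    * Comparison is case-insensitive, mirroring Spark's default
    * (case-insensitive) column resolution. */
  def checkDuplicates(operation: String, columns: Seq[String]): Unit = {
    // Group names by their lower-cased form; any group with more than
    // one member is a duplicate under case-insensitive resolution.
    val dups = columns.groupBy(_.toLowerCase).collect {
      case (_, names) if names.size > 1 => names.head
    }
    if (dups.nonEmpty) {
      throw new IllegalArgumentException(
        s"Found duplicate column(s) in $operation: ${dups.mkString(", ")}")
    }
  }
}
```

A caller such as `partitionBy` would invoke `checkDuplicates("partitionBy", colNames)` before writing, so the failure surfaces immediately with a message naming the offending operation and columns, rather than as an opaque resolution error later.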