[ https://issues.apache.org/jira/browse/SPARK-2890?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14093746#comment-14093746 ]

Jianshi Huang commented on SPARK-2890:
--------------------------------------

My use case:

The result will be parsed into (id, type, start, end, properties) tuples. The 
properties may or may not contain any of (id, type, start, end), so it is 
easier to simply list those columns again at the end rather than worry about 
duplicated names.

Jianshi
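For reference, selecting the same column twice is legal in standard SQL engines; a quick check with Python's sqlite3 as a stand-in (the edges table and its values are hypothetical, chosen to mirror the tuple layout above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# "end" is a reserved word in SQLite, so it must be quoted here.
conn.execute('CREATE TABLE edges (id INTEGER, type TEXT, start INTEGER, "end" INTEGER)')
conn.execute("INSERT INTO edges VALUES (1, 'knows', 10, 20)")

# Listing id and type a second time is accepted; each occurrence
# shows up as its own field in the result tuple.
row = conn.execute(
    'SELECT id, type, start, "end", id, type FROM edges').fetchone()
print(row)  # (1, 'knows', 10, 20, 1, 'knows')
```

The same query shape is what fails in Spark SQL when the scan builds a StructType, because StructType requires unique field names.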

> Spark SQL should allow SELECT with duplicated columns
> -----------------------------------------------------
>
>                 Key: SPARK-2890
>                 URL: https://issues.apache.org/jira/browse/SPARK-2890
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: Jianshi Huang
>
> Spark reports a java.lang.IllegalArgumentException with the message:
> java.lang.IllegalArgumentException: requirement failed: Found fields with the same name.
>         at scala.Predef$.require(Predef.scala:233)
>         at org.apache.spark.sql.catalyst.types.StructType.<init>(dataTypes.scala:317)
>         at org.apache.spark.sql.catalyst.types.StructType$.fromAttributes(dataTypes.scala:310)
>         at org.apache.spark.sql.parquet.ParquetTypesConverter$.convertToString(ParquetTypes.scala:306)
>         at org.apache.spark.sql.parquet.ParquetTableScan.execute(ParquetTableOperations.scala:83)
>         at org.apache.spark.sql.execution.Filter.execute(basicOperators.scala:57)
>         at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:85)
>         at org.apache.spark.sql.SchemaRDD.collect(SchemaRDD.scala:433)
> After some trial and error, it appears the failure is caused by duplicated columns in my SELECT clause.
> I duplicated the columns on purpose so that my code can parse the result uniformly. I think we should allow users to select duplicated columns as return values.
> Jianshi
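Until duplicated output columns are allowed, one common workaround is to alias the repeated occurrences so every output field name is unique, which satisfies uniqueness checks like the one in StructType. A sketch with sqlite3 standing in for the SQL engine (table and alias names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE edges (id INTEGER, type TEXT)")
conn.execute("INSERT INTO edges VALUES (1, 'knows')")

# Alias the second occurrence of each column: the data is still
# duplicated in every row, but each output field name is unique.
cur = conn.execute("SELECT id, type, id AS id2, type AS type2 FROM edges")
names = [d[0] for d in cur.description]
row = cur.fetchone()
print(names)  # ['id', 'type', 'id2', 'type2']
print(row)    # (1, 'knows', 1, 'knows')
```

The downstream parser then has to map id2/type2 back to id/type, which is the extra bookkeeping this issue asks to avoid.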



--
This message was sent by Atlassian JIRA
(v6.2#6252)
