[ https://issues.apache.org/jira/browse/SPARK-18736?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Herman van Hovell updated SPARK-18736:
--------------------------------------
    Description: 
In Spark SQL, {{CreateMap}} does not enforce unique keys, i.e. it is possible to create a map with two identical keys:
{noformat}
CreateMap(Literal(1), Literal(11), Literal(1), Literal(12))
{noformat}
This does not behave like standard maps in common programming languages. A proper behavior should be chosen:
# first 'wins'
# last 'wins'
# runtime error.
{{GetMapValue}} currently implements option #1. Even if this is the desired behavior, {{CreateMap}} should still return a map with unique keys.

  was:
In Spark SQL, CreateMap does not enforce unique keys, i.e. it is possible to create a map with two identical keys:
CreateMap(Literal(1), Literal(11), Literal(1), Literal(12))
This does not behave like standard maps in common programming languages. A proper behavior should be chosen:
1. first 'wins'
2. last 'wins'
3. runtime error.
* GetMapValue currently implements option #1. Even if this is the desired behavior, CreateMap should return a map with unique keys.

> CreateMap allows non-unique keys
> --------------------------------
>
>                 Key: SPARK-18736
>                 URL: https://issues.apache.org/jira/browse/SPARK-18736
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Eyal Farago
>              Labels: map, sql, types
>
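The three candidate behaviors listed in the description can be sketched in plain Python (illustrative only — this is not Spark code, just the semantics each option would impose on the key/value pairs passed to a map constructor):

```python
# Sketch of the three candidate duplicate-key policies for a map built
# from an ordered sequence of (key, value) pairs, e.g. the arguments of
# CreateMap(Literal(1), Literal(11), Literal(1), Literal(12)).

def dedup_first_wins(pairs):
    """Option #1: keep the first value seen for each key
    (the behavior GetMapValue's lookup currently implies)."""
    result = {}
    for k, v in pairs:
        if k not in result:
            result[k] = v
    return result

def dedup_last_wins(pairs):
    """Option #2: keep the last value seen for each key."""
    return dict(pairs)  # Python's dict keeps the last occurrence

def dedup_error(pairs):
    """Option #3: raise a runtime error on any duplicate key."""
    result = {}
    for k, v in pairs:
        if k in result:
            raise ValueError(f"duplicate map key: {k!r}")
        result[k] = v
    return result

pairs = [(1, 11), (1, 12)]
print(dedup_first_wins(pairs))  # {1: 11}
print(dedup_last_wins(pairs))   # {1: 12}
```

Whichever option is chosen, the point of the report stands: the deduplication should happen once, at map construction time, so that the resulting map itself has unique keys rather than relying on lookup order to mask duplicates.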
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org