[ https://issues.apache.org/jira/browse/SPARK-15229?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin updated SPARK-15229:
--------------------------------
    Description: 
Our case sensitivity support is different from what the ANSI SQL standard specifies. Postgres' behavior is that if an identifier is quoted, it is treated as case sensitive; otherwise it is folded to lower case. We will likely need to revisit this in the future and change our behavior. For now, the safest change for Spark 2.0 is to make the case-sensitivity option internal and discourage users from turning it on, effectively making Spark always case insensitive.

  was:
Our case sensitivity support is different from what ANSI SQL standards support. Postgres' behavior is that if an identifier is quoted, then it is treated as case sensitive, otherwise it is folded to lower case. We will likely need to revisit this in the future and change our behavior. For now, the safest change to do for Spark 2.0 is to make the case sensitive option internal and recommend users from not turning it on, effectively making Spark always case insensitive.

> Make case sensitivity setting internal
> --------------------------------------
>
>                 Key: SPARK-15229
>                 URL: https://issues.apache.org/jira/browse/SPARK-15229
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Reynold Xin
>            Assignee: Reynold Xin
>
> Our case sensitivity support is different from what the ANSI SQL standard
> specifies. Postgres' behavior is that if an identifier is quoted, it is
> treated as case sensitive; otherwise it is folded to lower case. We will
> likely need to revisit this in the future and change our behavior. For now,
> the safest change for Spark 2.0 is to make the case-sensitivity option
> internal and discourage users from turning it on, effectively making Spark
> always case insensitive.
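The Postgres folding rule referenced in the description can be sketched as follows. This is a hypothetical illustration only, not Spark or Postgres code; the helper name `fold_identifier` is made up for this sketch.

```python
def fold_identifier(identifier: str) -> str:
    """Postgres-style identifier folding: a double-quoted identifier keeps
    its exact case (quotes stripped); an unquoted one folds to lower case.
    Hypothetical sketch for illustration -- not part of Spark or Postgres.
    (Note: the ANSI SQL standard folds unquoted identifiers to UPPER case;
    Postgres deviates by folding to lower case, as described above.)"""
    if len(identifier) >= 2 and identifier[0] == '"' and identifier[-1] == '"':
        return identifier[1:-1]  # quoted: treated as case sensitive
    return identifier.lower()    # unquoted: folded to lower case

print(fold_identifier('"ColName"'))  # -> ColName
print(fold_identifier('ColName'))    # -> colname
```

Under this rule, `ColName` and `colname` name the same column, while `"ColName"` names a distinct, case-sensitive one; Spark's behavior (with the option off) instead treats all identifiers case-insensitively.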
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org