[ https://issues.apache.org/jira/browse/SPARK-48824?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-48824.
---------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 47614
[https://github.com/apache/spark/pull/47614]

> Add SQL syntax in create/replace table to create an identity column
> -------------------------------------------------------------------
>
>                 Key: SPARK-48824
>                 URL: https://issues.apache.org/jira/browse/SPARK-48824
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Carmen Kwan
>            Assignee: Carmen Kwan
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> Add SQL support for creating identity columns. Identity column syntax should
> be flexible so that users can specify:
> * whether identity values are always generated by the system
> * (optionally) the starting value of the column
> * (optionally) the increment/step of the column
> The SQL syntax should also allow flexible ordering of the increment and
> starting values, as both variants are used in the wild by other systems (e.g.
> [PostgreSQL|https://www.postgresql.org/docs/current/sql-createsequence.html],
> [Oracle|https://docs.oracle.com/en/database/oracle/oracle-database/23/sqlrf/CREATE-SEQUENCE.html#GUID-E9C78A8C-615A-4757-B2A8-5E6EFB130571]).
> That is, we should allow both
> {code:java}
> START WITH <start> INCREMENT BY <step>{code}
> and
> {code:java}
> INCREMENT BY <step> START WITH <start>{code}
> For example, we should be able to define:
> {code:java}
> CREATE TABLE default.example (
>   id LONG GENERATED ALWAYS AS IDENTITY,
>   id2 LONG GENERATED BY DEFAULT START WITH 0 INCREMENT BY -10,
>   id3 LONG GENERATED ALWAYS AS IDENTITY INCREMENT BY 2 START WITH -8,
>   value LONG
> )
> {code}
> This will enable defining identity columns in Spark SQL for data sources
> that support it.
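As a minimal sketch of the semantics described above (not Spark's actual implementation), the values a system-generated identity column produces under START WITH <start> INCREMENT BY <step> form a simple arithmetic sequence. The function name and the start=1, step=1 defaults below are assumptions for illustration:

```python
# Hypothetical sketch of identity-value generation semantics for
# GENERATED ... AS IDENTITY [START WITH <start>] [INCREMENT BY <step>].
# Defaults of start=1, step=1 are an assumption, not confirmed Spark behavior.

def identity_values(n, start=1, step=1):
    """Return the first n system-generated identity values."""
    if step == 0:
        raise ValueError("INCREMENT BY step must be non-zero")
    return [start + i * step for i in range(n)]

# Mirrors the columns in the example table:
#   id  GENERATED ALWAYS AS IDENTITY (assumed defaults)
#   id2 START WITH 0 INCREMENT BY -10
#   id3 INCREMENT BY 2 START WITH -8 (clause order swapped, same meaning)
print(identity_values(3))                      # [1, 2, 3]
print(identity_values(3, start=0, step=-10))   # [0, -10, -20]
print(identity_values(3, start=-8, step=2))    # [-8, -6, -4]
```

Note that the two clause orderings in the SQL syntax are equivalent; both reduce to the same (start, step) pair.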
-- This message was sent by Atlassian Jira (v8.20.10#820010)