[ https://issues.apache.org/jira/browse/SPARK-42919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Serge Rielau updated SPARK-42919:
---------------------------------
Description:
SparkSQL supports *regex_column_names*.
[https://spark.apache.org/docs/latest/sql-ref-syntax-qry-select.html]
However, support depends on a config: spark.sql.parser.quotedRegexColumnNames
The reason is that it overloads proper identifier names.
Here we propose a cleaner, compatible API:
SELECT * LIKE 'pattern' ...
The semantics should follow the common regular-expression patterns used for the LIKE operator, with the caveat that they should obey the identifier case-insensitivity setting.

was:
SparkSQL supports *regex_column_names*.
[https://spark.apache.org/docs/latest/sql-ref-syntax-qry-select.html]
However, support depends on a config: spark.sql.parser.quotedRegexColumnNames
The reason is that it overloads proper identifier names.
Here we propose a cleaner, compatible API:
SELECT * LIKE 'pattern';
The semantics should follow the common regular-expression patterns used for the LIKE operator, with the caveat that they should obey the identifier case-insensitivity setting.

> SELECT * LIKE 'pattern' FROM ....
> ---------------------------------
>
>                 Key: SPARK-42919
>                 URL: https://issues.apache.org/jira/browse/SPARK-42919
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 3.5.0
>            Reporter: Serge Rielau
>            Priority: Minor
>
> SparkSQL supports *regex_column_names*.
> [https://spark.apache.org/docs/latest/sql-ref-syntax-qry-select.html]
> However, support depends on a config: spark.sql.parser.quotedRegexColumnNames
> The reason is that it overloads proper identifier names.
> Here we propose a cleaner, compatible API:
> SELECT * LIKE 'pattern' ...
> The semantics should follow the common regular-expression patterns used for the LIKE operator, with the caveat that they should obey the identifier case-insensitivity setting.
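To illustrate the contrast the proposal draws, here is a minimal sketch of the existing config-gated regex-column behavior next to the proposed syntax. The table name and columns (`events`, `user_id`, `user_name`) are hypothetical, and the `SELECT * LIKE` statement reflects the proposed syntax from this issue, not shipped behavior:

```sql
-- Existing: backtick-quoted regex column names, only when the
-- config is enabled; this overloads ordinary quoted identifiers.
SET spark.sql.parser.quotedRegexColumnNames = true;
SELECT `user_.*` FROM events;  -- selects columns matching the regex

-- Proposed (this issue): explicit, config-free syntax that does not
-- collide with identifier quoting.
SELECT * LIKE 'user_.*' FROM events;
```

Per the description, pattern matching against column names would follow the usual LIKE-style regular-expression semantics while honoring the identifier case-insensitivity setting.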
--
This message was sent by Atlassian Jira
(v8.20.10#820010)