[ https://issues.apache.org/jira/browse/SPARK-31210?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17063742#comment-17063742 ]
jiaan.geng edited comment on SPARK-31210 at 3/21/20, 9:56 AM:
--------------------------------------------------------------

{code:java}
spark-sql> create or replace temporary view test_table_like as SELECT * FROM VALUES ('100 times'), ('1000 times'), ('100%') as test_table_like (subject);
Time taken: 0.143 seconds
spark-sql> select * from test_table_like where subject like '100^%' escape '^';
100%
Time taken: 0.132 seconds, Fetched 1 row(s)
{code}

was (Author: beliefer):
spark-sql> create or replace temporary view test_table_like as SELECT * FROM VALUES ('100 times'), ('1000 times'), ('100%') as test_table_like (subject);
Time taken: 0.143 seconds
spark-sql> select * from test_table_like where subject like '100^%' escape '^';
100%
Time taken: 0.132 seconds, Fetched 1 row(s)

> An issue for Spark SQL LIKE-with-ESCAPE clause
> ----------------------------------------------
>
>                 Key: SPARK-31210
>                 URL: https://issues.apache.org/jira/browse/SPARK-31210
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Mingli Rui
>            Priority: Major
>
> I tried to use LIKE with ESCAPE in Spark 3.0.0-preview2, but I found it does not work in the case below.
>
> The database table
> ==============
> create or replace table test_table_like (subject string)
> insert into test_table_like values ('100 times'), ('1000 times'), ('100%')
>
> Repro
> ====
> val result2 = sparkSession.sql(
>   s"select * from test_table_like where subject like '100^%' escape '^' order by 1")
>
> "100%" is expected to be returned, but it is not. I debugged into the code to check the logical plan.
> In the logical plan, the LIKE is transformed into "StartsWith(subject#130, 100^)", which looks incorrect.
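For context on the plan above: an optimizer may rewrite LIKE into StartsWith when the pattern ends in a single '%', but that rewrite is only safe after the custom ESCAPE character has been processed. The sketch below is a minimal, self-contained Scala illustration of that idea; it is NOT Spark's actual optimizer code, and the names (LikeEscapeSketch, classify, Simplified) are made up for this example. It shows why '100^%' with escape '^' must be treated as the literal string "100%", not as the prefix "100^" followed by a wildcard.

{code:scala}
// Hypothetical sketch of classifying a LIKE pattern with a custom ESCAPE
// character before deciding on a cheaper rewrite (equality / prefix match).
object LikeEscapeSketch {

  sealed trait Simplified
  case class Literal(s: String) extends Simplified          // no unescaped wildcards: plain equality
  case class StartsWith(prefix: String) extends Simplified  // pattern is <prefix>%
  case object General extends Simplified                    // needs full LIKE evaluation

  def classify(pattern: String, escape: Char): Simplified = {
    val sb = new StringBuilder
    var i = 0
    while (i < pattern.length) {
      val c = pattern.charAt(i)
      if (c == escape && i + 1 < pattern.length) {
        // Escaped character: copy the next character literally, skip the escape.
        sb.append(pattern.charAt(i + 1))
        i += 2
      } else if (c == '%' || c == '_') {
        // Unescaped wildcard: only a single trailing '%' is a prefix match.
        return if (c == '%' && i == pattern.length - 1) StartsWith(sb.toString)
               else General
      } else {
        sb.append(c)
        i += 1
      }
    }
    Literal(sb.toString)
  }

  def main(args: Array[String]): Unit = {
    // '100^%' escape '^' is the literal "100%": equality, not StartsWith("100^").
    println(classify("100^%", '^'))   // Literal(100%)
    // '100%' with no escaped wildcard is a genuine prefix match.
    println(classify("100%", '^'))    // StartsWith(100)
  }
}
{code}

Under this reading, the buggy plan in the report corresponds to running the prefix rewrite on the raw pattern before the escape is consumed, which yields the wrong prefix "100^".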