[ https://issues.apache.org/jira/browse/SPARK-41741?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-41741:
------------------------------------

    Assignee:     (was: Apache Spark)

> [SQL] ParquetFilters StringStartsWith push down matching string does not use UTF-8
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-41741
>                 URL: https://issues.apache.org/jira/browse/SPARK-41741
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Jiale He
>            Priority: Major
>         Attachments: image-2022-12-28-18-00-00-861.png, image-2022-12-28-18-00-21-586.png, image-2023-01-09-11-10-31-262.png, image-2023-01-09-18-27-53-479.png, part-00000-30432312-7cdb-43ef-befe-93bcfd174878-c000.snappy.parquet
>
> Hello ~
>
> I found a problem; there are two ways to work around it.
>
> When a Parquet filter is pushed down for a like '***%' query, the matching string is converted to bytes with the JVM's default charset. If the system default encoding is not UTF-8, the pushed-down prefix bytes do not match the UTF-8 bytes stored in the Parquet file, which may cause an error.
>
> As far as I know, there are two ways to bypass this problem (a sketch of applying them follows at the end of this message):
> 1. spark.executor.extraJavaOptions="-Dfile.encoding=UTF-8"
> 2. spark.sql.parquet.filterPushdown.string.startsWith=false
>
> The following reproduces the problem; the sample Parquet file is in the attachments:
> {code:java}
> spark.read.parquet("file:///home/kylin/hjldir/part-00000-30432312-7cdb-43ef-befe-93bcfd174878-c000.snappy.parquet").createTempView("tmp")
> spark.sql("select * from tmp where `1` like '啦啦乐乐%'").show(false)
> {code}
>
> !image-2022-12-28-18-00-00-861.png|width=879,height=430!
>
> !image-2022-12-28-18-00-21-586.png|width=799,height=731!
>
> I think the correct code should be:
> {code:java}
> private val strToBinary =
>   Binary.fromReusedByteArray(v.getBytes(StandardCharsets.UTF_8))
> {code}
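>
> To make the failure concrete, here is a minimal standalone sketch (my illustration, not Spark source; a non-UTF-8 default such as GBK is only an assumed example) showing that getBytes() depends on -Dfile.encoding while getBytes(StandardCharsets.UTF_8) does not:
> {code:java}
> import java.nio.charset.{Charset, StandardCharsets}
>
> val prefix = "啦啦乐乐"
> // No-argument getBytes() uses the JVM default charset (file.encoding).
> println(Charset.defaultCharset())
> // On a JVM started with e.g. -Dfile.encoding=GBK, the two byte sequences
> // differ, so the pushed-down prefix can never match the UTF-8 bytes
> // that Parquet stores.
> println(prefix.getBytes().toSeq == prefix.getBytes(StandardCharsets.UTF_8).toSeq)
> {code}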
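>
> And a sketch of applying the second workaround when building the session (the app name and local master are only for illustration; the config keys are the ones listed above):
> {code:java}
> import org.apache.spark.sql.SparkSession
>
> // Workaround 2: keep startsWith filters in Spark instead of pushing
> // them down to the Parquet reader.
> val spark = SparkSession.builder()
>   .appName("SPARK-41741-workaround")
>   .master("local[*]")
>   .config("spark.sql.parquet.filterPushdown.string.startsWith", "false")
>   .getOrCreate()
>
> // Workaround 1 from the list above is passed at submit time instead:
> //   --conf spark.executor.extraJavaOptions=-Dfile.encoding=UTF-8
> {code}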