[ https://issues.apache.org/jira/browse/SPARK-36574?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
jiaan.geng updated SPARK-36574:
-------------------------------
    Description: 
Spark SQL includes a data source that can read data from other databases using JDBC. Spark also supports the case-insensitive option pushDownPredicate.
According to http://spark.apache.org/docs/latest/sql-data-sources-jdbc.html, if pushDownPredicate is set to false, no filter will be pushed down to the JDBC data source and thus all filters will be handled by Spark.
But I find that filters are still pushed down to the data source.

  was:
Spark SQL also includes a data source that can read data from other databases using JDBC. Spark also supports the case-insensitive option pushDownPredicate.
According to http://spark.apache.org/docs/latest/sql-data-sources-jdbc.html, if pushDownPredicate is set to false, no filter will be pushed down to the JDBC data source and thus all filters will be handled by Spark.
But I find that filters are still pushed down to the data source.


> pushDownPredicate failed to prevent pushing filters down to the data source.
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-36574
>                 URL: https://issues.apache.org/jira/browse/SPARK-36574
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: jiaan.geng
>            Priority: Major
>
> Spark SQL includes a data source that can read data from other databases
> using JDBC.
> Spark also supports the case-insensitive option pushDownPredicate.
> According to http://spark.apache.org/docs/latest/sql-data-sources-jdbc.html,
> if pushDownPredicate is set to false, no filter will be pushed down to the
> JDBC data source and thus all filters will be handled by Spark.
> But I find that filters are still pushed down to the data source.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
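A minimal Scala sketch of the scenario the report describes. The JDBC URL, table name, and credentials below are placeholders, not taken from the issue; with pushDownPredicate=false, the documented behavior is that the physical plan's PushedFilters list should be empty and the filter should be evaluated by Spark itself:

```scala
// Hypothetical reproduction sketch for SPARK-36574.
// Connection details are placeholders; any reachable JDBC source would do.
import org.apache.spark.sql.SparkSession

object PushDownPredicateRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("pushDownPredicate-repro")
      .getOrCreate()

    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://localhost:5432/testdb") // placeholder
      .option("dbtable", "people")                              // placeholder
      .option("user", "test")                                   // placeholder
      .option("password", "test")                               // placeholder
      // Per the JDBC data source docs, setting this to false should keep
      // all filters on the Spark side rather than in the generated SQL.
      .option("pushDownPredicate", "false")
      .load()

    // Inspect the physical plan: if the option is honored, the JDBC scan
    // should show an empty PushedFilters list and Spark should apply the
    // filter itself; the bug report says the filter is pushed down anyway.
    df.filter("age > 30").explain()

    spark.stop()
  }
}
```

Checking the output of explain() for a non-empty PushedFilters entry on the JDBC relation is what distinguishes the reported behavior from the documented one.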