My basic test is here - https://github.com/rohitkapoor1/sparkPushDownAggregate
From: German Schiavon
Date: Thursday, 4 November 2021 at 2:17 AM
To: huaxin gao
Cc: Kapoor, Rohit , user@spark.apache.org
Subject: Re: [Spark SQL]: Aggregate Push Down / Spark 3.2
Thanks for your guidance, Huaxin. I have been able to test the push-down
operators successfully against PostgreSQL using DS v2.
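For readers following along: in Spark 3.2 the aggregate push-down only kicks in on the DS v2 JDBC path, which is reached by registering the built-in JDBC catalog rather than calling spark.read.format("jdbc"). A minimal sketch of that setup — the catalog name, JDBC URL, credentials, and table identifier below are placeholders, not values from this thread:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("agg-pushdown-sketch")
  // Route reads through the DS v2 JDBCTableCatalog shipped with Spark.
  .config("spark.sql.catalog.pg",
          "org.apache.spark.sql.execution.datasources.v2.jdbc.JDBCTableCatalog")
  .config("spark.sql.catalog.pg.url", "jdbc:postgresql://localhost:5432/testdb")
  .config("spark.sql.catalog.pg.driver", "org.postgresql.Driver")
  // Aggregate push-down is opt-in.
  .config("spark.sql.catalog.pg.pushDownAggregate", "true")
  .getOrCreate()

// MIN/MAX/COUNT with GROUP BY on plain columns are the pushable shapes
// covered by SPARK-34952.
val df = spark.sql(
  "SELECT dept, MAX(salary), COUNT(*) FROM pg.public.emp GROUP BY dept")

// If the push-down happened, the scan node in the plan reports
// PushedAggregates instead of Spark aggregating the raw rows itself.
df.explain()
```

Checking `df.explain()` for a `PushedAggregates` entry on the scan is the quickest way to confirm the database, not Spark, is doing the aggregation.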
From: huaxin gao
Date: Tuesday, 2 November 2021 at 12:35 AM
To: Kapoor, Rohit
Subject: Re: [Spark SQL]: Aggregate Push Down / Spark 3.2
Hi Huaxin,
Thanks a lot for your response. Do I need to write a custom data source reader
(in my case, for PostgreSQL) using the Spark DS v2 APIs, instead of the
standard spark.read.format("jdbc")?
Thanks,
Rohit
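To illustrate the question being asked: the standard V1 JDBC read below does not trigger aggregate push-down in Spark 3.2 — the aggregation stays in Spark's own plan. This is a sketch with placeholder connection details; no custom reader is needed for push-down, only Spark's built-in DS v2 JDBC catalog.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.max

val spark = SparkSession.builder().appName("v1-jdbc-sketch").getOrCreate()

// The classic V1 JDBC reader; URL, table, and driver are placeholders.
val emp = spark.read.format("jdbc")
  .option("url", "jdbc:postgresql://localhost:5432/testdb")
  .option("dbtable", "emp")
  .option("driver", "org.postgresql.Driver")
  .load()

// With this path, the explain output shows a HashAggregate running in
// Spark: rows are fetched from PostgreSQL first, then aggregated by the
// executors. The DS v2 JDBCTableCatalog path is what pushes the
// aggregate into the database instead.
emp.groupBy("dept").agg(max("salary")).explain()
```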
From: huaxin gao
Date: Monday, 1 November 2021 at 11:32 PM
To: Kapoor, Rohit
Hi,
I am testing the aggregate push down for JDBC after going through the JIRA -
https://issues.apache.org/jira/browse/SPARK-34952
I have the latest Spark 3.2 setup in local mode (laptop).
I have PostgreSQL v14 locally on my laptop. I am trying a basic aggregate query
on the "emp" table that has