BTW, even for JSON a pushdown can make sense, to avoid data unnecessarily
ending up in Spark (because it would cause unnecessary overhead).
In the DataSource V2 API you need to implement SupportsPushDownFilters.
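A minimal sketch of that interface, assuming the Spark 2.4-era DataSource V2 API (the class name `ExampleReader` and the choice of which filters to push are illustrative, not from any real source):

```scala
import java.util.{Collections, List => JList}

import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.sources.{EqualTo, Filter, GreaterThan}
import org.apache.spark.sql.sources.v2.reader.{DataSourceReader, InputPartition, SupportsPushDownFilters}
import org.apache.spark.sql.types.StructType

// Hypothetical reader that only knows how to push equality and
// greater-than filters down to the underlying source.
class ExampleReader(schema: StructType) extends DataSourceReader with SupportsPushDownFilters {

  private var pushed: Array[Filter] = Array.empty

  // Spark hands us every candidate filter; we keep the ones the source
  // can evaluate itself and return the rest, which Spark will then
  // evaluate after scanning.
  override def pushFilters(filters: Array[Filter]): Array[Filter] = {
    val (supported, unsupported) = filters.partition {
      case _: EqualTo | _: GreaterThan => true
      case _                           => false
    }
    pushed = supported
    unsupported
  }

  // This is what shows up as PushedFilters in explain() output.
  override def pushedFilters(): Array[Filter] = pushed

  override def readSchema(): StructType = schema

  // A real implementation would plan partitions that actually apply
  // the pushed filters while reading.
  override def planInputPartitions(): JList[InputPartition[InternalRow]] =
    Collections.emptyList()
}
```

Note the contract: filters returned from pushFilters are the ones Spark re-evaluates, so if a source is not certain it fully enforces a filter, the safe default is to return it as unsupported.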
> On 08.12.2018, at 10:50, Noritaka Sekiyama wrote:
>
> Hi,
>
> I'm a support engineer, interested in DataSourceV2.
It was already available before DataSourceV2, but I think it might have been an
internal/semi-official API (e.g. JSON has been an internal datasource for some
time now). The filters were provided to the datasource, but you would never
know whether the datasource had indeed leveraged them or if for other
Hi,
I'm a support engineer, interested in DataSourceV2.
Recently I had some pain troubleshooting whether pushdown was actually
applied or not.
I noticed that DataFrame's explain() method shows pushdown even for JSON.
It totally depends on the DataSource side, I believe. However, I would like