I understand that the following are equivalent
df.filter('account === "acct1")
sql("select * from tempTableName where account = 'acct1'")
But is Spark SQL "smart" to also push filter predicates down for the
initial load?
e.g.
sqlContext.read.jdbc(…).filter('account === "acct1")
When you have the following query, 'account === "acct1" will be pushed down to generate a
new query with "where account = 'acct1'".
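One way to check this yourself is to inspect the physical plan with explain(): when the JDBC source accepts a predicate, it appears under PushedFilters in the scan node. A minimal sketch (the JDBC URL, table name, and column are hypothetical placeholders):

```scala
import java.util.Properties

// Needed for the 'account symbol-to-Column conversion
import sqlContext.implicits._

// Hypothetical connection details for illustration only
val df = sqlContext.read.jdbc(
  "jdbc:postgresql://dbhost:5432/mydb", // placeholder URL
  "accounts",                           // placeholder table name
  new Properties())

val filtered = df.filter('account === "acct1")

// If the filter was pushed down, the JDBC scan in the physical plan
// lists it, e.g. PushedFilters: [EqualTo(account,acct1)]
filtered.explain(true)
```

If the filter shows up in PushedFilters, the WHERE clause is evaluated by the database rather than after loading the full table into Spark.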
Thanks.
Zhan Zhang
On Nov 18, 2015, at 11:36 AM, Eran Medan wrote:

> I understand that the following are equivalent