I used queryPlan.queryExecution.analyzed to get the logical plan, and it
works. What you explained to me is very useful.
Thank you very much.
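(For context, a minimal sketch of that call, assuming the same hive/console session as in the original question below; this follows the 1.0.x SchemaRDD API, so the exact output shape may differ:)

scala> val queryPlan = sql("select value from (select key, value from src) a where a.key = 86")
scala> queryPlan.queryExecution.analyzed  // logical plan after analysis, with references resolved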
Hi,
queryPlan.baseLogicalPlan is not the plan used for execution. Actually, the
baseLogicalPlan of a SchemaRDD (queryPlan in your case) is just the parsed
plan (the parsed plan will be analyzed and then optimized; finally, a
physical plan will be created). The plan shows up after you execute "val
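(To make that pipeline concrete, here is a sketch of the phase accessors, assuming the 1.0.x QueryExecution API; treat the names as illustrative for that version:)

scala> val qe = queryPlan.queryExecution
scala> qe.logical        // parsed plan, i.e. what baseLogicalPlan returns
scala> qe.analyzed       // parsed plan after analysis, resolved against the catalog
scala> qe.optimizedPlan  // analyzed plan after optimizer rules such as filter push-down
scala> qe.executedPlan   // physical plan that is actually run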
Hi, I encountered a weird problem in Spark SQL.
I use sbt/sbt hive/console to enter the shell.
I am testing filter push-down with Catalyst.
scala> val queryPlan = sql("select value from (select key, value from src) a
where a.key = 86")
scala> queryPlan.baseLogicalPlan
res0: org.apache.spark.sq
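(A follow-up sketch for checking push-down, assuming the session above: the parsed plan returned by baseLogicalPlan still shows the Filter above the inner Project, so the place to look is the optimized plan, where the predicate should have moved down next to the table scan. Printing queryExecution itself renders every phase at once via its toString in 1.0.x, if memory serves; details may vary:)

scala> println(queryPlan.queryExecution.optimizedPlan)  // the Filter on key = 86 should now sit directly on the src relation
scala> println(queryPlan.queryExecution)                // prints the logical, optimized, and physical plans together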