[ https://issues.apache.org/jira/browse/SPARK-29038?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16927258#comment-16927258 ]
Dilip Biswal edited comment on SPARK-29038 at 9/11/19 5:13 AM:
---------------------------------------------------------------

[~cltlfcjin] Actually I had a similar question as [~mgaido]. We have been writing the SQL reference for 3.0 and have recently documented {code:java}CACHE TABLE{code} in [https://github.com/apache/spark/pull/25532]. So in Spark, it is possible to cache the result of a complex query involving joins, aggregates, etc., right?


was (Author: dkbiswal):
[~cltlfcjin] Actually I had a similar question as [~mgaido]. We have been writing the SQL reference for 3.0 and have recently documented {code:java}CACHE TABLE{code} in [https://github.com/apache/spark/pull/25532]. So in Spark, it is possible to cache the result of a complex query involving joins, aggregates, etc.


> SPIP: Support Spark Materialized View
> -------------------------------------
>
>                 Key: SPARK-29038
>                 URL: https://issues.apache.org/jira/browse/SPARK-29038
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Lantao Jin
>            Priority: Major
>
> A materialized view is an important mechanism in a DBMS for caching data to accelerate queries. By creating a materialized view through SQL, the data to be cached can be chosen flexibly and configured for specific usage scenarios. The Materialization Manager automatically updates the cached data according to changes in the underlying source tables, simplifying the user's work. When a user submits a query, the Spark optimizer rewrites the execution plan based on the available materialized views to determine the optimal execution plan.
> Details are in the [design doc|https://docs.google.com/document/d/1q5pjSWoTNVc9zsAfbNzJ-guHyVwPsEroIEP8Cca179A/edit?usp=sharing]



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
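
As a concrete illustration of the caching discussed in the comment above, here is a minimal Spark SQL sketch of how the existing CACHE TABLE statement can cache the result of a complex query, alongside a commented-out, purely hypothetical version of the materialized-view DDL this SPIP proposes. The table names ({{sales}}, {{stores}}) are illustrative assumptions, and the actual CREATE MATERIALIZED VIEW syntax is defined in the linked design doc, not here.

{code:sql}
-- Existing Spark SQL (documented in https://github.com/apache/spark/pull/25532):
-- cache the result of a join + aggregation under a name. The cache is populated
-- when the statement runs (unless LAZY is specified) and must be refreshed
-- manually by the user. "sales" and "stores" are hypothetical tables.
CACHE TABLE sales_by_region AS
SELECT st.region, SUM(s.amount) AS total_amount
FROM sales s
JOIN stores st ON s.store_id = st.id
GROUP BY st.region;

-- Hypothetical sketch only: under this SPIP, a materialized view would be
-- created once, kept up to date by the materialization manager as the source
-- tables change, and used by the optimizer to rewrite matching queries.
-- CREATE MATERIALIZED VIEW sales_by_region_mv AS
-- SELECT st.region, SUM(s.amount) AS total_amount
-- FROM sales s
-- JOIN stores st ON s.store_id = st.id
-- GROUP BY st.region;
{code}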