Martin Bode created SPARK-44027:
-----------------------------------

             Summary: create *permanent* Spark View from DataFrame via PySpark & Scala DataFrame API
                 Key: SPARK-44027
                 URL: https://issues.apache.org/jira/browse/SPARK-44027
             Project: Spark
          Issue Type: New Feature
          Components: PySpark
    Affects Versions: 3.4.0
            Reporter: Martin Bode


Currently, only *_temporary_ Spark Views* can be created from a DataFrame:
 * [DataFrame.createGlobalTempView|https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.createGlobalTempView.html#pyspark.sql.DataFrame.createGlobalTempView]
 * [DataFrame.createOrReplaceGlobalTempView|https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.createOrReplaceGlobalTempView.html#pyspark.sql.DataFrame.createOrReplaceGlobalTempView]
 * [DataFrame.createOrReplaceTempView|https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.createOrReplaceTempView.html#pyspark.sql.DataFrame.createOrReplaceTempView]
 * [DataFrame.createTempView|https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.createTempView.html#pyspark.sql.DataFrame.createTempView]

When a user needs a _*permanent*_ *Spark View*, they have to fall back to Spark SQL ({{CREATE VIEW ... AS SELECT ...}}).
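
For illustration, the fallback looks roughly like this (a minimal sketch; the table, column, and view names are made up):

{code:python}
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# The view's logic is easy to express with the DataFrame API...
df = spark.table("sales").where("amount > 100").select("id", "amount")

# ...but persisting it as a *permanent* view means rewriting the same
# logic in SQL, since a permanent view cannot reference a temporary view:
spark.sql("""
    CREATE OR REPLACE VIEW sales_filtered AS
    SELECT id, amount FROM sales WHERE amount > 100
""")
{code}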

Sometimes it is easier and more readable to specify the desired logic of the view through the _Scala/PySpark DataFrame API_.
Therefore, I'd like to suggest implementing a new PySpark method that allows creating a _*permanent*_ *Spark View* from a DataFrame (e.g. {{DataFrame.createOrReplaceView}}).
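
With such a method, the example above could be expressed directly from the DataFrame (hypothetical sketch; the method name is only a suggestion and does not exist yet):

{code:python}
df = spark.table("sales").where("amount > 100").select("id", "amount")

# Hypothetical: persist the DataFrame's logical plan as a permanent view,
# without rewriting the logic in SQL.
df.createOrReplaceView("sales_filtered")
{code}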

see also:
[https://community.databricks.com/s/question/0D53f00001PANVgCAP/is-there-a-way-to-create-a-nontemporary-spark-view-with-pyspark]


