Ruifeng Zheng created SPARK-44406:
-------------------------------------

             Summary: DataFrame depending on a temp view fails after the view 
is dropped
                 Key: SPARK-44406
                 URL: https://issues.apache.org/jira/browse/SPARK-44406
             Project: Spark
          Issue Type: Bug
          Components: Connect
    Affects Versions: 3.5.0
            Reporter: Ruifeng Zheng


In vanilla Spark:


{code:java}
In [1]: df = spark.createDataFrame([(1, 4), (2, 4), (3, 6)], ["A", "B"])

In [2]: df.createOrReplaceTempView("t")

In [3]: df2 = spark.sql("select * from t")

In [4]: df2.show()
+---+---+                                                                       
|  A|  B|
+---+---+
|  1|  4|
|  2|  4|
|  3|  6|
+---+---+


In [5]: spark.catalog.dropTempView("t")
Out[5]: True

In [6]: df2.show()
+---+---+
|  A|  B|
+---+---+
|  1|  4|
|  2|  4|
|  3|  6|
+---+---+
{code}


In Spark Connect:


{code:java}
In [1]: df = spark.createDataFrame([(1, 4), (2, 4), (3, 6)], ["A", "B"])

In [2]: df.createOrReplaceTempView("t")

In [3]: df2 = spark.sql("select * from t")

In [4]: df2.show()
+---+---+
|  A|  B|
+---+---+
|  1|  4|
|  2|  4|
|  3|  6|
+---+---+


In [5]: spark.catalog.dropTempView("t")
Out[5]: True

In [6]: df2.show()
23/07/13 11:57:18 ERROR SparkConnectService: Error during: execute. UserId: 
ruifeng.zheng. SessionId: 1fc234fd-07da-4ad0-9ec5-2d818cef6033.
org.apache.spark.sql.AnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or 
view `t` cannot be found. Verify the spelling and correctness of the schema and 
catalog.
If you did not qualify the name with a schema, verify the current_schema() 
output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS.; 
line 1 pos 14;
'Project [*]
+- 'UnresolvedRelation [t], [], false


{code}
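A plausible explanation (not confirmed here) is that vanilla PySpark analyzes the plan eagerly when {{spark.sql()}} is called, so {{df2}} captures the resolved view, while Spark Connect keeps only an unresolved plan on the client and re-analyzes it on every execution. The toy sketch below uses no Spark APIs at all; the class names and the dict standing in for the temp-view registry are invented purely to illustrate that difference:

```python
# Toy model (NOT real Spark APIs) of the suspected behavior difference.
catalog = {}  # stands in for the temp-view registry

class EagerDF:
    """Resolves the view once, at creation time (vanilla-Spark-like)."""
    def __init__(self, view):
        self.rows = catalog[view]     # analysis happens here, once

    def show(self):
        return self.rows              # later drops cannot affect this

class LazyDF:
    """Resolves the view on every execution (Connect-like)."""
    def __init__(self, view):
        self.view = view              # only the unresolved name is kept

    def show(self):
        return catalog[self.view]     # analysis repeats here, each time

catalog["t"] = [(1, 4), (2, 4), (3, 6)]
eager, lazy = EagerDF("t"), LazyDF("t")
del catalog["t"]                      # dropTempView("t")

print(eager.show())                   # still returns the rows
try:
    lazy.show()
except KeyError:
    print("view not found")           # mirrors TABLE_OR_VIEW_NOT_FOUND
```

If this is the cause, a fix would likely need the Connect client or server to resolve (or pin) the view reference when the DataFrame is created rather than at each execution.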




--
This message was sent by Atlassian Jira
(v8.20.10#820010)
