Igor Ngouagna created SPARK-27203:
-------------------------------------

             Summary: Fails to read a view using CTE (WITH clause) and created via beeline
                 Key: SPARK-27203
                 URL: https://issues.apache.org/jira/browse/SPARK-27203
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.1.1
         Environment: *Spark: 2.1.1*

*Beeline: 1.2.1000*
            Reporter: Igor Ngouagna


Spark fails when trying to read a view whose definition involves a CTE and which was created via Beeline.

For example, considering the following view, created via Beeline:
{code:sql}
create view db.test as 
with q1 as (select 1 as n)
select n from q1
{code}
When you do
{code:python}
spark.sql("select * from db.test").show()
{code}
the output is:
{code}
'Table or view not found: q1; line 2 pos 14'
Traceback (most recent call last):
  File "/DATA/fs11/hadoop/yarn/local/usercache/ingouagn/appcache/application_1552973526615_3878/container_e380_1552973526615_3878_01_000001/pyspark.zip/pyspark/sql/session.py", line 545, in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
  File "/DATA/fs11/hadoop/yarn/local/usercache/ingouagn/appcache/application_1552973526615_3878/container_e380_1552973526615_3878_01_000001/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/DATA/fs11/hadoop/yarn/local/usercache/ingouagn/appcache/application_1552973526615_3878/container_e380_1552973526615_3878_01_000001/pyspark.zip/pyspark/sql/utils.py", line 69, in deco
    raise AnalysisException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.AnalysisException: 'Table or view not found: q1; line 2 pos 14'
{code}
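
A possible (untested) workaround sketch: rewriting the CTE as an inline derived table when creating the view, since the two forms are equivalent for this query. The view name {{db.test}} and column {{n}} are taken from the example above; the alias {{q1}} simply mirrors the original CTE name.
{code:sql}
-- Untested sketch: same result as the CTE version, but the
-- WITH clause is replaced by a derived table, which avoids
-- storing a CTE reference in the view definition.
create view db.test as
select n from (select 1 as n) q1
{code}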