Thanks for your reply.

Creating a table is an option, but such an approach slows down reads and writes for 
the real-time analytics streaming use case that I'm currently working on.
If the global temporary view had been accessible across sessions/Spark contexts, 
it would have simplified my use case a lot.

But yeah, thanks for explaining the behavior of the global temporary view, now it's 
clear ☺

-Hemanth

From: Felix Cheung <felixcheun...@hotmail.com>
Date: Saturday, 22 April 2017 at 11.05
To: Hemanth Gudela <hemanth.gud...@qvantel.com>, "user@spark.apache.org" 
<user@spark.apache.org>
Subject: Re: Spark SQL - Global Temporary View is not behaving as expected

Cross-session in this context means multiple Spark sessions created from the same Spark 
context. Since you are running two shells, you have two different Spark 
contexts.
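
For example, a minimal sketch, assuming everything runs inside a single spark-shell and 
spark.newSession() is used to create the second session from the same context:

scala> // create the view in the default session
scala> spark.sql("select 1 as col1").createGlobalTempView("gView1")

scala> // a second session sharing the same SparkContext can resolve it
scala> val spark2 = spark.newSession()
scala> spark2.sql("select * from global_temp.gView1").show()
+----+
|col1|
+----+
|   1|
+----+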

Do you have to use a temp view? Could you create a table?
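
If the data does need to be visible from a separate application, a rough sketch of the 
table route (assuming a shared metastore; the table name tbl1 is just an example):

scala> spark.sql("select 1 as col1").write.saveAsTable("tbl1")

and then, from any other spark-shell pointing at the same metastore:

scala> spark.table("tbl1").show()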

_____________________________
From: Hemanth Gudela <hemanth.gud...@qvantel.com>
Sent: Saturday, April 22, 2017 12:57 AM
Subject: Spark SQL - Global Temporary View is not behaving as expected
To: <user@spark.apache.org>



Hi,

According to the documentation 
(http://spark.apache.org/docs/latest/sql-programming-guide.html#global-temporary-view), 
global temporary views are cross-session accessible.

But when I try to query a global temporary view from another spark-shell instance, 
like this:
Instance 1 of spark-shell
----------------------------------
scala> spark.sql("select 1 as col1").createGlobalTempView("gView1")

Instance 2 of spark-shell (while Instance 1 of spark-shell is still alive)
---------------------------------
scala> spark.sql("select * from global_temp.gView1").show()
org.apache.spark.sql.AnalysisException: Table or view not found: 
`global_temp`.`gView1`
'Project [*]
+- 'UnresolvedRelation `global_temp`.`gView1`

I am expecting that a global temporary view created in shell 1 should be 
accessible in shell 2, but it isn't!
Please correct me if I am missing something here.

Thanks (in advance),
Hemanth
