Hi,
I was using the show function. Now I am able to read tables through SQL. I 
solved this issue with three steps:

1.       Copied core-site.xml, hdfs-site.xml, and hive-site.xml into the 
ZEPPELIN_HOME/conf folder.

2.       Used the same sqlContext object in each function.

3.       Added import sqlContext.implicits._
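The three steps above, sketched as one Zeppelin paragraph. This is a sketch, not the exact code from the thread; it assumes Spark 1.x and Zeppelin's %spark interpreter, where sc and the ZeppelinContext z are predefined, and that the *-site.xml files are already in ZEPPELIN_HOME/conf:

```scala
// Sketch: assumes Zeppelin's %spark interpreter provides sc and z,
// and that core-site.xml/hdfs-site.xml/hive-site.xml sit in ZEPPELIN_HOME/conf (step 1).
import org.apache.spark.sql.hive.HiveContext

val sqlContext = new HiveContext(sc)   // step 2: reuse this single context everywhere
import sqlContext.implicits._          // step 3: enables .toDF and related conversions

val tables = sqlContext.sql("show tables")
z.show(tables)                         // render the result as a table in Zeppelin
```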

From: mina lee [mailto:[email protected]]
Sent: Friday, July 22, 2016 3:06 PM
To: [email protected]
Subject: Re: Spark-sql showing no table

Hi Vikash,

If you want to render a DataFrame as a table with sqlContext, you will need to run
z.show(tables)
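A minimal sketch of that, assuming a Zeppelin %spark paragraph where the ZeppelinContext z is predefined:

```scala
// z.show renders a DataFrame as an interactive table in the Zeppelin notebook,
// unlike DataFrame.show(), which only prints plain text to the paragraph output.
val tables = sqlContext.sql("show tables")
z.show(tables)
```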

On Thu, Jul 14, 2016 at 1:22 PM Vikash Kumar 
<[email protected]> wrote:
I am creating a sqlContext from the existing sc.
                var tables = sqlContext.sql("show tables")


Thanks and regards,
Vikash Kumar

From: Mohit Jaggi [mailto:[email protected]]
Sent: Wednesday, July 13, 2016 10:24 PM
To: [email protected]<mailto:[email protected]>
Subject: Re: Spark-sql showing no table

Make sure you use a HiveContext.
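For example (a sketch for Spark 1.x): HiveContext reads hive-site.xml and can see the tables in the Hive metastore, whereas a plain SQLContext only sees temp tables registered in the same session.

```scala
import org.apache.spark.sql.hive.HiveContext

// Build the context from the existing SparkContext sc;
// with hive-site.xml on the classpath, "show tables" lists Hive tables too.
val sqlContext = new HiveContext(sc)
sqlContext.sql("show tables").show()
```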

On Jul 13, 2016, at 12:42 AM, Vikash Kumar 
<[email protected]> wrote:

Hi all,
                I am using Spark with Scala to read Phoenix tables and register 
them as temporary tables, which I am able to do.
                After that, when I run the query:
                                %sql show tables
                it gives the expected output, but when I run the same query with 
the Scala sqlContext, it shows no tables and raises no error.
                What should I do now? I have also copied core-site.xml, 
hdfs-site.xml, hbase-site.xml, and hive-site.xml into the Zeppelin conf folder.
