Hi,
I was using the show function, and now I am able to read tables through SQL. I
solved this issue with three steps:

1. Copied core-site.xml, hdfs-site.xml and hive-site.xml into the
ZEPPELIN_HOME/conf folder.

2. Used the same sqlContext object for each function.

3. Added import sqlContext.implicits._ (see the sketch below).
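
A minimal sketch of the resulting usage (assuming the sqlContext provided by the
Zeppelin Spark interpreter; the helper name is illustrative):

    // Reuse the interpreter-provided sqlContext everywhere instead of creating new ones.
    import sqlContext.implicits._               // enables .toDF() and $"col" syntax

    // Helpers share the same sqlContext, so temp tables registered in one
    // paragraph stay visible in the next.
    def listTables() = sqlContext.sql("show tables")
    val tables = listTables()
    tables.show()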

From: mina lee [mailto:mina...@apache.org]
Sent: Friday, July 22, 2016 3:06 PM
To: users@zeppelin.apache.org
Subject: Re: Spark-sql showing no table

Hi Vikash,

If you want to render a DataFrame as a table with sqlContext, you will need to run
z.show(tables)
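
For example (a short sketch; z is the ZeppelinContext object injected by the Spark
interpreter):

    val tables = sqlContext.sql("show tables")  // DataFrame listing the tables
    z.show(tables)                              // renders it as a Zeppelin table, like %sql output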

On Thu, Jul 14, 2016 at 1:22 PM Vikash Kumar
<vikash.ku...@resilinc.com> wrote:
I am creating a sqlContext from the existing sc.
                var tables = sqlContext.sql("show tables")


Thanks and regards,
Vikash Kumar

From: Mohit Jaggi [mailto:mohitja...@gmail.com]
Sent: Wednesday, July 13, 2016 10:24 PM
To: users@zeppelin.apache.org
Subject: Re: Spark-sql showing no table

make sure you use a hive context
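
A sketch of what that looks like on Spark 1.x (assuming the existing sc; a
HiveContext reads the Hive metastore configured by hive-site.xml, whereas a plain
SQLContext starts with an empty catalog):

    import org.apache.spark.sql.hive.HiveContext

    // Build the context on the existing SparkContext; a fresh new SQLContext(sc)
    // would only see temp tables registered on itself.
    val sqlContext = new HiveContext(sc)
    sqlContext.sql("show tables").show()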

On Jul 13, 2016, at 12:42 AM, Vikash Kumar
<vikash.ku...@resilinc.com> wrote:

Hi all,
                I am using Spark with Scala to read Phoenix tables and register
them as temporary tables, which I am able to do.
                After that, when I run the query:
                                %sql show tables
                it gives all the expected output, but when I run the same query
with the Scala sqlContext, it shows no tables and gives no error either.
                What should I do now? I have also copied core-site.xml,
hdfs-site.xml, hbase-site.xml and hive-site.xml into the Zeppelin conf folder.
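
For context, a sketch of this kind of Phoenix read and temp-table registration
(assuming the phoenix-spark connector; the table name and zkUrl are placeholders):

    // Read a Phoenix table into a DataFrame (Spark 1.x API).
    val df = sqlContext.read
      .format("org.apache.phoenix.spark")
      .options(Map("table" -> "MY_TABLE", "zkUrl" -> "zk-host:2181"))
      .load()

    // The temp table is only visible to the sqlContext instance it was registered on.
    df.registerTempTable("my_table")

    sqlContext.sql("show tables").show()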
