Hi

I am trying to compute statistics on a lookup table that resides in Hive, invoking the Spark API as shown below, and it throws NoSuchTableException. The table definitely exists: a subsequent statement sqlContext.sql("select * from cpatext.lkup") in the same session picks up the table correctly. I am wondering whether this is related to https://issues.apache.org/jira/browse/SPARK-8105. I am using Spark 1.4.1. Please let me know.
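
For clarity, here are the two statements side by side, both against the same sqlContext (which is a HiveContext here, per the stack trace). The first resolves the table, the second fails; full output follows:

scala> sqlContext.sql("select * from cpatext.lkup")                            // finds the table
scala> sqlContext.sql("ANALYZE TABLE cpatext.lkup COMPUTE STATISTICS NOSCAN")  // throws NoSuchTableException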

scala> sqlContext.sql("ANALYZE TABLE cpatext.lkup COMPUTE STATISTICS NOSCAN")
2015-08-18 18:12:19,299 INFO  [main] parse.ParseDriver (ParseDriver.java:parse(185)) - Parsing command: ANALYZE TABLE cpatext.lkup COMPUTE STATISTICS NOSCAN
2015-08-18 18:12:19,299 INFO  [main] parse.ParseDriver (ParseDriver.java:parse(206)) - Parse Completed
org.apache.spark.sql.catalyst.analysis.NoSuchTableException
        at org.apache.spark.sql.hive.client.ClientInterface$$anonfun$getTable$1.apply(ClientInterface.scala:112)
        at org.apache.spark.sql.hive.client.ClientInterface$$anonfun$getTable$1.apply(ClientInterface.scala:112)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:112)
        at org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:60)
        at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:227)
        at org.apache.spark.sql.hive.HiveContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(HiveContext.scala:371)
        at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:165)
        at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:165)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:165)
        at org.apache.spark.sql.hive.HiveContext$$anon$2.lookupRelation(HiveContext.scala:371)
        at org.apache.spark.sql.hive.HiveContext.analyze(HiveContext.scala:293)
        at org.apache.spark.sql.hive.execution.AnalyzeTable.run(commands.scala:43)
        at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
        at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
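
If this does come down to SPARK-8105 and the qualified table name, one workaround I could try (untested on my side, just a sketch) is to switch the current database first and analyze the unqualified name:

scala> sqlContext.sql("USE cpatext")
scala> sqlContext.sql("ANALYZE TABLE lkup COMPUTE STATISTICS NOSCAN")

Is that a reasonable way to compute stats on a table outside the default database in 1.4.1, or is there a better approach?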

Thanks
Vijay
