What is the Hive version of your metastore server?

It looks like you are using Hive 1.2's metastore client to talk to your
existing Hive 0.13.1 metastore server.
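
If that mismatch is the cause, one possible workaround is to pin the
metastore client to the server's version via Spark's documented metastore
settings. A minimal sketch (assuming Spark 1.5, and that matching Hive
0.13.1 jars can be resolved, e.g. from Maven):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Sketch: make Spark 1.5's Hive client speak the 0.13.1 metastore API.
// Spark 1.5 defaults spark.sql.hive.metastore.version to 1.2.1, whose
// client calls alter_table_with_cascade -- a Thrift method a 0.13.1
// metastore server does not implement.
val conf = new SparkConf()
  .setAppName("metastore-version-pin")
  .set("spark.sql.hive.metastore.version", "0.13.1")
  // "maven" downloads the matching Hive jars at startup; a classpath
  // pointing at local 0.13.1 jars also works (assumption: one of the
  // two is available in your environment).
  .set("spark.sql.hive.metastore.jars", "maven")

val sc = new SparkContext(conf)
val hiveContext = new HiveContext(sc)

// The failing statement from the report, issued against the pinned client.
hiveContext.sql("ALTER TABLE a RENAME TO b")
```

The same two properties can instead be set in spark-defaults.conf or via
--conf on spark-submit; nothing here is specific to building the
SparkContext in code.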

On Thu, Sep 10, 2015 at 10:48 AM, Michael Armbrust <mich...@databricks.com>
wrote:

> Can you open a JIRA?
>
> On Wed, Sep 9, 2015 at 11:11 PM, StanZhai <m...@zhaishidan.cn> wrote:
>
>> After upgrading Spark from 1.4.1 to 1.5.0, I encountered the following
>> exception when using an ALTER TABLE statement in HiveContext:
>>
>> The SQL is: ALTER TABLE a RENAME TO b
>>
>> The exception is:
>>
>> FAILED: Execution Error, return code 1 from
>> org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. Invalid
>> method name: 'alter_table_with_cascade'
>> msg: org.apache.spark.sql.execution.QueryExecutionException: FAILED:
>> Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
>> Unable to alter table. Invalid method name: 'alter_table_with_cascade'
>>         at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:433)
>>         at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:418)
>>         at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:256)
>>         at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:211)
>>         at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:248)
>>         at org.apache.spark.sql.hive.client.ClientWrapper.runHive(ClientWrapper.scala:418)
>>         at org.apache.spark.sql.hive.client.ClientWrapper.runSqlHive(ClientWrapper.scala:408)
>>         at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:558)
>>         at org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:33)
>>         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
>>         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
>>         at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
>>         at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
>>         at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
>>         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>>         at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
>>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:927)
>>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:927)
>>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
>>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
>>         at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>>         at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:719)
>>         at test.service.QueryService.query(QueryService.scala:28)
>>         at test.api.DatabaseApi$$anonfun$query$1.apply(DatabaseApi.scala:39)
>>         at test.api.DatabaseApi$$anonfun$query$1.apply(DatabaseApi.scala:30)
>>         at test.web.JettyUtils$$anon$1.getOrPost(JettyUtils.scala:81)
>>         at test.web.JettyUtils$$anon$1.doPost(JettyUtils.scala:119)
>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:755)
>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
>>         at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
>>         at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501)
>>         at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
>>         at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428)
>>         at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
>>         at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
>>         at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
>>         at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
>>         at org.eclipse.jetty.server.Server.handle(Server.java:370)
>>         at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
>>         at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:982)
>>         at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1043)
>>         at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)
>>         at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)
>>         at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
>>         at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667)
>>         at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
>>         at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
>>         at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
>>         at java.lang.Thread.run(Thread.java:745)
>>
>> The SQL runs fine on both Spark 1.4.1 and Hive, so I think this is a bug
>> in Spark 1.5. Any suggestions?
>>
>> Best, Stan
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-developers-list.1001551.n3.nabble.com/SparkSQL-Could-not-alter-table-in-Spark-1-5-use-HiveContext-tp14029.html
>> Sent from the Apache Spark Developers List mailing list archive at
>> Nabble.com.
>>
>>
>
