[ https://issues.apache.org/jira/browse/SPARK-4773?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

David Ross resolved SPARK-4773.
-------------------------------
    Resolution: Fixed

Looks like this was broken by: 
https://github.com/apache/spark/commit/4b55482abf899c27da3d55401ad26b4e9247b327

And fixed by: 
https://github.com/apache/spark/commit/51b1fe1426ffecac6c4644523633ea1562ff9a4e

Thanks for the quick turnaround!
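
For anyone following along, the gist of the problem appears to be that an unqualified table name in the CTAS source query was handed to Hive's SemanticAnalyzer without first being resolved against the session's current database. The sketch below is only an illustration of that resolution step, with hypothetical names -- it is not the actual code from the commits linked above:

{code}
// Hypothetical sketch only -- not the code from the commits linked above.
// Idea: an unqualified table name in a CTAS should be resolved against the
// session's current database before it is passed to the Hive metastore.
object QualifyTableName {
  def qualify(rawName: String, currentDatabase: String): String =
    rawName.split('.') match {
      case Array(db, table) => s"$db.$table"               // already qualified, keep as-is
      case Array(table)     => s"$currentDatabase.$table"  // prepend the current database
      case _                => sys.error(s"Malformed table name: $rawName")
    }

  def main(args: Array[String]): Unit = {
    println(qualify("test_table_1", "test_db"))           // test_db.test_table_1
    println(qualify("other_db.some_table", "test_db"))    // other_db.some_table
  }
}
{code}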

> CTAS Doesn't Use the Current Schema
> -----------------------------------
>
>                 Key: SPARK-4773
>                 URL: https://issues.apache.org/jira/browse/SPARK-4773
>             Project: Spark
>          Issue Type: Bug
>            Reporter: David Ross
>
> In a CTAS (CREATE TABLE __ AS SELECT __), the current schema isn't used. For example, all of the following works:
> {code}
> CREATE DATABASE test_db;
> USE test_db;
> CREATE TABLE test_table_1(s string);
> SELECT * FROM test_table_1;
> CREATE TABLE test_table_2 AS SELECT * FROM test_db.test_table_1;
> SELECT * FROM test_table_2;
> {code}
> But this fails:
> {code}
> CREATE TABLE test_table_3 AS SELECT * FROM test_table_1;
> {code}
> Message:
> {code}
> 14/12/06 00:28:57 ERROR thriftserver.SparkExecuteStatementOperation: Error executing query:
> org.apache.hadoop.hive.ql.parse.SemanticException: Line 1:43 Table not found 'test_table_1'
>       at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1324)
>       at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1053)
>       at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8342)
>       at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:284)
>       at org.apache.spark.sql.hive.execution.CreateTableAsSelect.metastoreRelation$lzycompute(CreateTableAsSelect.scala:59)
>       at org.apache.spark.sql.hive.execution.CreateTableAsSelect.metastoreRelation(CreateTableAsSelect.scala:55)
>       at org.apache.spark.sql.hive.execution.CreateTableAsSelect.sideEffectResult$lzycompute(CreateTableAsSelect.scala:82)
>       at org.apache.spark.sql.hive.execution.CreateTableAsSelect.sideEffectResult(CreateTableAsSelect.scala:70)
>       at org.apache.spark.sql.hive.execution.CreateTableAsSelect.execute(CreateTableAsSelect.scala:89)
>       at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
>       at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
>       at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>       at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
>       at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
>       at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim12.scala:190)
>       at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:193)
>       at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:175)
>       at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:150)
>       at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:207)
>       at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1133)
>       at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1118)
>       at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>       at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>       at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:58)
>       at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:55)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>       at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
>       at org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:55)
>       at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.hadoop.hive.ql.parse.SemanticException: Line 1:43 Table not found 'test_table_1'
>       at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1079)
>       ... 33 more
> 14/12/06 00:28:57 WARN thrift.ThriftCLIService: Error fetching results:
> org.apache.hive.service.cli.HiveSQLException: org.apache.hadoop.hive.ql.parse.SemanticException: Line 1:43 Table not found 'test_table_1'
>       at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim12.scala:221)
>       at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:193)
>       at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:175)
>       at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:150)
>       at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:207)
>       at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1133)
>       at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1118)
>       at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>       at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>       at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:58)
>       at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:55)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>       at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
>       at org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:55)
>       at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
> {code}
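> Until a build with the fix is available, the failing statement can be worked around by qualifying the table with the current database, exactly as in the working example above:
> {code}
> USE test_db;
> CREATE TABLE test_table_3 AS SELECT * FROM test_db.test_table_1;
> {code}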


