[ https://issues.apache.org/jira/browse/SPARK-16604?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-16604.
----------------------------------
    Resolution: Cannot Reproduce

This sounds almost impossible to reproduce, so I am resolving it. Unless the reporter is active and can still reproduce it against the current master, I don't think anyone else is going to reproduce or resolve it. Please reopen if this can be reproduced on the current master.

> Spark 2.0 fails to execute a SQL statement that includes a partition field in the "select" list, while Spark 1.6 supports it
> -----------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16604
>                 URL: https://issues.apache.org/jira/browse/SPARK-16604
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: marymwu
>
> Spark 2.0 fails to execute a SQL statement that includes a partition field in the "select" list.
> Error:
> 16/07/14 16:10:47 INFO HiveThriftServer2: set sessionId(69e92ba1-4be2-4be9-bc81-7a00c5802ef8) to exeId(c93f69b0-0f6e-4f07-afdc-ca6c41045fa3)
> 16/07/14 16:10:47 INFO SparkSqlParser: Parsing command: INSERT OVERWRITE TABLE d_avatar.RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD_IMPALA PARTITION(p_event_date='2016-07-13')
> select
> app_key,
> app_version,
> app_channel,
> device_model,
> total_num,
> new_num,
> active_num,
> extant_num,
> visits_num,
> start_num,
> p_event_date
> from RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD where p_event_date = '2016-07-13'
> 16/07/14 16:10:47 INFO ThriftHttpServlet: Could not validate cookie sent, will try to generate a new cookie
> 16/07/14 16:10:47 INFO ThriftHttpServlet: Cookie added for clientUserName hive
> 16/07/14 16:10:47 INFO HiveMetaStore: 108: get_table : db=default tbl=rps__h_report_more_dimension_more_norm_first_channel_vcd
> 16/07/14 16:10:47 INFO audit: ugi=u_reaper ip=unknown-ip-addr cmd=get_table : db=default tbl=rps__h_report_more_dimension_more_norm_first_channel_vcd
> 16/07/14 16:10:47 INFO HiveMetaStore: 108: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 16/07/14 16:10:47 INFO ObjectStore: ObjectStore, initialize called
> 16/07/14 16:10:47 INFO ThriftHttpServlet: Could not validate cookie sent, will try to generate a new cookie
> 16/07/14 16:10:47 INFO ThriftHttpServlet: Cookie added for clientUserName hive
> 16/07/14 16:10:47 INFO Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
> 16/07/14 16:10:47 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
> 16/07/14 16:10:47 INFO ObjectStore: Initialized ObjectStore
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO HiveMetaStore: 108: get_table : db=d_avatar tbl=rps__h_report_more_dimension_more_norm_first_channel_vcd_impala
> 16/07/14 16:10:47 INFO audit: ugi=u_reaper ip=unknown-ip-addr cmd=get_table : db=d_avatar tbl=rps__h_report_more_dimension_more_norm_first_channel_vcd_impala
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: string
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:47 INFO CatalystSqlParser: Parsing command: bigint
> 16/07/14 16:10:49 WARN HiveSessionState$$anon$1: Max iterations (100) reached for batch Resolution
> 16/07/14 16:10:49 ERROR SparkExecuteStatementOperation: Error executing query, currentState RUNNING,
> org.apache.spark.sql.AnalysisException: unresolved operator 'InsertIntoTable MetastoreRelation d_avatar, rps__h_report_more_dimension_more_norm_first_channel_vcd_impala, None, Map(p_event_date -> Some(2016-07-13)), true, false;
> 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.failAnalysis(CheckAnalysis.scala:39)
> 	at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:56)
> 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:309)
> 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:51)
> 	at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:125)
> 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:51)
> 	at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:56)
> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:48)
> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:62)
> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:541)
> 	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:673)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:213)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:157)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:154)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:167)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> 16/07/14 16:10:49 ERROR SparkExecuteStatementOperation: Error running hive query:
> org.apache.hive.service.cli.HiveSQLException: org.apache.spark.sql.AnalysisException: unresolved operator 'InsertIntoTable MetastoreRelation d_avatar, rps__h_report_more_dimension_more_norm_first_channel_vcd_impala, None, Map(p_event_date -> Some(2016-07-13)), true, false;
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:248)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:157)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:154)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
> 	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:167)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> 16/07/14 16:10:49 INFO ThriftHttpServlet: Could not validate cookie sent, will try to generate a new cookie
> 16/07/14 16:10:49 INFO ThriftHttpServlet: Cookie added for clientUserName hive
> 16/07/14 16:10:49 INFO ThriftHttpServlet: Could not validate cookie sent, will try to generate a new cookie
> 16/07/14 16:10:49 INFO ThriftHttpServlet: Cookie added for clientUserName hive
> 16/07/14 16:10:49 INFO ThriftHttpServlet: Could not validate cookie sent, will try to generate a new cookie
> 16/07/14 16:10:49 INFO ThriftHttpServlet: Cookie added for clientUserName hive
> 16/07/14 16:10:49 INFO ThriftHttpServlet: Could not validate cookie sent, will try to generate a new cookie
> 16/07/14 16:10:49 INFO ThriftHttpServlet: Cookie added for clientUserName hive
> Testcase:
> spark sql: [
> INSERT OVERWRITE TABLE d_avatar.RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD_IMPALA PARTITION(p_event_date='2016-07-13')
> select
> app_key,
> app_version,
> app_channel,
> device_model,
> total_num,
> new_num,
> active_num,
> extant_num,
> visits_num,
> start_num,
> p_event_date
> from RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD where p_event_date = '2016-07-13';
> ],
> jdbcUrl:[jdbc:hive2://10.0.160.51:10000/default;transportMode=http;httpPath=cliservice;principal=spark/slave426.avatar.lenovomm....@avatar.lenovomm.com;hive.server2.proxy.user=u_avatar],
> param:[StatParam [appName=avatar, appUser=u_avatar, execDate=2016-07-13, processId=1893, taskId=43966]],
> errorInfo:[org.apache.spark.sql.AnalysisException: unresolved operator 'InsertIntoTable MetastoreRelation d_avatar, rps__h_report_more_dimension_more_norm_first_channel_vcd_impala, None, Map(p_event_date -> Some(2016-07-13)), true, false;][home]
> -------------------
> Table definitions for d_avatar.RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD_IMPALA and RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD:
> CREATE TABLE `RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD`(`app_key` string, `app_channel` string, `app_version` string, `device_model` string, `total_num` bigint, `new_num` bigint, `active_num` bigint, `extant_num` bigint, `visits_num` bigint, `start_num` bigint)
> PARTITIONED BY (`p_event_date` string)
> ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
> WITH SERDEPROPERTIES ('serialization.format' = '1')
> STORED AS
> INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
> OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
> TBLPROPERTIES ('transient_lastDdlTime' = '1385178543');
> CREATE TABLE `d_avatar.RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD_IMPALA`(`app_key` string, `app_version` string, `app_channel` string, `device_model` string, `total_uv` bigint, `new_user` bigint, `uv` bigint, `extant_user` bigint, `pv` bigint, `start_times` bigint)
> PARTITIONED BY (`p_event_date` string)
> ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
> WITH SERDEPROPERTIES ('serialization.format' = '1')
> STORED AS
> INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
> OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
> TBLPROPERTIES ('last_modified_by' = 'u_avatar',
> 'last_modified_time' = '1449840124',
> 'transient_lastDdlTime' = '1449840124');

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
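
Editor's note: the DDL above shows the target table has 10 data columns plus the `p_event_date` partition column, while the failing statement combines a static partition spec with a select list of 11 columns (including `p_event_date`). A plausible explanation for the unresolved `InsertIntoTable` is this column-count mismatch, which Spark 1.6 may have tolerated. The following is a hedged workaround sketch, not a confirmed fix, using the table and column names from the DDL above:

```sql
-- Option 1 (static partition): drop the partition column from the select list
-- so the 10 selected columns match the target table's 10 data columns.
INSERT OVERWRITE TABLE d_avatar.RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD_IMPALA
PARTITION (p_event_date = '2016-07-13')
SELECT app_key, app_version, app_channel, device_model,
       total_num, new_num, active_num, extant_num, visits_num, start_num
FROM RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD
WHERE p_event_date = '2016-07-13';

-- Option 2 (dynamic partition): keep p_event_date in the select list (it must
-- come last) and leave the partition value unspecified.
SET hive.exec.dynamic.partition.mode = nonstrict;
INSERT OVERWRITE TABLE d_avatar.RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD_IMPALA
PARTITION (p_event_date)
SELECT app_key, app_version, app_channel, device_model,
       total_num, new_num, active_num, extant_num, visits_num, start_num,
       p_event_date
FROM RPS__H_REPORT_MORE_DIMENSION_MORE_NORM_FIRST_CHANNEL_VCD
WHERE p_event_date = '2016-07-13';
```

Either form supplies exactly the number of columns the analyzer expects for the insert, which is the usual Hive/Spark convention for static versus dynamic partition inserts.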