[ https://issues.apache.org/jira/browse/SPARK-5391?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14308627#comment-14308627 ]

Muthupandi K edited comment on SPARK-5391 at 2/6/15 5:13 AM:
-------------------------------------------------------------

The same error occurs when a table created in Hive with the JSON SerDe is 
queried from SparkSQL.



> SparkSQL fails to create tables with custom JSON SerDe
> ------------------------------------------------------
>
>                 Key: SPARK-5391
>                 URL: https://issues.apache.org/jira/browse/SPARK-5391
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: David Ross
>
> - Using Spark built from trunk on this commit: 
> https://github.com/apache/spark/commit/bc20a52b34e826895d0dcc1d783c021ebd456ebd
> - Built for Hive 0.13
> - Using this JSON serde: https://github.com/rcongiu/Hive-JSON-Serde
> First, download the jar locally:
> {code}
> $ curl http://www.congiu.net/hive-json-serde/1.3/cdh5/json-serde-1.3-jar-with-dependencies.jar \
>     > /tmp/json-serde-1.3-jar-with-dependencies.jar
> {code}
> Then add it in the SparkSQL session:
> {code}
> add jar /tmp/json-serde-1.3-jar-with-dependencies.jar
> {code}
> Finally, create the table:
> {code}
> create table test_json (c1 boolean) ROW FORMAT SERDE 
> 'org.openx.data.jsonserde.JsonSerDe';
> {code}
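> For reference, the same steps can be driven through the HiveContext API that 
> appears in the stack traces below. This is a minimal, hypothetical sketch: 
> the spark-shell session is assumed, and only the paths and statements come 
> from this report.
> {code}
> // Hypothetical spark-shell session (Spark 1.2-era API, per the traces
> // below); `sc` is the shell's pre-built SparkContext.
> import org.apache.spark.sql.hive.HiveContext
>
> val hiveContext = new HiveContext(sc)
> // Register the SerDe jar with the session, then issue the same DDL.
> hiveContext.sql("add jar /tmp/json-serde-1.3-jar-with-dependencies.jar")
> hiveContext.sql("create table test_json (c1 boolean) " +
>   "ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'")  // fails as below
> {code}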
> Logs for add jar:
> {code}
> 15/01/23 23:48:33 INFO thriftserver.SparkExecuteStatementOperation: Running 
> query 'add jar /tmp/json-serde-1.3-jar-with-dependencies.jar'
> 15/01/23 23:48:34 INFO session.SessionState: No Tez session required at this 
> point. hive.execution.engine=mr.
> 15/01/23 23:48:34 INFO SessionState: Added 
> /tmp/json-serde-1.3-jar-with-dependencies.jar to class path
> 15/01/23 23:48:34 INFO SessionState: Added resource: 
> /tmp/json-serde-1.3-jar-with-dependencies.jar
> 15/01/23 23:48:34 INFO spark.SparkContext: Added JAR 
> /tmp/json-serde-1.3-jar-with-dependencies.jar at 
> http://192.168.99.9:51312/jars/json-serde-1.3-jar-with-dependencies.jar with 
> timestamp 1422056914776
> 15/01/23 23:48:34 INFO thriftserver.SparkExecuteStatementOperation: Result 
> Schema: List()
> 15/01/23 23:48:34 INFO thriftserver.SparkExecuteStatementOperation: Result 
> Schema: List()
> {code}
> Logs (with error) for create table:
> {code}
> 15/01/23 23:49:00 INFO thriftserver.SparkExecuteStatementOperation: Running 
> query 'create table test_json (c1 boolean) ROW FORMAT SERDE 
> 'org.openx.data.jsonserde.JsonSerDe''
> 15/01/23 23:49:00 INFO parse.ParseDriver: Parsing command: create table 
> test_json (c1 boolean) ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
> 15/01/23 23:49:01 INFO parse.ParseDriver: Parse Completed
> 15/01/23 23:49:01 INFO session.SessionState: No Tez session required at this 
> point. hive.execution.engine=mr.
> 15/01/23 23:49:01 INFO log.PerfLogger: <PERFLOG method=Driver.run 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO log.PerfLogger: <PERFLOG method=TimeToSubmit 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO ql.Driver: Concurrency mode is disabled, not creating 
> a lock manager
> 15/01/23 23:49:01 INFO log.PerfLogger: <PERFLOG method=compile 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO log.PerfLogger: <PERFLOG method=parse 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO parse.ParseDriver: Parsing command: create table 
> test_json (c1 boolean) ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
> 15/01/23 23:49:01 INFO parse.ParseDriver: Parse Completed
> 15/01/23 23:49:01 INFO log.PerfLogger: </PERFLOG method=parse 
> start=1422056941103 end=1422056941104 duration=1 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO log.PerfLogger: <PERFLOG method=semanticAnalyze 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
> 15/01/23 23:49:01 INFO parse.SemanticAnalyzer: Creating table test_json 
> position=13
> 15/01/23 23:49:01 INFO ql.Driver: Semantic Analysis Completed
> 15/01/23 23:49:01 INFO log.PerfLogger: </PERFLOG method=semanticAnalyze 
> start=1422056941104 end=1422056941240 duration=136 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO ql.Driver: Returning Hive schema: 
> Schema(fieldSchemas:null, properties:null)
> 15/01/23 23:49:01 INFO log.PerfLogger: </PERFLOG method=compile 
> start=1422056941071 end=1422056941252 duration=181 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO log.PerfLogger: <PERFLOG method=Driver.execute 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO ql.Driver: Starting command: create table test_json 
> (c1 boolean) ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
> 15/01/23 23:49:01 INFO log.PerfLogger: </PERFLOG method=TimeToSubmit 
> start=1422056941067 end=1422056941258 duration=191 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO log.PerfLogger: <PERFLOG method=runTasks 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO log.PerfLogger: <PERFLOG method=task.DDL.Stage-0 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 WARN security.ShellBasedUnixGroupsMapping: got exception 
> trying to get groups for user anonymous
> org.apache.hadoop.util.Shell$ExitCodeException: id: anonymous: No such user
>   at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
>   at org.apache.hadoop.util.Shell.run(Shell.java:418)
>   at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:83)
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:52)
>   at 
> org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:50)
>   at org.apache.hadoop.security.Groups.getGroups(Groups.java:139)
>   at 
> org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1409)
>   at 
> org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:63)
>   at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>   at 
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>   at 
> org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:424)
>   at 
> org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:377)
>   at 
> org.apache.hadoop.hive.ql.session.SessionState.getAuthenticator(SessionState.java:867)
>   at 
> org.apache.hadoop.hive.ql.session.SessionState.getUserFromAuthenticator(SessionState.java:589)
>   at org.apache.hadoop.hive.ql.metadata.Table.getEmptyTable(Table.java:174)
>   at org.apache.hadoop.hive.ql.metadata.Table.<init>(Table.java:116)
>   at org.apache.hadoop.hive.ql.metadata.Hive.newTable(Hive.java:2566)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4046)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>   at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>   at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>   at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
>   at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
>   at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
>   at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:292)
>   at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:264)
>   at 
> org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:37)
>   at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:53)
>   at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:53)
>   at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:61)
>   at 
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:474)
>   at 
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:474)
>   at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>   at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:107)
>   at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:73)
>   at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:160)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:212)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:483)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>   at 
> org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
>   at com.sun.proxy.$Proxy18.executeStatement(Unknown Source)
>   at 
> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:220)
>   at 
> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>   at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>   at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>   at 
> org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
>   at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:745)
> 15/01/23 23:49:01 WARN security.UserGroupInformation: No groups available for 
> user anonymous
> 15/01/23 23:49:01 WARN security.ShellBasedUnixGroupsMapping: got exception 
> trying to get groups for user anonymous
> org.apache.hadoop.util.Shell$ExitCodeException: id: anonymous: No such user
>   at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
>   at org.apache.hadoop.util.Shell.run(Shell.java:418)
>   at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:83)
>   at 
> org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:52)
>   at 
> org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:50)
>   at org.apache.hadoop.security.Groups.getGroups(Groups.java:139)
>   at 
> org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1409)
>   at 
> org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:64)
>   at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>   at 
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>   at 
> org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:424)
>   at 
> org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:377)
>   at 
> org.apache.hadoop.hive.ql.session.SessionState.getAuthenticator(SessionState.java:867)
>   at 
> org.apache.hadoop.hive.ql.session.SessionState.getUserFromAuthenticator(SessionState.java:589)
>   at org.apache.hadoop.hive.ql.metadata.Table.getEmptyTable(Table.java:174)
>   at org.apache.hadoop.hive.ql.metadata.Table.<init>(Table.java:116)
>   at org.apache.hadoop.hive.ql.metadata.Hive.newTable(Hive.java:2566)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4046)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>   at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>   at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>   at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
>   at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
>   at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
>   at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:292)
>   at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:264)
>   at 
> org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:37)
>   at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:53)
>   at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:53)
>   at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:61)
>   at 
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:474)
>   at 
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:474)
>   at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>   at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:107)
>   at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:73)
>   at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:160)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:212)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:483)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>   at 
> org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
>   at com.sun.proxy.$Proxy18.executeStatement(Unknown Source)
>   at 
> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:220)
>   at 
> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>   at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>   at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>   at 
> org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
>   at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:745)
> 15/01/23 23:49:01 WARN security.UserGroupInformation: No groups available for 
> user anonymous
> 15/01/23 23:49:01 ERROR exec.DDLTask: 
> org.apache.hadoop.hive.ql.metadata.HiveException: Cannot validate serde: 
> org.openx.data.jsonserde.JsonSerDe
>   at org.apache.hadoop.hive.ql.exec.DDLTask.validateSerDe(DDLTask.java:3952)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4084)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>   at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>   at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>   at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
>   at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
>   at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
>   at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:292)
>   at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:264)
>   at 
> org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:37)
>   at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:53)
>   at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:53)
>   at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:61)
>   at 
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:474)
>   at 
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:474)
>   at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>   at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:107)
>   at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:73)
>   at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:160)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:212)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:483)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>   at 
> org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
>   at com.sun.proxy.$Proxy18.executeStatement(Unknown Source)
>   at 
> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:220)
>   at 
> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>   at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>   at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>   at 
> org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
>   at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ClassNotFoundException: Class 
> org.openx.data.jsonserde.JsonSerDe not found
>   at 
> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1801)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.validateSerDe(DDLTask.java:3946)
>   ... 47 more
> 15/01/23 23:49:01 ERROR ql.Driver: FAILED: Execution Error, return code 1 
> from org.apache.hadoop.hive.ql.exec.DDLTask. Cannot validate serde: 
> org.openx.data.jsonserde.JsonSerDe
> 15/01/23 23:49:01 INFO log.PerfLogger: </PERFLOG method=Driver.execute 
> start=1422056941252 end=1422056941379 duration=127 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO log.PerfLogger: <PERFLOG method=releaseLocks 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 INFO log.PerfLogger: </PERFLOG method=releaseLocks 
> start=1422056941379 end=1422056941379 duration=0 
> from=org.apache.hadoop.hive.ql.Driver>
> 15/01/23 23:49:01 ERROR hive.HiveContext:
> ======================
> HIVE FAILURE OUTPUT
> ======================
> SET spark.sql.codegen=false
> SET spark.sql.parquet.binaryAsString=true
> SET spark.sql.parquet.cacheMetadata=true
> SET spark.sql.hive.version=0.13.1
> SET spark.sql.autoBroadcastJoinThreshold=500000
> SET spark.sql.shuffle.partitions=10
> ADD JAR /tmp/json-serde-1.3-jar-with-dependencies.jar
> Added /tmp/json-serde-1.3-jar-with-dependencies.jar to class path
> Added resource: /tmp/json-serde-1.3-jar-with-dependencies.jar
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Cannot validate serde: 
> org.openx.data.jsonserde.JsonSerDe
> ======================
> END HIVE FAILURE OUTPUT
> ======================
> 15/01/23 23:49:01 ERROR thriftserver.SparkExecuteStatementOperation: Error 
> executing query:
> org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution 
> Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Cannot 
> validate serde: org.openx.data.jsonserde.JsonSerDe
>   at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:296)
>   at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:264)
>   at 
> org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:37)
>   at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:53)
>   at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:53)
>   at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:61)
>   at 
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:474)
>   at 
> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:474)
>   at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>   at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:107)
>   at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:73)
>   at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:160)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:212)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:483)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>   at 
> org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
>   at com.sun.proxy.$Proxy18.executeStatement(Unknown Source)
>   at 
> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:220)
>   at 
> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>   at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>   at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>   at 
> org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
>   at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:745)
> 15/01/23 23:49:01 WARN thrift.ThriftCLIService: Error executing statement:
> org.apache.hive.service.cli.HiveSQLException: 
> org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution 
> Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Cannot 
> validate serde: org.openx.data.jsonserde.JsonSerDe
>   at 
> org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.run(Shim13.scala:189)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:231)
>   at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:212)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:483)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>   at 
> org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:493)
>   at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
>   at com.sun.proxy.$Proxy18.executeStatement(Unknown Source)
>   at 
> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:220)
>   at 
> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:344)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>   at 
> org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>   at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>   at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>   at 
> org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:55)
>   at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>   at java.lang.Thread.run(Thread.java:745)
> {code}
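> The "Caused by" line above points at a classloader split: Hive's 
> DDLTask.validateSerDe resolves the SerDe through Configuration.getClassByName, 
> which consults the Configuration's own classloader rather than the session 
> classloader that "add jar" updated, so validation fails even though the jar 
> was added successfully. A minimal sketch of that resolution step (illustrative 
> only; assumes hadoop-common on the classpath):
> {code}
> import org.apache.hadoop.conf.Configuration
>
> val conf = new Configuration()
> // Resolves against the classloader attached to `conf` (the JVM default
> // unless one is set explicitly). If the SerDe jar was only registered via
> // "add jar", this throws the same ClassNotFoundException seen in the log.
> val serdeClass = conf.getClassByName("org.openx.data.jsonserde.JsonSerDe")
> {code}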


