I am trying Hive through the Java JDBC client. I can execute simple queries like
select * from table and select * from table where something="something", but
when I run join queries it throws the following error:

In my NetBeans IDE this is the exception:

Running: SELECT * FROM sampletab1 sp1  JOIN sampletab12 sp2 ON (sp1.id =
sp2.id) limit 10
Exception in thread "main" java.sql.SQLException: Query returned non-zero
code: 9, cause: FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.MapRedTask
    at
org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
    at witsmlstore.HiveJdbcJava.main(HiveJdbcJava.java:77)
Java Result: 1
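
For reference, this is roughly what my HiveJdbcJava client does (a minimal
sketch, not the exact class; the driver class matches the stack trace above,
but the host/port in the URL are just what I assume my setup uses):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Hive 0.7.1 JDBC driver for a standalone Hive server
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        Connection con =
            DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
        Statement stmt = con.createStatement();

        // Simple selects like this return rows without any problem
        ResultSet rs = stmt.executeQuery("SELECT * FROM sampletab1 LIMIT 10");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }

        // The join is the statement that fails with the SQLException shown above
        rs = stmt.executeQuery(
            "SELECT * FROM sampletab1 sp1 JOIN sampletab12 sp2 ON (sp1.id = sp2.id) LIMIT 10");
        while (rs.next()) {
            System.out.println(rs.getString(1) + "\t" + rs.getString(2));
        }
        con.close();
    }
}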


And at the server console, this is the output:


WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please
use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties
files.
Hive history
file=/tmp/shashwat/hive_job_log_shashwat_201203191222_772980239.txt
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks not specified. Estimated from input data size: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
Starting Job = job_201203191156_0003, Tracking URL =
http://localhost:50030/jobdetails.jsp?jobid=job_201203191156_0003
Kill Command = /home/shashwat/Hadoop/hadoop-0.20.205/libexec/../bin/hadoop
job  -Dmapred.job.tracker=localhost:9001 -kill job_201203191156_0003
2012-03-19 12:23:07,239 Stage-1 map = 0%,  reduce = 0%
2012-03-19 12:24:01,453 Stage-1 map = 100%,  reduce = 100%
Ended Job = job_201203191156_0003 with errors
FAILED: Execution Error, return code 2 from
org.apache.hadoop.hive.ql.exec.MapRedTask


When I checked in the task tracker, the error was:

(at http://localhost:50030/taskdetails.jsp?tipid=task_201203191156_0003_m_000000)

java.io.IOException: Cannot create an instance of InputSplit class =
org.apache.hadoop.hive.hbase.HBaseSplit:org.apache.hadoop.hive.hbase.HBaseSplit
        at 
org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:145)
        at 
org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
        at 
org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
        at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:396)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:412)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)



What may be the probable cause?

When I try embedded mode, it throws the following error:



12/03/19 12:27:39 INFO metastore.HiveMetaStore: 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
12/03/19 12:27:39 INFO metastore.ObjectStore: ObjectStore, initialize called
12/03/19 12:27:39 INFO DataNucleus.Persistence: Property
datanucleus.cache.level2 unknown - will be ignored
12/03/19 12:27:39 INFO DataNucleus.Persistence: Property
javax.jdo.option.NonTransactionalRead unknown - will be ignored
12/03/19 12:27:39 INFO DataNucleus.Persistence: =================
Persistence Configuration ===============
12/03/19 12:27:39 INFO DataNucleus.Persistence: DataNucleus Persistence
Factory - Vendor: "DataNucleus"  Version: "2.0.3"
12/03/19 12:27:39 INFO DataNucleus.Persistence: DataNucleus Persistence
Factory initialised for datastore
URL="jdbc:derby:;databaseName=metastore_db;create=true"
driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
12/03/19 12:27:39 INFO DataNucleus.Persistence:
===========================================================
12/03/19 12:27:42 INFO Datastore.Schema: Initialising Catalog "", Schema
"APP" using "None" auto-start option
12/03/19 12:27:42 INFO Datastore.Schema: Catalog "", Schema "APP"
initialised - managing 0 classes
12/03/19 12:27:42 INFO metastore.ObjectStore: Setting MetaStore object pin
classes with
hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
12/03/19 12:27:42 INFO DataNucleus.MetaData: Registering listener for
metadata initialisation
12/03/19 12:27:42 INFO metastore.ObjectStore: Initialized ObjectStore
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 11, column 6 : cvc-elt.1: Cannot find the declaration of element
'jdo'. - Please check your specification of DTD and the validity of the
MetaData XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 312, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 359, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 381, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 416, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 453, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 494, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 535, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 576, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 621, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 WARN DataNucleus.MetaData: MetaData Parser encountered an
error in file
"jar:file:/home/shashwat/Hadoop/hive-0.7.1/lib/hive-metastore-0.7.1.jar!/package.jdo"
at line 666, column 13 : The content of element type "class" must match
"(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)".
- Please check your specification of DTD and the validity of the MetaData
XML that you have specified.
12/03/19 12:27:42 INFO DataNucleus.Persistence: Managing Persistence of
Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS,
InheritanceStrategy : new-table]
12/03/19 12:27:42 INFO DataNucleus.Persistence: Managing Persistence of
Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table
: DATABASE_PARAMS]
12/03/19 12:27:42 INFO Datastore.Schema: Validating 2 unique key(s) for
table DBS
12/03/19 12:27:42 INFO Datastore.Schema: Validating 0 foreign key(s) for
table DBS
12/03/19 12:27:42 INFO Datastore.Schema: Validating 2 index(es) for table
DBS
12/03/19 12:27:42 INFO Datastore.Schema: Validating 1 unique key(s) for
table DATABASE_PARAMS
12/03/19 12:27:42 INFO Datastore.Schema: Validating 1 foreign key(s) for
table DATABASE_PARAMS
12/03/19 12:27:42 INFO Datastore.Schema: Validating 2 index(es) for table
DATABASE_PARAMS
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/commons/configuration/Configuration
    at
org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
    at
org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
    at
org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
    at
org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:196)
    at
org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
    at
org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
    at
org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:83)
    at
org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:189)
    at
org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
    at
org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
    at
org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:409)
    at
org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:395)
    at
org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1436)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1337)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:244)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:122)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
    at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:93)
    at
org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:125)
    at
org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:140)
    at
org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:146)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:434)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.access$200(HiveMetaStore.java:109)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$5.run(HiveMetaStore.java:454)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$5.run(HiveMetaStore.java:451)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:307)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:451)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:232)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:191)
    at
org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:80)
    at
org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:75)
    at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at witsmlstore.HiveJdbcJava.main(HiveJdbcJava.java:29)
Caused by: java.lang.ClassNotFoundException:
org.apache.commons.configuration.Configuration
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 36 more
Java Result: 1
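
For the embedded-mode attempt, the only change relative to the sketch above is
the connection line (again only a sketch of what I believe I am doing):

// Embedded mode: empty host/port, so Hive runs in-process and uses the local Derby metastore
Connection con = DriverManager.getConnection("jdbc:hive://", "", "");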


Please guide me as to where I am missing things. I have added the jar files to my
project, and I am able to create tables and fetch results from the Hive shell;
the same join query gives me results from the shell.
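
To narrow down the embedded-mode error, this is the kind of quick check I can
run in the same project (just a sketch; it only verifies whether the class that
the NoClassDefFoundError names is actually on my classpath):

public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            // The embedded-mode stack trace says this class cannot be found
            Class.forName("org.apache.commons.configuration.Configuration");
            System.out.println("commons-configuration is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("commons-configuration is NOT on the classpath: " + e);
        }
    }
}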

Regards
-- 
Shashwat Shriparv
