Actually, even that issue is resolved ☺ There is a spelling difference in the 
samples available on the web: all of them give the storage handler class as 
“EsStorageHandler”, but only Costin Leau’s post says it is 
“ESStorageHandler”, which is the one that works!
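
For the record, here is a minimal sketch of the DDL with the corrected class 
name. The column names match the insert query below, but the STRING types and 
the es.resource value are from my own setup, so adjust as needed:

CREATE EXTERNAL TABLE ex_address (name STRING, st_no STRING, st_name STRING, 
city STRING, state STRING, zip STRING)
STORED BY 'org.elasticsearch.hadoop.hive.ESStorageHandler'
TBLPROPERTIES('es.resource' = 'employee/address');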

Now the next problem: the MapReduce job is failing for some reason. I am still 
a beginner at Hadoop, so I am not exactly sure where to debug. Here are some 
logs; it looks like there is a bad character reference (“&#”) in the job.xml 
file. But that file is generated by Hive, right?
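
In case it helps anyone reproduce this, the exact spot the parser complains 
about can be inspected with something like the following (the job.xml path is 
taken from the container logs further down):

sed -n '606p' /tmp/hadoop-sas/nm-local-dir/usercache/sas/appcache/application_1417158738771_0001/container_1417158738771_0001_02_000001/job.xml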

Hive Log:------------------------------------------------
hive> insert overwrite table ex_address select name, st_no, st_name, city, 
state, zip from employee.address;
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1417158738771_0001, Tracking URL = 
http://finattr-comp-dev-01:8088/proxy/application_1417158738771_0001/
Kill Command = /apps/hadoop-2.5.1/bin/hadoop job  -kill job_1417158738771_0001
Hadoop job information for Stage-0: number of mappers: 0; number of reducers: 0
2014-11-27 23:13:37,547 Stage-0 map = 0%,  reduce = 0%
Ended Job = job_1417158738771_0001 with errors
Error during job, obtaining debugging information...
FAILED: Execution Error, return code 2 from 
org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Job 0:  HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

Container Job Logs:

Stderr:---------------------------------------------
[sas@finattr-comp-dev-01 container_1417158738771_0001_02_000001]$ cat stderr
[Fatal Error] job.xml:606:51: Character reference "&#
log4j:WARN No appenders could be found for logger 
(org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
info.

Syslog:---------------------------------------------
[sas@finattr-comp-dev-01 container_1417158738771_0001_02_000001]$ cat syslog
2014-11-27 23:13:36,023 INFO [main] 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for 
application appattempt_1417158738771_0001_000002
2014-11-27 23:13:36,334 FATAL [main] org.apache.hadoop.conf.Configuration: 
error parsing conf job.xml
org.xml.sax.SAXParseException; systemId: 
file:///tmp/hadoop-sas/nm-local-dir/usercache/sas/appcache/application_1417158738771_0001/container_1417158738771_0001_02_000001/job.xml;
 lineNumber: 606; columnNumber: 51; Character reference "&#
        at 
com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:257)
        at 
com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:347)
        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2183)
        at 
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2252)
        at 
org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2205)
        at 
org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2112)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1078)
        at 
org.apache.hadoop.mapreduce.v2.util.MRWebAppUtil.initialize(MRWebAppUtil.java:50)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1407)
2014-11-27 23:13:36,337 FATAL [main] 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: 
file:///tmp/hadoop-sas/nm-local-dir/usercache/sas/appcache/application_1417158738771_0001/container_1417158738771_0001_02_000001/job.xml;
 lineNumber: 606; columnNumber: 51; Character reference "&#
        at 
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2348)
        at 
org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2205)
        at 
org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2112)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1078)
        at 
org.apache.hadoop.mapreduce.v2.util.MRWebAppUtil.initialize(MRWebAppUtil.java:50)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1407)
Caused by: org.xml.sax.SAXParseException; systemId: 
file:///tmp/hadoop-sas/nm-local-dir/usercache/sas/appcache/application_1417158738771_0001/container_1417158738771_0001_02_000001/job.xml;
 lineNumber: 606; columnNumber: 51; Character reference "&#
        at 
com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:257)
        at 
com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:347)
        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2183)
        at 
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2252)
        ... 5 more
2014-11-27 23:13:36,340 INFO [main] org.apache.hadoop.util.ExitUtil: Exiting 
with status 1

Thanks
- Atul

From: Muthu Pandi [mailto:muthu1...@gmail.com]
Sent: Thursday, November 27, 2014 11:20 PM
To: user@hive.apache.org
Subject: Re: Can't integrate Elasticsearch with Hive

How did you get elasticsearch jar??

Try to build the jar against your version and use that jar.


Regards
Muthupandi.K

Think before you print.



On Fri, Nov 28, 2014 at 12:23 PM, Atul Paldhikar (apaldhik) 
<apald...@cisco.com> wrote:
Thanks Muthu for pointing out the issue.

However, unfortunately, I am back to the same old issue while accessing the 
external table. I am getting the following exception again, even though 
nothing changed in the environment.

I tried all the cleanup (in case two different es-hadoop jars were clashing on 
the classpath), but still no luck :-( Here are the steps I tried:

1. Removed the other es-hadoop jar (the 2.1.x one) completely from the server.

2. Updated hive-site.xml as below, but it didn't work:

<property>
  <name>hive.aux.jars.path</name>
  <value>/apps/sas/elasticsearch-hadoop-2.0.2/dist/elasticsearch-hadoop-2.0.2.jar</value>
  <description>A comma separated list (with no spaces) of the jar files</description>
</property>

3. Passed the jar via --hiveconf as below; still the same issue:

hive --hiveconf 
hive.aux.jars.path=/apps/sas/elasticsearch-hadoop-2.0.2/dist/elasticsearch-hadoop-2.0.2.jar

4. Tried adding the jar within the Hive session; it still didn't work:

add jar 
/apps/sas/elasticsearch-hadoop-2.0.2/dist/elasticsearch-hadoop-2.0.2.jar;
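
As a sanity check that the jar really made it into the session, Hive's 
standard "list jars" command should print the path after the add jar above:

hive> list jars;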

Thanks
- Atul

ERROR:

2014-11-26 23:09:22,069 ERROR [main]: exec.DDLTask (DDLTask.java:execute(478)) 
- java.lang.IllegalAccessError: tried to access class 
org.elasticsearch.hadoop.hive.HiveUtils from class 
org.elasticsearch.hadoop.hive.EsSerDe
        at org.elasticsearch.hadoop.hive.EsSerDe.initialize(EsSerDe.java:81)
        at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:339)
        at 
org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:288)
        at 
org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:281)
        at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:631)
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:593)
        at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4189)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
        at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
        at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
        at 
org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:792)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

2014-11-26 23:09:22,069 ERROR [main]: ql.Driver 
(SessionState.java:printError(545)) - FAILED: Execution Error, return code 1 
from org.apache.hadoop.hive.ql.exec.DDLTask. tried to access class 
org.elasticsearch.hadoop.hive.HiveUtils from class 
org.elasticsearch.hadoop.hive.EsSerDe

From: Muthu Pandi [mailto:muthu1...@gmail.com]
Sent: Thursday, November 27, 2014 9:08 PM
To: user@hive.apache.org
Subject: Re: Can't integrate Elasticsearch with Hive

Hi Atul

     Check your typo: "tblproperties('es.resources' = 'employee/address')" 
needs to be "es.resource", as rightly indicated in the error.
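
That is, keeping your index/type the same, the clause should read:

tblproperties('es.resource' = 'employee/address')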


Regards
Muthupandi.K

Think before you print.



On Fri, Nov 28, 2014 at 3:43 AM, Edward Capriolo 
<edlinuxg...@gmail.com> wrote:
Because the Elasticsearch storage handler is not part of Hive, you might be 
able to find more help by asking the author of the storage handler. From the 
errors you are reporting, it likely does not work with, and was not tested 
against, this version of Hive.

