[ 
https://issues.apache.org/jira/browse/AMBARI-7618?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14157462#comment-14157462
 ] 

Hadoop QA commented on AMBARI-7618:
-----------------------------------

{color:red}-1 overall{color}.  Here are the results of testing the latest 
attachment 
  http://issues.apache.org/jira/secure/attachment/12672676/AMBARI-7618.patch
  against trunk revision .

    {color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

    {color:red}-1 tests included{color}.  The patch doesn't appear to include 
any new or modified tests.
                        Please justify why no new tests are needed for this 
patch.
                        Also please list what manual steps were performed to 
verify this patch.

    {color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

    {color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

    {color:red}-1 core tests{color}.  The patch failed these unit tests in 
ambari-server:

                  org.apache.ambari.server.actionmanager.TestActionScheduler

Test results: 
https://builds.apache.org/job/Ambari-trunk-test-patch/77//testReport/
Console output: 
https://builds.apache.org/job/Ambari-trunk-test-patch/77//console

This message is automatically generated.

> Hive load local command fails due to not finding hcatalog-core.jar 
> -------------------------------------------------------------------
>
>                 Key: AMBARI-7618
>                 URL: https://issues.apache.org/jira/browse/AMBARI-7618
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 1.7.0
>            Reporter: Alejandro Fernandez
>            Assignee: Alejandro Fernandez
>             Fix For: 1.7.0
>
>         Attachments: AMBARI-7618.patch
>
>
> Created a cluster using Ambari with the latest Champlain bits on a CentOS 6.4 VM.
> Then attempted to load some sample data:
> {code}
> cd /tmp
> wget http://seanlahman.com/files/database/lahman591-csv.zip
> unzip lahman591-csv.zip
> su - hive
> /usr/hdp/2.2.0.0-806/hadoop/bin/hadoop fs -copyFromLocal Schools.csv /tmp
> /usr/hdp/2.2.0.0-806/hive/bin/hive
> CREATE TABLE school (id STRING, name STRING, city STRING, state STRING, nick STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' STORED AS TEXTFILE;
> LOAD DATA LOCAL INPATH '/tmp/Schools.csv' INTO TABLE school;
> SELECT name FROM school ORDER BY name ASC LIMIT 10;
> {code}
> The SELECT statement fails with the following error:
> {code}
> java.io.FileNotFoundException: File file:/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar does not exist
>       at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:524)
>       at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:737)
>       at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:514)
>       at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:409)
>       at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
>       at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
>       at org.apache.hadoop.mapreduce.JobSubmitter.copyRemoteFiles(JobSubmitter.java:140)
>       at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:213)
>       at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
>       at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1291)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:1291)
>       at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
>       at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>       at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
>       at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
>       at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:420)
>       at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
>       at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:161)
>       at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>       at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1603)
>       at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1363)
>       at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1176)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1003)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:993)
>       at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:246)
>       at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:198)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:408)
>       at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
>       at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
>       at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>       at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Job Submission failed with exception 'java.io.FileNotFoundException(File file:/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar does not exist)'
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> {code}
> The hive-site.xml property for HDP 2.2 that controls where Hive loads auxiliary (user-defined) jars from needs to change now that HDP uses versioned RPMs.
> The property should contain:
> {code}
> hive.aux.jars.path=file:///usr/hdp/current/hive-hcatalog/share/hcatalog
> {code}
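> For illustration only (this snippet is not taken from the attached patch), the corrected value would appear in hive-site.xml using the standard Hadoop configuration XML format, roughly as follows:
> {code}
> <!-- sketch of the expected hive-site.xml entry; property name and value are from this issue -->
> <property>
>   <name>hive.aux.jars.path</name>
>   <value>file:///usr/hdp/current/hive-hcatalog/share/hcatalog</value>
> </property>
> {code}
> On an HDP 2.2 host, /usr/hdp/current/hive-hcatalog is expected to be a symlink into the active versioned install (e.g. /usr/hdp/2.2.0.0-806/hive-hcatalog), so this path keeps resolving correctly across versioned RPM upgrades.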



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
