[ https://issues.apache.org/jira/browse/MAPREDUCE-6027?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14090325#comment-14090325 ]

Hadoop QA commented on MAPREDUCE-6027:
--------------------------------------

{color:green}+1 overall{color}.  Here are the results of testing the latest 
attachment 
  http://issues.apache.org/jira/secure/attachment/12660521/MAPREDUCE-6027.patch
  against trunk revision .

    {color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

    {color:green}+1 tests included{color}.  The patch appears to include 1 new 
or modified test file.

    {color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

    {color:green}+1 javadoc{color}.  There were no new javadoc warning messages.

    {color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

    {color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 2.0.3) warnings.

    {color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

    {color:green}+1 core tests{color}.  The patch passed unit tests in 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core.

    {color:green}+1 contrib tests{color}.  The patch passed contrib unit tests.

Test results: 
https://builds.apache.org/job/PreCommit-MAPREDUCE-Build/4793//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-MAPREDUCE-Build/4793//console

This message is automatically generated.

> mr jobs with relative paths can fail
> ------------------------------------
>
>                 Key: MAPREDUCE-6027
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-6027
>             Project: Hadoop Map/Reduce
>          Issue Type: Bug
>          Components: job submission
>            Reporter: Wing Yew Poon
>            Assignee: Wing Yew Poon
>         Attachments: MAPREDUCE-6027.patch
>
>
> I built hadoop from branch-2 and tried to run terasort as follows:
> {noformat}
> wypoon$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0-SNAPSHOT.jar terasort sort-input sort-output
> 14/08/07 08:57:55 INFO terasort.TeraSort: starting
> 2014-08-07 08:57:56.229 java[36572:1903] Unable to load realm info from SCDynamicStore
> 14/08/07 08:57:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 14/08/07 08:57:57 INFO input.FileInputFormat: Total input paths to process : 2
> Spent 156ms computing base-splits.
> Spent 2ms computing TeraScheduler splits.
> Computing input splits took 159ms
> Sampling 2 splits of 2
> Making 1 from 100000 sampled records
> Computing parititions took 626ms
> Spent 789ms computing partitions.
> 14/08/07 08:57:57 INFO client.RMProxy: Connecting to ResourceManager at localhost/127.0.0.1:8032
> 14/08/07 08:57:58 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/wypoon/.staging/job_1407426900134_0001
> java.lang.IllegalArgumentException: Can not create a Path from an empty URI
>       at org.apache.hadoop.fs.Path.checkPathArg(Path.java:140)
>       at org.apache.hadoop.fs.Path.<init>(Path.java:192)
>       at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
>       at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.checkPermissionOfOther(ClientDistributedCacheManager.java:275)
>       at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.ancestorsHaveExecutePermissions(ClientDistributedCacheManager.java:256)
>       at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.isPublic(ClientDistributedCacheManager.java:243)
>       at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineCacheVisibilities(ClientDistributedCacheManager.java:162)
>       at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:58)
>       at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
>       at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
>       at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
>       at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
>       at org.apache.hadoop.examples.terasort.TeraSort.run(TeraSort.java:316)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>       at org.apache.hadoop.examples.terasort.TeraSort.main(TeraSort.java:325)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
>       at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:145)
>       at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> {noformat}
> If I use absolute paths for the input and output directories, the job runs fine.
> This breakage is due to HADOOP-10876.
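
For readers following the trace above, here is a minimal, hypothetical sketch (not part of the attached patch) of the reported failure mode: on a branch-2 build that includes HADOOP-10876's validation, constructing an org.apache.hadoop.fs.Path from a URI whose path component is empty throws the IllegalArgumentException seen during the distributed-cache visibility check, while a fully qualified URI is accepted. The class name and the hdfs://localhost URI below are illustrative only.

{code:java}
import java.net.URI;

import org.apache.hadoop.fs.Path;

// Hypothetical repro sketch; assumes a branch-2 build containing HADOOP-10876.
public class EmptyUriPathRepro {
  public static void main(String[] args) throws Exception {
    // A fully qualified URI has a non-empty path component and is accepted.
    Path qualified = new Path(new URI("hdfs://localhost:8020/user/wypoon/sort-input"));
    System.out.println("qualified path ok: " + qualified);

    // A URI with an empty path component (which the cache-visibility walk can
    // reach for a relative job path, per the stack trace above) trips the
    // check added by HADOOP-10876 in the Path constructor:
    //   java.lang.IllegalArgumentException: Can not create a Path from an empty URI
    try {
      new Path(new URI(""));
    } catch (IllegalArgumentException expected) {
      System.out.println("as reported: " + expected.getMessage());
    }
  }
}
{code}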



--
This message was sent by Atlassian JIRA
(v6.2#6252)
