[jira] [Commented] (HADOOP-6311) Add support for unix domain sockets to JNI libs

2012-11-05 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-6311?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491217#comment-13491217
 ] 

Hadoop QA commented on HADOOP-6311:
---

{color:red}-1 overall{color}.  Here are the results of testing the latest 
attachment 
  http://issues.apache.org/jira/secure/attachment/12552207/HADOOP-6311.023.patch
  against trunk revision .

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:green}+1 tests included{color}.  The patch appears to include 3 new 
or modified test files.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:red}-1 javadoc{color}.  The javadoc tool appears to have generated 6 
warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:red}-1 findbugs{color}.  The patch appears to introduce 1 new 
Findbugs (version 1.3.9) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:red}-1 core tests{color}.  The patch failed these unit tests in 
hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs:

  org.apache.hadoop.net.unix.TestDomainSocket
  org.apache.hadoop.hdfs.server.namenode.TestBackupNode

{color:green}+1 contrib tests{color}.  The patch passed contrib unit tests.

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1710//testReport/
Findbugs warnings: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1710//artifact/trunk/patchprocess/newPatchFindbugsWarningshadoop-common.html
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1710//console

This message is automatically generated.

> Add support for unix domain sockets to JNI libs
> ---
>
> Key: HADOOP-6311
> URL: https://issues.apache.org/jira/browse/HADOOP-6311
> Project: Hadoop Common
>  Issue Type: New Feature
>  Components: native
>Affects Versions: 0.20.0
>Reporter: Todd Lipcon
>Assignee: Colin Patrick McCabe
> Attachments: 6311-trunk-inprogress.txt, design.txt, 
> HADOOP-6311.014.patch, HADOOP-6311.016.patch, HADOOP-6311.018.patch, 
> HADOOP-6311.020b.patch, HADOOP-6311.020.patch, HADOOP-6311.021.patch, 
> HADOOP-6311.022.patch, HADOOP-6311.023.patch, HADOOP-6311-0.patch, 
> HADOOP-6311-1.patch, hadoop-6311.txt
>
>
> For HDFS-347 we need to use unix domain sockets. This JIRA is to include a 
> library in common which adds an o.a.h.net.unix package based on the code from 
> Android (apache 2 license).

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8615) EOFException in DecompressorStream.java needs to be more verbose

2012-11-05 Thread thomastechs (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8615?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491202#comment-13491202
 ] 

thomastechs commented on HADOOP-8615:
-

Thanks, Andy, for the review. To incorporate these fixes, should I take the 
latest from trunk again, since it has been a week now? Or should I make these 
whitespace fixes in the existing patch? I am new to this JIRA process, so 
please let me know your thoughts.
Thanks,
Thomas.

> EOFException in DecompressorStream.java needs to be more verbose
> 
>
> Key: HADOOP-8615
> URL: https://issues.apache.org/jira/browse/HADOOP-8615
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: io
>Affects Versions: 0.20.2
>Reporter: Jeff Lord
>  Labels: patch
> Attachments: HADOOP-8615.patch, HADOOP-8615-release-0.20.2.patch, 
> HADOOP-8615-ver2.patch
>
>
> In ./src/core/org/apache/hadoop/io/compress/DecompressorStream.java
> The following exception should at least report the file it was reading when it 
> hit this error:
>   protected void getCompressedData() throws IOException {
> checkStream();
> int n = in.read(buffer, 0, buffer.length);
> if (n == -1) {
>   throw new EOFException("Unexpected end of input stream");
> }
> This would help greatly to debug bad/corrupt files.
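
A hedged sketch of the shape such a change could take; the fileName plumbing 
below is an assumption for illustration, not the attached patch's code:

{code}
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

// Sketch only: a stream wrapper that remembers which file it reads so the
// EOFException can name it.
class VerboseEofExample {
  private final InputStream in;
  private final byte[] buffer = new byte[4096];
  private final String fileName;

  VerboseEofExample(InputStream in, String fileName) {
    this.in = in;
    this.fileName = fileName;
  }

  protected int getCompressedData() throws IOException {
    int n = in.read(buffer, 0, buffer.length);
    if (n == -1) {
      throw new EOFException("Unexpected end of input stream, file = " + fileName);
    }
    return n;
  }
}
{code}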

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-9004) Allow security unit tests to use external KDC

2012-11-05 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9004?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491195#comment-13491195
 ] 

Hadoop QA commented on HADOOP-9004:
---

{color:green}+1 overall{color}.  Here are the results of testing the latest 
attachment 
  http://issues.apache.org/jira/secure/attachment/12552200/HADOOP-9004.patch.008
  against trunk revision .

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:green}+1 tests included{color}.  The patch appears to include 5 new 
or modified test files.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:green}+1 javadoc{color}.  The javadoc tool did not generate any 
warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 1.3.9) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:green}+1 core tests{color}.  The patch passed unit tests in 
hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs.

{color:green}+1 contrib tests{color}.  The patch passed contrib unit tests.

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1709//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1709//console

This message is automatically generated.

> Allow security unit tests to use external KDC
> -
>
> Key: HADOOP-9004
> URL: https://issues.apache.org/jira/browse/HADOOP-9004
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: security, test
>Affects Versions: 2.0.0-alpha
>Reporter: Stephen Chu
>Assignee: Stephen Chu
> Fix For: 3.0.0
>
> Attachments: HADOOP-9004.patch, HADOOP-9004.patch.007, 
> HADOOP-9004.patch.008
>
>
> I want to add the option of allowing security-related unit tests to use an 
> external KDC.
> In HADOOP-8078, we add the ability to start and use an ApacheDS KDC for 
> security-related unit tests. It would be good to allow users to validate the 
> use of their own KDC, keytabs, and principals and to test different KDCs and 
> not rely on the ApacheDS KDC.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-6311) Add support for unix domain sockets to JNI libs

2012-11-05 Thread Colin Patrick McCabe (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-6311?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491182#comment-13491182
 ] 

Colin Patrick McCabe commented on HADOOP-6311:
--

This latest patch is a new approach, based on the ideas we discussed here.  
Basically, the idea is to add a set of generic Socket classes that implement 
UNIX domain sockets.

This patch contains no code from Android.  The advantages over the previous 
approaches are:
* The previous approaches didn't create classes that actually inherit from 
{{ServerSocket}} and {{Socket}}.  This patch does, which means that 
{{DomainSockets}} can be used in all of the places where {{InetSockets}} were 
previously used (see the HDFS-347 patch for the application).
* This patch has unit tests.
* We skip implementing a few things that we don't really need, such as 
credential passing.
* These classes are fully thread-safe.
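
To make the compatibility point concrete, here is a minimal sketch; the usage 
names below are illustrative, not necessarily the API in the attached patch:

{code}
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

// Sketch only: because the new classes extend java.net.ServerSocket and
// java.net.Socket, code like this works unchanged whether it is handed an
// INET socket or a UNIX domain socket.
class EchoOnAnyServerSocket {
  static void serveOnce(ServerSocket server) throws IOException {
    try (Socket client = server.accept()) {
      int b = client.getInputStream().read();
      if (b != -1) {
        client.getOutputStream().write(b);   // echo one byte back
      }
    }
  }
  // Usage is identical for both:
  //   serveOnce(new ServerSocket(0));                    // TCP
  //   serveOnce(someDomainServerSocketBoundToAPath);     // UNIX domain, per the patch
}
{code}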

> Add support for unix domain sockets to JNI libs
> ---
>
> Key: HADOOP-6311
> URL: https://issues.apache.org/jira/browse/HADOOP-6311
> Project: Hadoop Common
>  Issue Type: New Feature
>  Components: native
>Affects Versions: 0.20.0
>Reporter: Todd Lipcon
>Assignee: Colin Patrick McCabe
> Attachments: 6311-trunk-inprogress.txt, design.txt, 
> HADOOP-6311.014.patch, HADOOP-6311.016.patch, HADOOP-6311.018.patch, 
> HADOOP-6311.020b.patch, HADOOP-6311.020.patch, HADOOP-6311.021.patch, 
> HADOOP-6311.022.patch, HADOOP-6311.023.patch, HADOOP-6311-0.patch, 
> HADOOP-6311-1.patch, hadoop-6311.txt
>
>
> For HDFS-347 we need to use unix domain sockets. This JIRA is to include a 
> library in common which adds an o.a.h.net.unix package based on the code from 
> Android (apache 2 license).

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-6311) Add support for unix domain sockets to JNI libs

2012-11-05 Thread Colin Patrick McCabe (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-6311?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Colin Patrick McCabe updated HADOOP-6311:
-

Attachment: HADOOP-6311.023.patch

> Add support for unix domain sockets to JNI libs
> ---
>
> Key: HADOOP-6311
> URL: https://issues.apache.org/jira/browse/HADOOP-6311
> Project: Hadoop Common
>  Issue Type: New Feature
>  Components: native
>Affects Versions: 0.20.0
>Reporter: Todd Lipcon
>Assignee: Colin Patrick McCabe
> Attachments: 6311-trunk-inprogress.txt, design.txt, 
> HADOOP-6311.014.patch, HADOOP-6311.016.patch, HADOOP-6311.018.patch, 
> HADOOP-6311.020b.patch, HADOOP-6311.020.patch, HADOOP-6311.021.patch, 
> HADOOP-6311.022.patch, HADOOP-6311.023.patch, HADOOP-6311-0.patch, 
> HADOOP-6311-1.patch, hadoop-6311.txt
>
>
> For HDFS-347 we need to use unix domain sockets. This JIRA is to include a 
> library in common which adds an o.a.h.net.unix package based on the code from 
> Android (apache 2 license).

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8974) TestDFVariations fails on Windows

2012-11-05 Thread Ivan Mitic (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491145#comment-13491145
 ] 

Ivan Mitic commented on HADOOP-8974:


Thanks Chris for the patch, looks good, +1

> TestDFVariations fails on Windows
> -
>
> Key: HADOOP-8974
> URL: https://issues.apache.org/jira/browse/HADOOP-8974
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Affects Versions: 3.0.0, trunk-win
>Reporter: Chris Nauroth
>Assignee: Chris Nauroth
> Attachments: HADOOP-8974.patch
>
>
> The test fails on Windows.  This may be related to code ported into DF.java 
> from branch-1-win.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8958) ViewFs:Non absolute mount name failures when running multiple tests on Windows

2012-11-05 Thread Ivan Mitic (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8958?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491142#comment-13491142
 ] 

Ivan Mitic commented on HADOOP-8958:


Patch looks good Chris, +1

> ViewFs:Non absolute mount name failures when running multiple tests on Windows
> --
>
> Key: HADOOP-8958
> URL: https://issues.apache.org/jira/browse/HADOOP-8958
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: viewfs
>Affects Versions: 3.0.0, trunk-win
>Reporter: Chris Nauroth
>Assignee: Chris Nauroth
> Attachments: HADOOP-8958.patch
>
>
> This appears to be an issue with parsing a Windows-specific path.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-9004) Allow security unit tests to use external KDC

2012-11-05 Thread Stephen Chu (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9004?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491135#comment-13491135
 ] 

Stephen Chu commented on HADOOP-9004:
-

Thanks for reviewing, Jitendra.

bq. Do you plan to write tests that work with Apache DS kerberos as well. The 
advantage is that developers will not have to bother about setting up a kdc. 
Ideal would be that these test cases could be integrated with commit builds.

Yeah, I do plan to keep in mind that it's a pain to set up your own KDC just to 
run these test cases. I'll look into ApacheDS (I'm not very familiar with it 
right now) and put effort into whatever approach we find is best for getting 
these integrated into the commit builds.

I uploaded a new patch addressing your comment.

Thanks!

> Allow security unit tests to use external KDC
> -
>
> Key: HADOOP-9004
> URL: https://issues.apache.org/jira/browse/HADOOP-9004
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: security, test
>Affects Versions: 2.0.0-alpha
>Reporter: Stephen Chu
>Assignee: Stephen Chu
> Fix For: 3.0.0
>
> Attachments: HADOOP-9004.patch, HADOOP-9004.patch.007, 
> HADOOP-9004.patch.008
>
>
> I want to add the option of allowing security-related unit tests to use an 
> external KDC.
> In HADOOP-8078, we add the ability to start and use an ApacheDS KDC for 
> security-related unit tests. It would be good to allow users to validate the 
> use of their own KDC, keytabs, and principals and to test different KDCs and 
> not rely on the ApacheDS KDC.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-9004) Allow security unit tests to use external KDC

2012-11-05 Thread Stephen Chu (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9004?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Stephen Chu updated HADOOP-9004:


Attachment: HADOOP-9004.patch.008

> Allow security unit tests to use external KDC
> -
>
> Key: HADOOP-9004
> URL: https://issues.apache.org/jira/browse/HADOOP-9004
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: security, test
>Affects Versions: 2.0.0-alpha
>Reporter: Stephen Chu
>Assignee: Stephen Chu
> Fix For: 3.0.0
>
> Attachments: HADOOP-9004.patch, HADOOP-9004.patch.007, 
> HADOOP-9004.patch.008
>
>
> I want to add the option of allowing security-related unit tests to use an 
> external KDC.
> In HADOOP-8078, we add the ability to start and use an ApacheDS KDC for 
> security-related unit tests. It would be good to allow users to validate the 
> use of their own KDC, keytabs, and principals and to test different KDCs and 
> not rely on the ApacheDS KDC.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8951) RunJar to fail with user-comprehensible error message if jar missing

2012-11-05 Thread Eli Collins (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8951?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491115#comment-13491115
 ] 

Eli Collins commented on HADOOP-8951:
-

I'd use the following in case this is ever called from tests:

{code}
ExitUtil#terminate(0, "Not a valid JAR: " + file.getCanonicalPath());
{code}
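
The benefit over a bare System.exit is that a test can disable real exits and 
assert on the message instead; a rough sketch of that pattern, assuming the 
trunk {{ExitUtil}} behavior ({{disableSystemExit()}} turning exits into a 
catchable {{ExitException}}):

{code}
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.fail;

import org.apache.hadoop.util.ExitUtil;
import org.junit.Test;

// Sketch only; assumes ExitUtil.terminate() throws ExitException once
// System.exit has been disabled for tests.
public class TestTerminateMessage {
  @Test
  public void terminateIsCatchableInTests() {
    ExitUtil.disableSystemExit();
    try {
      ExitUtil.terminate(0, "Not a valid JAR: /no/such/file.jar");
      fail("expected ExitUtil.terminate to throw when exits are disabled");
    } catch (ExitUtil.ExitException e) {
      assertTrue(e.getMessage().contains("Not a valid JAR"));
    }
  }
}
{code}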


> RunJar to fail with user-comprehensible error message if jar missing
> 
>
> Key: HADOOP-8951
> URL: https://issues.apache.org/jira/browse/HADOOP-8951
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: util
>Affects Versions: 1.1.0, 2.0.2-alpha
>Reporter: Steve Loughran
>Assignee: Steve Loughran
>Priority: Minor
> Fix For: 1.2.0, 2.0.3-alpha
>
> Attachments: HADOOP-8951.patch
>
>
> When the RunJar JAR is missing or not a file, exit with a meaningful message.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8952) Command-line option to see replicated size of a file

2012-11-05 Thread Eli Collins (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491114#comment-13491114
 ] 

Eli Collins commented on HADOOP-8952:
-

New option to hadoop fs -du?

> Command-line option to see replicated size of a file
> 
>
> Key: HADOOP-8952
> URL: https://issues.apache.org/jira/browse/HADOOP-8952
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: fs
>Reporter: Clint Heath
>Priority: Minor
>  Labels: newbie
>
> It would be nice to have a command-line option to the "hadoop fs" command 
> that would print the actual replicated size of a file or filesystem.  In 
> other words, don't just print the size of a file if it were to be copied down 
> to a local FS, but show us exactly how much space it's taking up in HDFS 
> (including replicated blocks).  This should be added to the FsUsage class.  
> I'll work up a patch and attach it.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-9004) Allow security unit tests to use external KDC

2012-11-05 Thread Jitendra Nath Pandey (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9004?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491098#comment-13491098
 ] 

Jitendra Nath Pandey commented on HADOOP-9004:
--

bq. In future JIRAs, I plan to add more security-related unit tests to cover 
pain points and stuff that's broken in the past (e.g. hdfs fsck and groups have 
once been broken with security enabled).

That will be very helpful. Thanks Stephen!

Do you plan to write tests that work with the ApacheDS kerberos setup as well? 
The advantage is that developers would not have to bother with setting up a 
KDC. Ideally, these test cases could be integrated with the commit builds.

The patch looks good to me. +1
Minor: you don't need the assertTrue in the finally clause of testSecureNameNode.

> Allow security unit tests to use external KDC
> -
>
> Key: HADOOP-9004
> URL: https://issues.apache.org/jira/browse/HADOOP-9004
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: security, test
>Affects Versions: 2.0.0-alpha
>Reporter: Stephen Chu
>Assignee: Stephen Chu
> Fix For: 3.0.0
>
> Attachments: HADOOP-9004.patch, HADOOP-9004.patch.007
>
>
> I want to add the option of allowing security-related unit tests to use an 
> external KDC.
> In HADOOP-8078, we add the ability to start and use an ApacheDS KDC for 
> security-related unit tests. It would be good to allow users to validate the 
> use of their own KDC, keytabs, and principals and to test different KDCs and 
> not rely on the ApacheDS KDC.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8956) FileSystem.primitiveMkdir failures on Windows cause multiple test suites to fail

2012-11-05 Thread Ivan Mitic (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8956?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491086#comment-13491086
 ] 

Ivan Mitic commented on HADOOP-8956:


Chris, apologies for the late review. I have a few comments on this patch.

1. RawLocalFs#createSymlink
I think we want to align this with some of our long-term goals. The problem 
with the current code is that you’ll pass a path with forward slashes to 
Shell.WINUTILS. This does not work in all cases (one example is a path longer 
than 255 characters). The right thing to do here is to first convert the Path to 
a File, and let File#getCanonicalPath() do the right conversion for you.

To fix this I would propose we do the following:
 - Expose Path#toFile() API on the Path object
 - Convert Path to File within createSymlink and use File(s) going forward

This aligns well with our long-term plans and would be a great thing to do! 
Daryn had a similar proposal on some of the branch-1 JIRAs.
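
A minimal sketch of that flow, approximating the proposed Path#toFile() via the 
URI path since the API does not exist yet:

{code}
import java.io.File;
import java.io.IOException;
import org.apache.hadoop.fs.Path;

// Sketch only: convert the Path to a File and let getCanonicalPath() produce
// a well-formed native Windows path before handing it to winutils.
class SymlinkPathSketch {
  static String nativePathFor(Path p) throws IOException {
    File f = new File(p.toUri().getPath());  // stand-in for the proposed Path#toFile()
    return f.getCanonicalPath();             // native separators, canonical form
  }
}
{code}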

2. FileContextURIBase
Currently, isTestableFileNameOnPlatform() will disable testing of some paths on 
Windows. However, I think the right thing to do is to disable these tests only 
for the LocalFileSystem on Windows. Other FileSystems, like HDFS and S3, will 
continue to support the complete character set, and we should run the full test 
suite against them. Am I right? It might be that I’m just reading the change 
incorrectly. One way to fix this is to ask for fileNames/dirNames and allow 
inherited classes to override the default set. We did something similar in HADOOP-8487.

3. TestPath#testGlobEscapeStatus()
I see that you disabled this test on Windows. It would be good to enable the 
test “somehow”, but I cannot think of a good way. If we could spin up a 
MiniDFSCluster we could test this, but if I’m reading the trunk code correctly, 
that would break layering, so it’s not an option. In the absence of a better 
option, I’m fine with this.

I believe it should be fine to prepare a patch that goes on top of what is 
currently in the branch.


> FileSystem.primitiveMkdir failures on Windows cause multiple test suites to 
> fail
> 
>
> Key: HADOOP-8956
> URL: https://issues.apache.org/jira/browse/HADOOP-8956
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Affects Versions: trunk-win
>Reporter: Chris Nauroth
>Assignee: Chris Nauroth
> Fix For: trunk-win
>
> Attachments: HADOOP-8956-branch-trunk-win.patch, 
> HADOOP-8956-branch-trunk-win.patch, HADOOP-8956-branch-trunk-win.patch
>
>
> Multiple test suites fail on Windows in calls to FileSystem.primitiveMkdir.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Reopened] (HADOOP-8601) Extend TestShell to cover Windows shell commands

2012-11-05 Thread Suresh Srinivas (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-8601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Suresh Srinivas reopened HADOOP-8601:
-


> Extend TestShell to cover Windows shell commands
> 
>
> Key: HADOOP-8601
> URL: https://issues.apache.org/jira/browse/HADOOP-8601
> Project: Hadoop Common
>  Issue Type: Test
>Affects Versions: 1-win
>Reporter: Chuan Liu
>Assignee: Chuan Liu
> Fix For: 1-win
>
>
> The existing unit test only covers Linux shell commands. Since we are beginning 
> to support Windows and use completely different commands there, it makes 
> sense to extend TestShell to cover Windows use cases.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Resolved] (HADOOP-8601) Extend TestShell to cover Windows shell commands

2012-11-05 Thread Suresh Srinivas (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-8601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Suresh Srinivas resolved HADOOP-8601.
-

Resolution: Invalid

Changing the resolution to Invalid.

> Extend TestShell to cover Windows shell commands
> 
>
> Key: HADOOP-8601
> URL: https://issues.apache.org/jira/browse/HADOOP-8601
> Project: Hadoop Common
>  Issue Type: Test
>Affects Versions: 1-win
>Reporter: Chuan Liu
>Assignee: Chuan Liu
> Fix For: 1-win
>
>
> The existing unit test only covers Linux shell commands. Since we are beginning 
> to support Windows and use completely different commands there, it makes 
> sense to extend TestShell to cover Windows use cases.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8958) ViewFs:Non absolute mount name failures when running multiple tests on Windows

2012-11-05 Thread Ivan Mitic (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8958?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491050#comment-13491050
 ] 

Ivan Mitic commented on HADOOP-8958:


Chris, I'm reviewing this now, please give me some time.

> ViewFs:Non absolute mount name failures when running multiple tests on Windows
> --
>
> Key: HADOOP-8958
> URL: https://issues.apache.org/jira/browse/HADOOP-8958
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: viewfs
>Affects Versions: 3.0.0, trunk-win
>Reporter: Chris Nauroth
>Assignee: Chris Nauroth
> Attachments: HADOOP-8958.patch
>
>
> This appears to be an issue with parsing a Windows-specific path.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-9012) IPC Client sends wrong connection context

2012-11-05 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9012?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491049#comment-13491049
 ] 

Hadoop QA commented on HADOOP-9012:
---

{color:green}+1 overall{color}.  Here are the results of testing the latest 
attachment 
  http://issues.apache.org/jira/secure/attachment/12552178/HADOOP-9012.patch
  against trunk revision .

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:green}+1 tests included{color}.  The patch appears to include 1 new 
or modified test files.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:green}+1 javadoc{color}.  The javadoc tool did not generate any 
warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 1.3.9) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:green}+1 core tests{color}.  The patch passed unit tests in 
hadoop-common-project/hadoop-common.

{color:green}+1 contrib tests{color}.  The patch passed contrib unit tests.

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1708//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1708//console

This message is automatically generated.

> IPC Client sends wrong connection context
> -
>
> Key: HADOOP-9012
> URL: https://issues.apache.org/jira/browse/HADOOP-9012
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: ipc
>Affects Versions: 3.0.0, 2.0.3-alpha
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
> Attachments: HADOOP-9012.patch
>
>
> The IPC client will send the wrong connection context when asked to switch to 
> simple auth.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-9004) Allow security unit tests to use external KDC

2012-11-05 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9004?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491037#comment-13491037
 ] 

Hadoop QA commented on HADOOP-9004:
---

{color:green}+1 overall{color}.  Here are the results of testing the latest 
attachment 
  http://issues.apache.org/jira/secure/attachment/12552164/HADOOP-9004.patch.007
  against trunk revision .

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:green}+1 tests included{color}.  The patch appears to include 5 new 
or modified test files.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:green}+1 javadoc{color}.  The javadoc tool did not generate any 
warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 1.3.9) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:green}+1 core tests{color}.  The patch passed unit tests in 
hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs.

{color:green}+1 contrib tests{color}.  The patch passed contrib unit tests.

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1706//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1706//console

This message is automatically generated.

> Allow security unit tests to use external KDC
> -
>
> Key: HADOOP-9004
> URL: https://issues.apache.org/jira/browse/HADOOP-9004
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: security, test
>Affects Versions: 2.0.0-alpha
>Reporter: Stephen Chu
>Assignee: Stephen Chu
> Fix For: 3.0.0
>
> Attachments: HADOOP-9004.patch, HADOOP-9004.patch.007
>
>
> I want to add the option of allowing security-related unit tests to use an 
> external KDC.
> In HADOOP-8078, we add the ability to start and use an ApacheDS KDC for 
> security-related unit tests. It would be good to allow users to validate the 
> use of their own KDC, keytabs, and principals and to test different KDCs and 
> not rely on the ApacheDS KDC.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-9012) IPC Client sends wrong connection context

2012-11-05 Thread Daryn Sharp (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9012?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daryn Sharp updated HADOOP-9012:


Status: Patch Available  (was: Open)

> IPC Client sends wrong connection context
> -
>
> Key: HADOOP-9012
> URL: https://issues.apache.org/jira/browse/HADOOP-9012
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: ipc
>Affects Versions: 3.0.0, 2.0.3-alpha
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
> Attachments: HADOOP-9012.patch
>
>
> The IPC client will send the wrong connection context when asked to switch to 
> simple auth.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-9012) IPC Client sends wrong connection context

2012-11-05 Thread Daryn Sharp (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9012?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daryn Sharp updated HADOOP-9012:


Attachment: HADOOP-9012.patch

The connection context is built from the ugi based on the initial auth type.  
The users contained in the context are specific to the auth type.

Connection negotiation then occurs, which may switch the client to another auth 
method (e.g. simple). The current code attempts to rebuild the connection 
context from the pre-existing context, which may not even contain the correct 
users. For example, a kerberos context only contains the effective user, and a 
token context contains no users, while simple auth needs both the effective and 
the real user. As a result, the doAs context in the RPC server may be wrong.

This patch simply builds the context after the connection is established, and 
always from the connection's original ugi.
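
To illustrate the intent with hypothetical types (not the actual IPC classes), 
the context is always derived from the connection's original identity once the 
negotiated auth method is known:

{code}
// Sketch only, with hypothetical names: never rewrite an earlier context.
final class ConnectionContextSketch {
  enum AuthMethod { SIMPLE, KERBEROS, TOKEN }

  static final class Context {
    final String effectiveUser;   // may be null
    final String realUser;        // may be null
    Context(String effectiveUser, String realUser) {
      this.effectiveUser = effectiveUser;
      this.realUser = realUser;
    }
  }

  /** Always build from the original identity, never from a previous context. */
  static Context build(String originalEffectiveUser, String originalRealUser,
                       AuthMethod negotiated) {
    switch (negotiated) {
      case SIMPLE:
        // Simple auth needs both users; only the original ugi still has them.
        return new Context(originalEffectiveUser, originalRealUser);
      case KERBEROS:
        // Kerberos proves the real user itself; only the effective user is sent.
        return new Context(originalEffectiveUser, null);
      case TOKEN:
      default:
        // The token carries the identity; no users are sent in the context.
        return new Context(null, null);
    }
  }
}
{code}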

> IPC Client sends wrong connection context
> -
>
> Key: HADOOP-9012
> URL: https://issues.apache.org/jira/browse/HADOOP-9012
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: ipc
>Affects Versions: 3.0.0, 2.0.3-alpha
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
> Attachments: HADOOP-9012.patch
>
>
> The IPC client will send the wrong connection context when asked to switch to 
> simple auth.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Created] (HADOOP-9012) IPC Client sends wrong connection context

2012-11-05 Thread Daryn Sharp (JIRA)
Daryn Sharp created HADOOP-9012:
---

 Summary: IPC Client sends wrong connection context
 Key: HADOOP-9012
 URL: https://issues.apache.org/jira/browse/HADOOP-9012
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: ipc
Affects Versions: 3.0.0, 2.0.3-alpha
Reporter: Daryn Sharp
Assignee: Daryn Sharp


The IPC client will send the wrong connection context when asked to switch to 
simple auth.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8589) ViewFs tests fail when tests and home dirs are nested

2012-11-05 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490981#comment-13490981
 ] 

Hadoop QA commented on HADOOP-8589:
---

{color:green}+1 overall{color}.  Here are the results of testing the latest 
attachment 
  http://issues.apache.org/jira/secure/attachment/12552168/Hadoop-8589.patch
  against trunk revision .

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:green}+1 tests included{color}.  The patch appears to include 6 new 
or modified test files.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:green}+1 javadoc{color}.  The javadoc tool did not generate any 
warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 1.3.9) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:green}+1 core tests{color}.  The patch passed unit tests in 
hadoop-common-project/hadoop-common.

{color:green}+1 contrib tests{color}.  The patch passed contrib unit tests.

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1707//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/1707//console

This message is automatically generated.

> ViewFs tests fail when tests and home dirs are nested
> -
>
> Key: HADOOP-8589
> URL: https://issues.apache.org/jira/browse/HADOOP-8589
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs, test
>Affects Versions: 0.23.1, 2.0.0-alpha
>Reporter: Andrey Klochkov
>Assignee: Sanjay Radia
> Attachments: Hadoop-8589.patch, HADOOP-8589.patch, HADOOP-8589.patch, 
> hadoop-8589-sanjay.patch, HADOOP-8859.patch
>
>
> TestFSMainOperationsLocalFileSystem fails when the test root 
> directory is under the user's home directory, and the user's home dir is 
> deeper than 2 levels from /. This happens with the default 1-node 
> installation of Jenkins. 
> This is the failure log:
> {code}
> org.apache.hadoop.fs.FileAlreadyExistsException: Path /var already exists as 
> dir; cannot create link here
>   at org.apache.hadoop.fs.viewfs.InodeTree.createLink(InodeTree.java:244)
>   at org.apache.hadoop.fs.viewfs.InodeTree.(InodeTree.java:334)
>   at 
> org.apache.hadoop.fs.viewfs.ViewFileSystem$1.(ViewFileSystem.java:167)
>   at 
> org.apache.hadoop.fs.viewfs.ViewFileSystem.initialize(ViewFileSystem.java:167)
>   at 
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2094)
>   at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:79)
>   at 
> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2128)
>   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2110)
>   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:290)
>   at 
> org.apache.hadoop.fs.viewfs.ViewFileSystemTestSetup.setupForViewFileSystem(ViewFileSystemTestSetup.java:76)
>   at 
> org.apache.hadoop.fs.viewfs.TestFSMainOperationsLocalFileSystem.setUp(TestFSMainOperationsLocalFileSystem.java:40)
> ...
> Standard Output
> 2012-07-11 22:07:20,239 INFO  mortbay.log (Slf4jLog.java:info(67)) - Home dir 
> base /var/lib
> {code}
> The reason for the failure is that the code tries to mount links for both 
> "/var" and "/var/lib", and it fails for the 2nd one as the "/var" is mounted 
> already.
> The fix was provided in HADOOP-8036 but later it was reverted in HADOOP-8129.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8975) TestFileContextResolveAfs fails on Windows

2012-11-05 Thread Brandon Li (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8975?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490980#comment-13490980
 ] 

Brandon Li commented on HADOOP-8975:


Sorry for the late reply.
The change to isAbsolute() and the new isUriPathAbsolute() method were added in 
HADOOP-4952.
{noformat}
 
+  /**
+   *  True if the path component (i.e. directory) of this URI is absolute.
+   */
+  public boolean isUriPathAbsolute() {
+int start = hasWindowsDrive(uri.getPath(), true) ? 3 : 0;
+return uri.getPath().startsWith(SEPARATOR, start);
+   }
+  
   /** True if the directory of this path is absolute. */
+  /**
+   * There is some ambiguity here. An absolute path is a slash
+   * relative name without a scheme or an authority.
+   * So either this method was incorrectly named or its
+   * implementation is incorrect.
+   */
   public boolean isAbsolute() {
-int start = hasWindowsDrive(uri.getPath(), true) ? 3 : 0;
-return uri.getPath().startsWith(SEPARATOR, start);
+ return isUriPathAbsolute();
   }
{noformat}

I guess the change was made to highlight what the method really does, so the 
author added a new method, isUriPathAbsolute(), and comments to isAbsolute(). 
However, isAbsolute() itself was not removed, probably because removing it was 
out of scope for that JIRA.

+1 to the new patch.


> TestFileContextResolveAfs fails on Windows
> --
>
> Key: HADOOP-8975
> URL: https://issues.apache.org/jira/browse/HADOOP-8975
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Affects Versions: trunk-win
>Reporter: Chris Nauroth
>Assignee: Chris Nauroth
> Attachments: HADOOP-8975-branch-trunk-win.patch, 
> HADOOP-8975-branch-trunk-win.patch
>
>
> This appears to be a Windows-specific path parsing issue.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8958) ViewFs:Non absolute mount name failures when running multiple tests on Windows

2012-11-05 Thread Brandon Li (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8958?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490968#comment-13490968
 ] 

Brandon Li commented on HADOOP-8958:


Sorry for the late reply. 
Looks like LogetTestMountPoint() might only be used in tests in the future, and 
thus it doesn't seem necessary to put it into other source code such as the util classes.

+1 to the patch.


> ViewFs:Non absolute mount name failures when running multiple tests on Windows
> --
>
> Key: HADOOP-8958
> URL: https://issues.apache.org/jira/browse/HADOOP-8958
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: viewfs
>Affects Versions: 3.0.0, trunk-win
>Reporter: Chris Nauroth
>Assignee: Chris Nauroth
> Attachments: HADOOP-8958.patch
>
>
> This appears to be an issue with parsing a Windows-specific path.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-8589) ViewFs tests fail when tests and home dirs are nested

2012-11-05 Thread Sanjay Radia (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-8589?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sanjay Radia updated HADOOP-8589:
-

Attachment: Hadoop-8589.patch

Updated patch that fixes the bug related to the home dir being at the root (e.g. /joe).
I ran the tests through 3 different home dirs: /Users/foo, /foo, /x/y/foo

> ViewFs tests fail when tests and home dirs are nested
> -
>
> Key: HADOOP-8589
> URL: https://issues.apache.org/jira/browse/HADOOP-8589
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs, test
>Affects Versions: 0.23.1, 2.0.0-alpha
>Reporter: Andrey Klochkov
>Assignee: Sanjay Radia
> Attachments: Hadoop-8589.patch, HADOOP-8589.patch, HADOOP-8589.patch, 
> hadoop-8589-sanjay.patch, HADOOP-8859.patch
>
>
> TestFSMainOperationsLocalFileSystem fails when the test root 
> directory is under the user's home directory, and the user's home dir is 
> deeper than 2 levels from /. This happens with the default 1-node 
> installation of Jenkins. 
> This is the failure log:
> {code}
> org.apache.hadoop.fs.FileAlreadyExistsException: Path /var already exists as 
> dir; cannot create link here
>   at org.apache.hadoop.fs.viewfs.InodeTree.createLink(InodeTree.java:244)
>   at org.apache.hadoop.fs.viewfs.InodeTree.(InodeTree.java:334)
>   at 
> org.apache.hadoop.fs.viewfs.ViewFileSystem$1.(ViewFileSystem.java:167)
>   at 
> org.apache.hadoop.fs.viewfs.ViewFileSystem.initialize(ViewFileSystem.java:167)
>   at 
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2094)
>   at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:79)
>   at 
> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2128)
>   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2110)
>   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:290)
>   at 
> org.apache.hadoop.fs.viewfs.ViewFileSystemTestSetup.setupForViewFileSystem(ViewFileSystemTestSetup.java:76)
>   at 
> org.apache.hadoop.fs.viewfs.TestFSMainOperationsLocalFileSystem.setUp(TestFSMainOperationsLocalFileSystem.java:40)
> ...
> Standard Output
> 2012-07-11 22:07:20,239 INFO  mortbay.log (Slf4jLog.java:info(67)) - Home dir 
> base /var/lib
> {code}
> The reason for the failure is that the code tries to mount links for both 
> "/var" and "/var/lib", and it fails for the 2nd one as the "/var" is mounted 
> already.
> The fix was provided in HADOOP-8036 but later it was reverted in HADOOP-8129.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-9004) Allow security unit tests to use external KDC

2012-11-05 Thread Stephen Chu (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9004?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Stephen Chu updated HADOOP-9004:


Status: Patch Available  (was: Open)

Submitting new patch.

Added _isExternalKdcRunning()_ to SecurityUtilTestHelper.java to detect whether 
the user running the tests has specified an external KDC.
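
As a sketch of how a test can guard on that helper (the property names below 
are illustrative only, not necessarily what the patch defines):

{code}
import static org.junit.Assume.assumeTrue;
import org.junit.Test;

// Sketch only: "externalKdc" and "kdc.resource.dir" are made-up property names.
public class ExternalKdcGuardSketch {
  /** In the spirit of SecurityUtilTestHelper#isExternalKdcRunning(). */
  static boolean isExternalKdcRunning() {
    return Boolean.getBoolean("externalKdc")
        && System.getProperty("kdc.resource.dir") != null;
  }

  @Test
  public void secureOperation() {
    // Silently skip when no external KDC has been configured for this run.
    assumeTrue(isExternalKdcRunning());
    // ... kerberos-dependent assertions would go here ...
  }
}
{code}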

TestUGIWithExternalKdc and TestSecureNameNodeWithExternalKdc are the 
counterparts to TestUGIWithSecurityOn and TestSecureNameNode, except the new 
tests use the external KDC. I don't think it'll be clean to merge these into 
one test, so I think separating them is fine for now.

I refactored SecureDataNodeStarter so that we can get the SecureResources 
within our unit tests.

I modified MiniDFSCluster so that it now actually checks whether 
checkDataNodeAddrConfig was set to true (so that we can point the DataNodes at 
low ports, because secure DNs require ports < 1023). Also, while bringing up 
the DataNodes, if kerberos authentication is enabled, MiniDFSCluster will now 
get the SecureResources necessary to start the DN.

TestStartSecureDataNode brings up a MiniDFSCluster with one NameNode and one 
DataNode. However, the test will fail if not run as root, because bringing up 
the secure DN requires root. This is a problem, and it won't work to give away 
root access in some Jenkins environments. I believe there has been past 
discussion on whether to keep this requirement for starting the DN in dev 
environments. For now, I think it's still useful to have this test, even if it 
can't be run in most setups.



My plan is to continue to write more unit tests against a secure 
MiniDFSCluster, as we are missing a lot of unit test coverage against secure 
setups.


> Allow security unit tests to use external KDC
> -
>
> Key: HADOOP-9004
> URL: https://issues.apache.org/jira/browse/HADOOP-9004
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: security, test
>Affects Versions: 2.0.0-alpha
>Reporter: Stephen Chu
>Assignee: Stephen Chu
> Fix For: 3.0.0
>
> Attachments: HADOOP-9004.patch, HADOOP-9004.patch.007
>
>
> I want to add the option of allowing security-related unit tests to use an 
> external KDC.
> In HADOOP-8078, we add the ability to start and use an ApacheDS KDC for 
> security-related unit tests. It would be good to allow users to validate the 
> use of their own KDC, keytabs, and principals and to test different KDCs and 
> not rely on the ApacheDS KDC.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-9004) Allow security unit tests to use external KDC

2012-11-05 Thread Stephen Chu (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9004?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Stephen Chu updated HADOOP-9004:


Attachment: HADOOP-9004.patch.007

> Allow security unit tests to use external KDC
> -
>
> Key: HADOOP-9004
> URL: https://issues.apache.org/jira/browse/HADOOP-9004
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: security, test
>Affects Versions: 2.0.0-alpha
>Reporter: Stephen Chu
>Assignee: Stephen Chu
> Fix For: 3.0.0
>
> Attachments: HADOOP-9004.patch, HADOOP-9004.patch.007
>
>
> I want to add the option of allowing security-related unit tests to use an 
> external KDC.
> In HADOOP-8078, we add the ability to start and use an ApacheDS KDC for 
> security-related unit tests. It would be good to allow users to validate the 
> use of their own KDC, keytabs, and principals and to test different KDCs and 
> not rely on the ApacheDS KDC.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8953) Shell PathData parsing failures on Windows

2012-11-05 Thread Arpit Agarwal (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8953?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490900#comment-13490900
 ] 

Arpit Agarwal commented on HADOOP-8953:
---

Daryn has not yet commented on the latest patch.

> Shell PathData parsing failures on Windows
> --
>
> Key: HADOOP-8953
> URL: https://issues.apache.org/jira/browse/HADOOP-8953
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Affects Versions: trunk-win
>Reporter: Chris Nauroth
>Assignee: Arpit Agarwal
> Attachments: HADOOP-8953-branch-trunk-win-3.patch, 
> HADOOP-8953-branch-trunk-win-4.patch, HADOOP-8953-branch-trunk-win-6.patch, 
> HADOOP-8953-branch-trunk-win.patch
>
>
> Several test suites fail on Windows, apparently due to Windows-specific path 
> parsing bugs in PathData.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8953) Shell PathData parsing failures on Windows

2012-11-05 Thread Suresh Srinivas (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8953?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490897#comment-13490897
 ] 

Suresh Srinivas commented on HADOOP-8953:
-

bq.  I'm working offline with Arpit and Chris to see if there's a simpler 
solution.
Is this done? If there are no further comments, I will commit this patch soon.

> Shell PathData parsing failures on Windows
> --
>
> Key: HADOOP-8953
> URL: https://issues.apache.org/jira/browse/HADOOP-8953
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Affects Versions: trunk-win
>Reporter: Chris Nauroth
>Assignee: Arpit Agarwal
> Attachments: HADOOP-8953-branch-trunk-win-3.patch, 
> HADOOP-8953-branch-trunk-win-4.patch, HADOOP-8953-branch-trunk-win-6.patch, 
> HADOOP-8953-branch-trunk-win.patch
>
>
> Several test suites fail on Windows, apparently due to Windows-specific path 
> parsing bugs in PathData.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8953) Shell PathData parsing failures on Windows

2012-11-05 Thread Sanjay Radia (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8953?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490890#comment-13490890
 ] 

Sanjay Radia commented on HADOOP-8953:
--

+1

> Shell PathData parsing failures on Windows
> --
>
> Key: HADOOP-8953
> URL: https://issues.apache.org/jira/browse/HADOOP-8953
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Affects Versions: trunk-win
>Reporter: Chris Nauroth
>Assignee: Arpit Agarwal
> Attachments: HADOOP-8953-branch-trunk-win-3.patch, 
> HADOOP-8953-branch-trunk-win-4.patch, HADOOP-8953-branch-trunk-win-6.patch, 
> HADOOP-8953-branch-trunk-win.patch
>
>
> Several test suites fail on Windows, apparently due to Windows-specific path 
> parsing bugs in PathData.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Assigned] (HADOOP-8981) TestMetricsSystemImpl fails on Windows

2012-11-05 Thread Arpit Agarwal (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-8981?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Arpit Agarwal reassigned HADOOP-8981:
-

Assignee: Arpit Agarwal  (was: Chris Nauroth)

> TestMetricsSystemImpl fails on Windows
> --
>
> Key: HADOOP-8981
> URL: https://issues.apache.org/jira/browse/HADOOP-8981
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: metrics
>Affects Versions: trunk-win
>Reporter: Chris Nauroth
>Assignee: Arpit Agarwal
>
> The test is failing on an expected mock interaction.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-8615) EOFException in DecompressorStream.java needs to be more verbose

2012-11-05 Thread Andy Isaacson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-8615?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490842#comment-13490842
 ] 

Andy Isaacson commented on HADOOP-8615:
---

bq. Please let me know for any feedback.

Sorry for the delay!

A few more whitespace fixups:
- make sure {{) throws}} has a space between the ) and "throws". (2 examples in 
this patch)
- still a few argument lists missing a space after "," for example {{return 
createInputStream(in, null,fileName);}}.
- also a few argument lists with extra spaces before "," for example 
{{Decompressor decompressor , String fileName}}
- extra space in {{protected  String fileName}}
- extra space in {{this.fileName =  fileName}}
- missing spaces in {{\+"file = "\+this.fileName}}, always put spaces on both 
sides of "\+" and other operators. also we generally put the "+" on the 
previous line for a string continuation like this one.
- missing space in {{if ((b1 | b2 | b3 | b4) < 0\)\{}} before "{"
- missing space in {{String fileName ="fileName";}} after "="
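
Putting those rules together, a compact illustration with stand-in names (not 
the patch's actual code):

{code}
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

// Compact illustration of the spacing rules above.
class StyleExample {
  protected String fileName;                       // single space before the name

  InputStream open(InputStream in, String name) throws IOException { // space after "," and before "throws"
    this.fileName = name;                          // single space around "="
    int b1 = in.read(), b2 = in.read(), b3 = in.read(), b4 = in.read();
    if ((b1 | b2 | b3 | b4) < 0) {                 // space before "{"
      throw new EOFException("Unexpected end of input stream, file = " +
          this.fileName);                          // spaces around "+", "+" stays on the previous line
    }
    return in;
  }
}
{code}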

Thanks again for working on this enhancement!

> EOFException in DecompressorStream.java needs to be more verbose
> 
>
> Key: HADOOP-8615
> URL: https://issues.apache.org/jira/browse/HADOOP-8615
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: io
>Affects Versions: 0.20.2
>Reporter: Jeff Lord
>  Labels: patch
> Attachments: HADOOP-8615.patch, HADOOP-8615-release-0.20.2.patch, 
> HADOOP-8615-ver2.patch
>
>
> In ./src/core/org/apache/hadoop/io/compress/DecompressorStream.java
> The following exception should at least report the file it was reading when it 
> hit this error:
>   protected void getCompressedData() throws IOException {
> checkStream();
> int n = in.read(buffer, 0, buffer.length);
> if (n == -1) {
>   throw new EOFException("Unexpected end of input stream");
> }
> This would help greatly to debug bad/corrupt files.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-9010) Map UGI authenticationMethod to RPC authMethod

2012-11-05 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490812#comment-13490812
 ] 

Hudson commented on HADOOP-9010:


Integrated in Hadoop-trunk-Commit #2955 (See 
[https://builds.apache.org/job/Hadoop-trunk-Commit/2955/])
HADOOP-9010. Map UGI authenticationMethod to RPC authMethod (daryn via 
bobby) (Revision 1405910)

 Result = SUCCESS
bobby : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1405910
Files : 
* /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcServer.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUserGroupInformation.java


> Map UGI authenticationMethod to RPC authMethod
> --
>
> Key: HADOOP-9010
> URL: https://issues.apache.org/jira/browse/HADOOP-9010
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs, security
>Affects Versions: 0.23.0, 3.0.0, 2.0.3-alpha
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
> Fix For: 3.0.0, 2.0.3-alpha
>
> Attachments: HADOOP-9010.patch
>
>
> The UGI's authenticationMethod needs a forward mapping to the RPC/SASL 
> authMethod.  This will allow the RPC client to eventually use the UGI's 
> authenticationMethod to derive the authMethod instead of assuming that security 
> on means kerberos and security off means simple.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-9010) Map UGI authenticationMethod to RPC authMethod

2012-11-05 Thread Robert Joseph Evans (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490805#comment-13490805
 ] 

Robert Joseph Evans commented on HADOOP-9010:
-

Sorry, I got ahead of myself. +1, looks good to me.

> Map UGI authenticationMethod to RPC authMethod
> --
>
> Key: HADOOP-9010
> URL: https://issues.apache.org/jira/browse/HADOOP-9010
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs, security
>Affects Versions: 0.23.0, 3.0.0, 2.0.3-alpha
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
> Fix For: 3.0.0, 2.0.3-alpha
>
> Attachments: HADOOP-9010.patch
>
>
> The UGI's authenticationMethod needs a forward mapping to the RPC/SASL 
> authMethod.  This will allow for the RPC client to eventually use the UGI's 
> authenticationMethod to derive the authMethod instead of assuming security on 
> is kerberos and security off is simple.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-9009) Add SecurityUtil methods to get/set authentication method

2012-11-05 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9009?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490803#comment-13490803
 ] 

Hudson commented on HADOOP-9009:


Integrated in Hadoop-trunk-Commit #2954 (See 
[https://builds.apache.org/job/Hadoop-trunk-Commit/2954/])
HADOOP-9009. Add SecurityUtil methods to get/set authentication method 
(daryn via bobby) (Revision 1405904)

 Result = SUCCESS
bobby : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1405904
Files : 
* /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/UserGroupInformation.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/MiniRPCBenchmark.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestRPC.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestSaslRPC.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestDoAsEffectiveUser.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestSecurityUtil.java
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestUGIWithSecurityOn.java


> Add SecurityUtil methods to get/set authentication method
> -
>
> Key: HADOOP-9009
> URL: https://issues.apache.org/jira/browse/HADOOP-9009
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs, security
>Affects Versions: 2.0.0-alpha, 3.0.0
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
> Fix For: 3.0.0, 2.0.3-alpha
>
> Attachments: HADOOP-9009.patch
>
>
> The authentication method is handled as a string when an enum is available.  
> Adding methods to get/set the conf value based on the enum will simplify 
> adding new SASL auths such as PLAIN.
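
As an illustration of the idea (not the attached patch), enum-based helpers 
around the {{hadoop.security.authentication}} key might look roughly like the 
sketch below; the class and method names are assumptions, and the real 
SecurityUtil methods may differ.

{code:java}
// Hedged sketch: get/set the authentication method as an enum instead of a
// raw string. Not the actual SecurityUtil code.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation.AuthenticationMethod;

public final class AuthConfHelper {
  private static final String KEY = "hadoop.security.authentication";

  private AuthConfHelper() {}

  public static AuthenticationMethod getAuthenticationMethod(Configuration conf) {
    // Config values are lower case ("simple", "kerberos"); enum names are upper case.
    return AuthenticationMethod.valueOf(conf.get(KEY, "simple").toUpperCase());
  }

  public static void setAuthenticationMethod(AuthenticationMethod method,
                                             Configuration conf) {
    conf.set(KEY, method.name().toLowerCase());
  }
}
{code}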

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-9009) Add SecurityUtil methods to get/set authentication method

2012-11-05 Thread Robert Joseph Evans (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9009?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Joseph Evans updated HADOOP-9009:


   Resolution: Fixed
Fix Version/s: 2.0.3-alpha
   3.0.0
   Status: Resolved  (was: Patch Available)

Thanks, Daryn. I put this into trunk and branch-2.

> Add SecurityUtil methods to get/set authentication method
> -
>
> Key: HADOOP-9009
> URL: https://issues.apache.org/jira/browse/HADOOP-9009
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs, security
>Affects Versions: 2.0.0-alpha, 3.0.0
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
> Fix For: 3.0.0, 2.0.3-alpha
>
> Attachments: HADOOP-9009.patch
>
>
> The authentication method is handled as a string when an enum is available.  
> Adding methods to get/set the conf value based on the enum will simplify 
> adding new SASL auths such as PLAIN.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-9010) Map UGI authenticationMethod to RPC authMethod

2012-11-05 Thread Robert Joseph Evans (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Joseph Evans updated HADOOP-9010:


   Resolution: Fixed
Fix Version/s: 2.0.3-alpha
   3.0.0
   Status: Resolved  (was: Patch Available)

Thanks, Daryn,

I pulled this into branch-2 and trunk.

> Map UGI authenticationMethod to RPC authMethod
> --
>
> Key: HADOOP-9010
> URL: https://issues.apache.org/jira/browse/HADOOP-9010
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs, security
>Affects Versions: 0.23.0, 3.0.0, 2.0.3-alpha
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
> Fix For: 3.0.0, 2.0.3-alpha
>
> Attachments: HADOOP-9010.patch
>
>
> The UGI's authenticationMethod needs a forward mapping to the RPC/SASL 
> authMethod.  This will allow for the RPC client to eventually use the UGI's 
> authenticationMethod to derive the authMethod instead of assuming security on 
> is kerberos and security off is simple.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (HADOOP-9009) Add SecurityUtil methods to get/set authentication method

2012-11-05 Thread Robert Joseph Evans (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9009?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13490791#comment-13490791
 ] 

Robert Joseph Evans commented on HADOOP-9009:
-

Patch looks good to me. I am +1 on it. I'll check it in.

> Add SecurityUtil methods to get/set authentication method
> -
>
> Key: HADOOP-9009
> URL: https://issues.apache.org/jira/browse/HADOOP-9009
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: fs, security
>Affects Versions: 2.0.0-alpha, 3.0.0
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
> Attachments: HADOOP-9009.patch
>
>
> The authentication method is handled as a string when an enum is available.  
> Adding methods to get/set the conf value based on the enum will simplify 
> adding new SASL auths such as PLAIN.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Resolved] (HADOOP-8601) Extend TestShell to cover Windows shell commands

2012-11-05 Thread Chuan Liu (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-8601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chuan Liu resolved HADOOP-8601.
---

   Resolution: Implemented
Fix Version/s: 1-win

With HADOOP-8972, Windows shell commands will be covered by a separate test, 
TestWinUtils, so this JIRA is no longer needed.

> Extend TestShell to cover Windows shell commands
> 
>
> Key: HADOOP-8601
> URL: https://issues.apache.org/jira/browse/HADOOP-8601
> Project: Hadoop Common
>  Issue Type: Test
>Affects Versions: 1-win
>Reporter: Chuan Liu
>Assignee: Chuan Liu
> Fix For: 1-win
>
>
> The existing unit test only covers Linux shell commands. Since we are beginning 
> to support Windows, and use completely different commands there, it makes sense 
> to extend TestShell to cover Windows use cases.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Assigned] (HADOOP-9008) Building hadoop tarball fails on Windows

2012-11-05 Thread Chris Nauroth (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9008?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chris Nauroth reassigned HADOOP-9008:
-

Assignee: Chris Nauroth

> Building hadoop tarball fails on Windows
> 
>
> Key: HADOOP-9008
> URL: https://issues.apache.org/jira/browse/HADOOP-9008
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: trunk-win
>Reporter: Ivan Mitic
>Assignee: Chris Nauroth
>
> Trying to build Hadoop trunk tarball via {{mvn package -Pdist -DskipTests 
> -Dtar}} fails on Windows.
> The build system generates sh scripts that execute build tasks, which does not 
> work on Windows without Cygwin. It might make sense to apply the same pattern as 
> in HADOOP-8924 and use python instead of sh.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Updated] (HADOOP-9011) saveVersion.py does not include branch in version annotation

2012-11-05 Thread Chris Nauroth (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chris Nauroth updated HADOOP-9011:
--

Attachment: HADOOP-9011-branch-trunk-win.patch

This patch adds the branch to the version annotation.  This fixes 
TestYarnVersionInfo and the runtime errors.

> saveVersion.py does not include branch in version annotation
> 
>
> Key: HADOOP-9011
> URL: https://issues.apache.org/jira/browse/HADOOP-9011
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: trunk-win
>Reporter: Chris Nauroth
>Assignee: Chris Nauroth
> Attachments: HADOOP-9011-branch-trunk-win.patch
>
>
> HADOOP-8924 created saveVersion.py on branch-trunk-win.  Unlike 
> saveVersion.sh on trunk, it did not include the branch attribute in the 
> version annotation.  This causes errors at runtime for anything that tries to 
> read the annotation via VersionInfo.  This also causes a unit test failure in 
> TestYarnVersionInfo.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira