[jira] [Created] (HADOOP-9019) KerberosAuthenticator.doSpnegoSequence(..) should create a HTTP principal with hostname every time
Vinay created HADOOP-9019:
------------------------------

Summary: KerberosAuthenticator.doSpnegoSequence(..) should create a HTTP principal with hostname every time
Key: HADOOP-9019
URL: https://issues.apache.org/jira/browse/HADOOP-9019
Project: Hadoop Common
Issue Type: Bug
Reporter: Vinay

In KerberosAuthenticator.doSpnegoSequence(..), the following line of code creates a principal of the form HTTP/host:

{code}String servicePrincipal = KerberosUtil.getServicePrincipal(HTTP, KerberosAuthenticator.this.url.getHost());{code}

but url.getHost() is not guaranteed to return a hostname. If the URL contains an IP address, it simply returns that IP. For SPNEGO authentication, the principal should always be created with the hostname. The code should be something like the following, which consults /etc/hosts (or DNS) to resolve the hostname:

{code}String hostname = InetAddress.getByName(
    KerberosAuthenticator.this.url.getHost()).getHostName();
String servicePrincipal = KerberosUtil.getServicePrincipal(HTTP, hostname);{code}

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira
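The behavior the report describes can be seen with plain JDK calls; the class name and URL below are illustrative only, not taken from the Hadoop code:

```java
import java.net.InetAddress;
import java.net.URL;

public class SpnegoHostCheck {
    public static void main(String[] args) throws Exception {
        // url.getHost() returns the host component verbatim: an IP stays an IP.
        URL url = new URL("http://127.0.0.1:50070/webhdfs/v1/");
        System.out.println(url.getHost()); // prints 127.0.0.1

        // The proposed fix: resolve the host component back to a name
        // (consults /etc/hosts or DNS), so HTTP/<hostname> is used for SPNEGO.
        String hostname = InetAddress.getByName(url.getHost()).getHostName();
        System.out.println(hostname); // e.g. "localhost" on most systems
    }
}
```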
Build failed in Jenkins: Hadoop-Common-0.23-Build #426
See https://builds.apache.org/job/Hadoop-Common-0.23-Build/426/changes

Changes:

[acmurthy] Merge -c 1406834 from trunk to branch-2 to fix YARN-201. Fix CapacityScheduler to be less conservative for starved off-switch requests. Contributed by Jason Lowe.

--
[...truncated 11014 lines...]
[INFO]
[INFO] --- maven-clover2-plugin:3.0.5:clover (clover) @ hadoop-auth ---
[INFO] Using /default-clover-report descriptor.
[INFO] Using Clover report descriptor: /tmp/mvn4829612330517839291resource
[INFO] Clover Version 3.0.2, built on April 13 2010 (build-790)
[INFO] Loaded from: /home/jenkins/.m2/repository/com/cenqua/clover/clover/3.0.2/clover-3.0.2.jar
[INFO] Clover: Open Source License registered to Apache.
[INFO] Clover is enabled with initstring 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover/hadoop-coverage.db'
[WARNING] Clover historical directory [https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover/history] does not exist, skipping Clover historical report generation ([https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover])
[INFO] Clover Version 3.0.2, built on April 13 2010 (build-790)
[INFO] Loaded from: /home/jenkins/.m2/repository/com/cenqua/clover/clover/3.0.2/clover-3.0.2.jar
[INFO] Clover: Open Source License registered to Apache.
[INFO] Loading coverage database from: 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover/hadoop-coverage.db'
[INFO] Writing HTML report to 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover'
[INFO] Done. Processed 4 packages in 1100ms (275ms per package).
[INFO] Clover Version 3.0.2, built on April 13 2010 (build-790)
[INFO] Loaded from: /home/jenkins/.m2/repository/com/cenqua/clover/clover/3.0.2/clover-3.0.2.jar
[INFO] Clover: Open Source License registered to Apache.
[INFO] Clover is enabled with initstring 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover/hadoop-coverage.db'
[WARNING] Clover historical directory [https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover/history] does not exist, skipping Clover historical report generation ([https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover/clover.xml])
[INFO] Clover Version 3.0.2, built on April 13 2010 (build-790)
[INFO] Loaded from: /home/jenkins/.m2/repository/com/cenqua/clover/clover/3.0.2/clover-3.0.2.jar
[INFO] Clover: Open Source License registered to Apache.
[INFO] Loading coverage database from: 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover/hadoop-coverage.db'
[INFO] Writing report to 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth/target/clover/clover.xml'
[INFO]
[INFO]
[INFO] Building Apache Hadoop Auth Examples 0.23.5-SNAPSHOT
[INFO]
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-auth-examples ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-dependency-plugin:2.1:build-classpath (build-classpath) @ hadoop-auth-examples ---
[INFO] Wrote classpath file 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/classes/mrapp-generated-classpath'.
[INFO]
[INFO] --- maven-clover2-plugin:3.0.5:setup (setup) @ hadoop-auth-examples ---
[INFO] Clover Version 3.0.2, built on April 13 2010 (build-790)
[INFO] Loaded from: /home/jenkins/.m2/repository/com/cenqua/clover/clover/3.0.2/clover-3.0.2.jar
[INFO] Clover: Open Source License registered to Apache.
[INFO] Creating new database at 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/clover/hadoop-coverage.db'.
[INFO] Processing files at 1.6 source level.
[INFO] Clover all over. Instrumented 3 files (1 package).
[INFO] Elapsed time = 0.016 secs. (187.5 files/sec, 17,687.5 srclines/sec)
[INFO] No Clover instrumentation done on source files in: [https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-auth-examples/src/test/java] as no matching sources files found
[INFO]
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-auth-examples ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile
[jira] [Resolved] (HADOOP-8975) TestFileContextResolveAfs fails on Windows
[ https://issues.apache.org/jira/browse/HADOOP-8975?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Suresh Srinivas resolved HADOOP-8975.
-------------------------------------
Resolution: Fixed
Fix Version/s: trunk-win
Hadoop Flags: Reviewed

Committed the patch to branch-trunk-win. Thank you Chris.

TestFileContextResolveAfs fails on Windows
------------------------------------------
Key: HADOOP-8975
URL: https://issues.apache.org/jira/browse/HADOOP-8975
Project: Hadoop Common
Issue Type: Bug
Components: fs
Affects Versions: trunk-win
Reporter: Chris Nauroth
Assignee: Chris Nauroth
Fix For: trunk-win
Attachments: HADOOP-8975-branch-trunk-win.patch, HADOOP-8975-branch-trunk-win.patch

This appears to be a Windows-specific path parsing issue.
[jira] [Created] (HADOOP-9020) Add a SASL PLAIN server
Daryn Sharp created HADOOP-9020:
------------------------------

Summary: Add a SASL PLAIN server
Key: HADOOP-9020
URL: https://issues.apache.org/jira/browse/HADOOP-9020
Project: Hadoop Common
Issue Type: Sub-task
Components: ipc, security
Affects Versions: 2.0.0-alpha, 0.23.0, 3.0.0
Reporter: Daryn Sharp
Assignee: Daryn Sharp

Java includes a SASL PLAIN client but not a server.
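The server-side work such a mechanism needs is mostly parsing the PLAIN initial response, which RFC 4616 lays out as [authzid] NUL authcid NUL passwd. A minimal sketch of that parsing; the class and method names are hypothetical, not Hadoop's eventual API:

```java
import java.nio.charset.StandardCharsets;

public class PlainServerSketch {
    // Split a PLAIN initial response into [authzid, authcid, passwd].
    // The limit of 3 keeps any NUL bytes inside the password intact.
    static String[] parse(byte[] response) {
        String s = new String(response, StandardCharsets.UTF_8);
        String[] parts = s.split("\0", 3);
        if (parts.length != 3) {
            throw new IllegalArgumentException("malformed PLAIN response");
        }
        return parts;
    }

    public static void main(String[] args) {
        byte[] msg = "\0alice\0secret".getBytes(StandardCharsets.UTF_8);
        String[] p = parse(msg);
        System.out.println(p[1] + ":" + p[2]); // prints alice:secret
    }
}
```

A real server would wrap this in a javax.security.sasl.SaslServer registered through a security Provider, then hand the authcid/passwd pair to whatever credential check the RPC layer configures.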
[jira] [Created] (HADOOP-9021) Enforce configured SASL method on the server
Daryn Sharp created HADOOP-9021:
------------------------------

Summary: Enforce configured SASL method on the server
Key: HADOOP-9021
URL: https://issues.apache.org/jira/browse/HADOOP-9021
Project: Hadoop Common
Issue Type: Sub-task
Components: ipc, security
Affects Versions: 2.0.0-alpha, 0.23.0, 3.0.0
Reporter: Daryn Sharp
Assignee: Daryn Sharp

The RPC server needs to restrict itself to using only the configured SASL method.
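One way such enforcement could look, sketched as a hypothetical helper (not the actual Hadoop RPC code): reject any client-proposed mechanism that is not in the server's configured set before creating a SaslServer for it.

```java
import java.util.Set;

public class SaslMethodCheck {
    // Accept a client-proposed SASL mechanism only if the server
    // configuration permits it; otherwise fail the negotiation early.
    static String select(String proposed, Set<String> configured) {
        if (!configured.contains(proposed)) {
            throw new IllegalArgumentException(
                "SASL mechanism " + proposed + " not permitted by server config");
        }
        return proposed;
    }

    public static void main(String[] args) {
        System.out.println(select("PLAIN", Set.of("PLAIN"))); // prints PLAIN
    }
}
```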
[jira] [Created] (HADOOP-9022) Hadoop distcp tool fails to copy file if -m 0 specified
Haiyang Jiang created HADOOP-9022:
------------------------------

Summary: Hadoop distcp tool fails to copy file if -m 0 specified
Key: HADOOP-9022
URL: https://issues.apache.org/jira/browse/HADOOP-9022
Project: Hadoop Common
Issue Type: Bug
Affects Versions: 0.23.4, 0.23.3, 0.23.1
Reporter: Haiyang Jiang

When trying to copy a file using distcp on H23 with -m 0 specified, distcp spawns 0 map tasks and the file is not copied. This used to work before H23: even when -m 0 was specified, distcp would always copy the files.

Checked the code of DistCp.java. Before the rewrite, it set the number of maps to at least 1:

{code}job.setNumMapTasks(Math.max(numMaps, 1));{code}

But the newest code just takes the input from the user:

{code}job.getConfiguration().set(JobContext.NUM_MAPS, String.valueOf(inputOptions.getMaxMaps()));{code}
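The pre-rewrite guard amounts to a one-line clamp. A standalone version of that guard, with hypothetical names (this is not the actual DistCp patch):

```java
public class MapClamp {
    // Clamp the user-supplied map count to at least 1,
    // as the pre-rewrite DistCp did via Math.max(numMaps, 1).
    static int effectiveMaps(int requested) {
        return Math.max(requested, 1);
    }

    public static void main(String[] args) {
        System.out.println(effectiveMaps(0)); // prints 1
        System.out.println(effectiveMaps(5)); // prints 5
    }
}
```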