Build failed in Jenkins: Hadoop-Common-trunk #859

2013-08-13 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Common-trunk/859/changes

Changes:

[sandy] MAPREDUCE-5454. TestDFSIO fails intermittently on JDK7 (Karthik 
Kambatla via Sandy Ryza)

[tucu] HADOOP-9848. Create a MiniKDC for use with security testing. (ywskycn 
via tucu)

[tucu] HADOOP-9845. Update protobuf to 2.5 from 2.4.x. (tucu)

[wang] Fix CHANGES.txt for HADOOP-9847

[wang] TestGlobPath symlink tests fail to cleanup properly. (cmccabe via wang)

[kihwal] HADOOP-9583. test-patch gives +1 despite build failure when running 
tests. Contributed by Jason Lowe.

--
[...truncated 55355 lines...]
Adding reference: maven.plugin.classpath
Adding reference: maven.project
Adding reference: maven.project.helper
Adding reference: maven.local.repository
[DEBUG] Initialize Maven Ant Tasks
parsing buildfile 
jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.6/maven-antrun-plugin-1.6.jar!/org/apache/maven/ant/tasks/antlib.xml
 with URI = 
jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.6/maven-antrun-plugin-1.6.jar!/org/apache/maven/ant/tasks/antlib.xml
 from a zip file
parsing buildfile 
jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar!/org/apache/tools/ant/antlib.xml
 with URI = 
jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar!/org/apache/tools/ant/antlib.xml
 from a zip file
Class org.apache.maven.ant.tasks.AttachArtifactTask loaded from parent loader 
(parentFirst)
 +Datatype attachartifact org.apache.maven.ant.tasks.AttachArtifactTask
Class org.apache.maven.ant.tasks.DependencyFilesetsTask loaded from parent 
loader (parentFirst)
 +Datatype dependencyfilesets org.apache.maven.ant.tasks.DependencyFilesetsTask
Setting project property: test.build.dir - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-dir
Setting project property: test.exclude.pattern - _
Setting project property: hadoop.assemblies.version - 3.0.0-SNAPSHOT
Setting project property: test.exclude - _
Setting project property: distMgmtSnapshotsId - apache.snapshots.https
Setting project property: project.build.sourceEncoding - UTF-8
Setting project property: java.security.egd - file:///dev/urandom
Setting project property: distMgmtSnapshotsUrl - 
https://repository.apache.org/content/repositories/snapshots
Setting project property: distMgmtStagingUrl - 
https://repository.apache.org/service/local/staging/deploy/maven2
Setting project property: test.build.data - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-dir
Setting project property: commons-daemon.version - 1.0.13
Setting project property: hadoop.common.build.dir - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/../../hadoop-common-project/hadoop-common/target
Setting project property: testsThreadCount - 4
Setting project property: maven.test.redirectTestOutputToFile - true
Setting project property: jdiff.version - 1.0.9
Setting project property: build.platform - Linux-i386-32
Setting project property: distMgmtStagingName - Apache Release Distribution 
Repository
Setting project property: project.reporting.outputEncoding - UTF-8
Setting project property: protobuf.version - 2.5.0
Setting project property: failIfNoTests - false
Setting project property: distMgmtStagingId - apache.staging.https
Setting project property: distMgmtSnapshotsName - Apache Development Snapshot 
Repository
Setting project property: ant.file - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/pom.xml
[DEBUG] Setting properties with prefix: 
Setting project property: project.groupId - org.apache.hadoop
Setting project property: project.artifactId - hadoop-common-project
Setting project property: project.name - Apache Hadoop Common Project
Setting project property: project.description - Apache Hadoop Common Project
Setting project property: project.version - 3.0.0-SNAPSHOT
Setting project property: project.packaging - pom
Setting project property: project.build.directory - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target
Setting project property: project.build.outputDirectory - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/classes
Setting project property: project.build.testOutputDirectory - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-classes
Setting project property: project.build.sourceDirectory - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/src/main/java
Setting project property: project.build.testSourceDirectory - 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/src/test/java
Setting project property: localRepository -id: local
  url: file:///home/jenkins/.m2/repository/
   layout: none
Setting 

[jira] [Created] (HADOOP-9867) org.apache.hadoop.mapred.LineRecordReader does not handle multibyte record delimiters well

2013-08-13 Thread Kris Geusebroek (JIRA)
Kris Geusebroek created HADOOP-9867:
---

 Summary: org.apache.hadoop.mapred.LineRecordReader does not handle 
multibyte record delimiters well
 Key: HADOOP-9867
 URL: https://issues.apache.org/jira/browse/HADOOP-9867
 Project: Hadoop Common
  Issue Type: Bug
  Components: io
Affects Versions: 0.20.2
 Environment: CDH3U2 Redhat linux 5.7
Reporter: Kris Geusebroek


Having defined a record delimiter of multiple bytes in a new InputFileFormat 
sometimes has the effect of skipping records from the input.

This happens when an input split starts just after a record separator. The 
starting point for the next split is then non-zero and skipFirstLine is 
true. A seek into the file is done to start - 1 and the text until the first 
record delimiter is ignored (on the presumption that this record was already 
handled by the previous map task). Since the record delimiter is multibyte, the 
seek only brings the last byte of the delimiter into scope and it is not 
recognized as a full delimiter. So the text is skipped until the next 
delimiter, ignoring a full record!
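The skip can be reproduced with a simplified scan (illustrative Java, not the actual LineRecordReader code; the class and helper names below are hypothetical, mimicking "skip until the first full delimiter"):

```java
public class DelimiterSkipDemo {
    // Return the index just past the next full occurrence of delim at or after 'from'.
    static int skipPastNextDelimiter(byte[] data, int from, byte[] delim) {
        for (int i = from; i + delim.length <= data.length; i++) {
            boolean match = true;
            for (int j = 0; j < delim.length; j++)
                if (data[i + j] != delim[j]) { match = false; break; }
            if (match) return i + delim.length;
        }
        return data.length;
    }

    public static void main(String[] args) {
        byte[] delim = "\r\n".getBytes();
        byte[] data = "rec1\r\nrec2\r\nrec3".getBytes();
        int splitStart = 6;               // this split begins right after the first delimiter
        // Buggy behavior: seek to splitStart - 1, i.e. onto the '\n' of the delimiter.
        // The lone '\n' is not a full "\r\n", so the scan runs past all of rec2.
        int buggyStart = skipPastNextDelimiter(data, splitStart - 1, delim);
        System.out.println(buggyStart);   // 12 -> rec2 was silently dropped
        // Seeking back the full delimiter length sees the complete "\r\n":
        int fixedStart = skipPastNextDelimiter(data, splitStart - delim.length, delim);
        System.out.println(fixedStart);   // 6 -> rec2 is read by this split
    }
}
```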


--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Created] (HADOOP-9868) Server must not advertise kerberos realm

2013-08-13 Thread Daryn Sharp (JIRA)
Daryn Sharp created HADOOP-9868:
---

 Summary: Server must not advertise kerberos realm
 Key: HADOOP-9868
 URL: https://issues.apache.org/jira/browse/HADOOP-9868
 Project: Hadoop Common
  Issue Type: Bug
  Components: ipc
Affects Versions: 3.0.0, 2.1.1-beta
Reporter: Daryn Sharp
Assignee: Daryn Sharp
Priority: Blocker


HADOOP-9789 broke kerberos authentication by making the RPC server advertise 
the kerberos service principal realm.  SASL clients and servers do not support 
specifying a realm, so it must be removed from the advertisement.
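As a hedged illustration of the fix direction (not the actual patch; PrincipalUtil and stripRealm are hypothetical names), removing the realm from an advertised principal is a simple string operation:

```java
public class PrincipalUtil {
    // Hypothetical helper: reduce "service/host@REALM" to "service/host"
    // so the advertised principal carries no realm.
    static String stripRealm(String principal) {
        int at = principal.indexOf('@');
        return (at < 0) ? principal : principal.substring(0, at);
    }

    public static void main(String[] args) {
        System.out.println(stripRealm("nn/namenode.example.com@EXAMPLE.COM"));
        // prints: nn/namenode.example.com
    }
}
```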



[jira] [Resolved] (HADOOP-9789) Support server advertised kerberos principals

2013-08-13 Thread Daryn Sharp (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9789?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daryn Sharp resolved HADOOP-9789.
-

Resolution: Fixed

Will be fixed by HADOOP-9868.

 Support server advertised kerberos principals
 -

 Key: HADOOP-9789
 URL: https://issues.apache.org/jira/browse/HADOOP-9789
 Project: Hadoop Common
  Issue Type: New Feature
  Components: ipc, security
Affects Versions: 2.0.0-alpha, 3.0.0
Reporter: Daryn Sharp
Assignee: Daryn Sharp
Priority: Critical
 Fix For: 3.0.0, 2.1.1-beta

 Attachments: HADOOP-9789.2.patch, HADOOP-9789.patch, 
 HADOOP-9789.patch, hadoop-ojoshi-datanode-HW10351.local.log, 
 hadoop-ojoshi-namenode-HW10351.local.log


 The RPC client currently constructs the kerberos principal based on the a 
 config value, usually with an _HOST substitution.  This means the service 
 principal must match the hostname the client is using to connect.  This 
 causes problems:
 * Prevents using HA with IP failover when the servers have distinct 
 principals from the failover hostname
 * Prevents clients from being able to access a service bound to multiple 
 interfaces.  Only the interface that matches the server's principal may be 
 used.
 The client should be able to use the SASL advertised principal (HADOOP-9698), 
 with appropriate safeguards, to acquire the correct service ticket.



[jira] [Created] (HADOOP-9869) Configuration.getSocketAddr() should use getTrimmed()

2013-08-13 Thread Steve Loughran (JIRA)
Steve Loughran created HADOOP-9869:
--

 Summary:  Configuration.getSocketAddr() should use getTrimmed()
 Key: HADOOP-9869
 URL: https://issues.apache.org/jira/browse/HADOOP-9869
 Project: Hadoop Common
  Issue Type: Improvement
  Components: conf
Affects Versions: 3.0.0, 2.1.0-beta, 1.3.0
Reporter: Steve Loughran
Priority: Minor


YARN-1059 has shown that the hostname:port string used for the address of 
things like the RM isn't trimmed before it's parsed, leading to errors that 
aren't that obvious. 

We should trim it - it's clearly not going to break any existing (valid) 
configurations.
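A minimal sketch of the proposed behavior, assuming the usual host:port layout (SocketAddrDemo and parseHostPort are illustrative names, not Hadoop's actual Configuration API):

```java
public class SocketAddrDemo {
    // Hypothetical sketch of the proposed fix: trim the configured value
    // (getTrimmed() semantics) before splitting it into host and port.
    static String[] parseHostPort(String configured) {
        String trimmed = configured.trim();
        int colon = trimmed.lastIndexOf(':');
        return new String[] { trimmed.substring(0, colon), trimmed.substring(colon + 1) };
    }

    public static void main(String[] args) {
        // trailing whitespace, as might be left in an XML config value
        String[] hp = parseHostPort("rmhost:8032 ");
        System.out.println(hp[0] + " / " + hp[1]);  // prints: rmhost / 8032
    }
}
```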



[jira] [Created] (HADOOP-9870) Mixed configurations for JVM -Xmx in hadoop command

2013-08-13 Thread Wei Yan (JIRA)
Wei Yan created HADOOP-9870:
---

 Summary: Mixed configurations for JVM -Xmx in hadoop command
 Key: HADOOP-9870
 URL: https://issues.apache.org/jira/browse/HADOOP-9870
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Wei Yan


When we use hadoop command to launch a class, there are two places setting the 
-Xmx configuration.

*1*. The first place is located in file 
{{hadoop-common-project/hadoop-common/src/main/bin/hadoop}}.
{code}
exec $JAVA $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS $@
{code}
Here $JAVA_HEAP_MAX is configured in hadoop-config.sh 
({{hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh}}). The 
default value is -Xmx1000m.

*2*. The second place is set with $HADOOP_OPTS in file 
{{hadoop-common-project/hadoop-common/src/main/bin/hadoop}}.
{code}
HADOOP_OPTS="$HADOOP_OPTS $HADOOP_CLIENT_OPTS"
{code}
Here $HADOOP_CLIENT_OPTS is set in hadoop-env.sh 
({{hadoop-common-project/hadoop-common/src/main/conf/hadoop-env.sh}})
{code}
export HADOOP_CLIENT_OPTS="-Xmx512m $HADOOP_CLIENT_OPTS"
{code}

Currently the final default java command looks like:
{code}java -Xmx1000m  -Xmx512m CLASS_NAME ARGUMENTS{code}

And if users also specify -Xmx in $HADOOP_CLIENT_OPTS, there will be 
three -Xmx configurations. 

The hadoop setup tutorial only discusses hadoop-env.sh, and it appears that users 
should not make any changes in hadoop-config.sh.

We should make hadoop smart enough to choose the right one before launching the 
java command, instead of leaving the decision to the JVM.
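One hedged sketch of what "choosing the right one" could look like: keep only the last -Xmx before building the command line. Note that on most JVMs a later -Xmx already overrides an earlier one, so this mainly removes the confusing duplicates (HeapOptsDemo and dedupeXmx are hypothetical names):

```java
import java.util.ArrayList;
import java.util.List;

public class HeapOptsDemo {
    // Hypothetical sketch: keep only the last -Xmx so the launched
    // command carries a single, unambiguous heap setting.
    static List<String> dedupeXmx(List<String> opts) {
        int last = -1;
        for (int i = 0; i < opts.size(); i++)
            if (opts.get(i).startsWith("-Xmx")) last = i;
        List<String> out = new ArrayList<>();
        for (int i = 0; i < opts.size(); i++)
            if (!opts.get(i).startsWith("-Xmx") || i == last) out.add(opts.get(i));
        return out;
    }

    public static void main(String[] args) {
        System.out.println(dedupeXmx(List.of("-Xmx1000m", "-Xmx512m", "-Dfoo=bar")));
        // prints: [-Xmx512m, -Dfoo=bar]
    }
}
```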



Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Alejandro Abdelnur
There is no indication that protoc 2.5.0 is breaking anything.

Hadoop-trunk builds have been failing well before the halfway point with:

---


[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test
(default-test) on project hadoop-yarn-client: ExecutionException;
nested exception is java.util.concurrent.ExecutionException:
java.lang.RuntimeException: The forked VM terminated without saying
properly goodbye. VM crash or System.exit called ? - [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to
execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test
(default-test) on project hadoop-yarn-client: ExecutionException;
nested exception is java.util.concurrent.ExecutionException:
java.lang.RuntimeException: The forked VM terminated without saying
properly goodbye. VM crash or System.exit called ?

---


The Hadoop-trunk #480 build failed with a JVM abort in a testcase towards
the end of mapreduce tests.

Until then there were no failures at all.

I've increased the heap size and tried a second run; the failure occurred earlier.

I've looked at Hadoop-trunk builds prior to HADOOP-9845 and they have been
failing the same way in all the kept builds.

We need to fix the Hadoop-trunk builds independently of this.

Any objection to commit HADOOP-9845 to branch-2 and the 2.1.0-beta branches
to get all the other jenkins jobs working?

I'll wait till tomorrow morning before proceeding.

Thx




On Mon, Aug 12, 2013 at 8:35 PM, Alejandro Abdelnur t...@cloudera.com wrote:

 Jenkins is running a full test run on trunk using protoc 2.5.0.

   https://builds.apache.org/job/Hadoop-trunk/480

 And it seems to be going just fine.

 If everything looks OK, I'm planning to backport HADOOP-9845 to the
 2.1.0-beta branch midday PST tomorrow. This will resolve all build
 failures due to the protoc mismatch.

 Thanks.

 Alejandro


 On Mon, Aug 12, 2013 at 5:53 PM, Alejandro Abdelnur t...@cloudera.com wrote:

 shooting to get it in for 2.1.0.

 at the moment it is in trunk till the nightly finishes. then we'll decide

 in the meantime, you can have multiple versions installed in different dirs
 and set the right one in the path

 thx

 Alejandro
 (phone typing)

 On Aug 12, 2013, at 17:47, Konstantin Shvachko shv.had...@gmail.com
 wrote:

  Ok. After installing protobuf 2.5.0 I can compile trunk.
  But now I cannot compile Hadoop-2 branches. None of them.
  So if I switch between branches I need to reinstall protobuf?
 
  Is there a consensus about going towards protobuf 2.5.0 upgrade in ALL
  versions?
  I did not get definite impression there is.
  If not it could be a pretty big disruption.
 
  Thanks,
  --Konst
 
 
 
  On Mon, Aug 12, 2013 at 3:19 PM, Alejandro Abdelnur t...@cloudera.com
 wrote:
 
  I've just committed HADOOP-9845 to trunk (only trunk at the moment).
 
  To build trunk now you need protoc 2.5.0 (the build will fail with a
  warning if you don't have it).
 
  We'd propagate this to the 2 branches once the precommit build is back
 to
  normal and see things are OK.
 
  Thanks.
 
 
  On Mon, Aug 12, 2013 at 2:57 PM, Alejandro Abdelnur t...@cloudera.com
  wrote:
 
  About to commit HADOOP-9845 to trunk, in 5 mins. This will make trunk
 use
  protoc 2.5.0.
 
  thx
 
 
  On Mon, Aug 12, 2013 at 11:47 AM, Giridharan Kesavan 
  gkesa...@hortonworks.com wrote:
 
  I can take care of re-installing 2.4 and installing 2.5 in a
 different
  location. This would fix 2.0 branch builds as well.
  Thoughts?
 
  -Giri
 
 
  On Mon, Aug 12, 2013 at 11:37 AM, Alejandro Abdelnur 
 t...@cloudera.com
  wrote:
 
  Giri,
 
  first of all, thanks for installing protoc 2.5.0.
 
  I didn't know we were installing them as the only version and not
  driven by
  env/path settings.
 
  Now we have a bit of a problem, precommit builds are broken because
 of
  mismatch of protoc (2.5.0) and the protobuf JAR (2.4.1).
 
  We have two options:
 
  1* commit HADOOP-9845 that will bring protobuf to 2.5.0 and iron out
  any
  follow up issues.
  2* reinstall protoc 2.4.1 in the jenkins machines and have 2.4.1 and
  2.5.0
  coexisting
 
  My take would be to commit HADOOP-9845 in trunk, iron out any issues
  and then merge it to the other branches.
 
  We need to sort this out quickly as precommits are not working.
 
  I'll wait till 3PM today  for objections to option #1, if none I'll
  commit
  it to trunk.
 
  Thanks.
 
  Alejandro
 
 
 
  On Mon, Aug 12, 2013 at 11:30 AM, Giridharan Kesavan 
  gkesa...@hortonworks.com wrote:
 
  Like I said, protoc is upgraded from 2.4 to 2.5; 2.5 is in the
  default path.
  If we still need 2.4 I may have to install it. Let me know.
 
  -Giri
 
 
  On Sat, Aug 10, 2013 at 7:01 AM, Alejandro Abdelnur 
  t...@cloudera.com
  wrote:
 
  thanks giri, how do we set 2.4 or 2.5? what is the path to both so we
  can use an env var to set it in the jobs?
 
  thx
 
  Alejandro
  (phone typing)
 
  On Aug 9, 2013, at 23:10, Giridharan Kesavan 
  gkesa...@hortonworks.com
 
  wrote:

[jira] [Created] (HADOOP-9871) Fix intermittent findbug warnings in DefaultMetricsSystem

2013-08-13 Thread Luke Lu (JIRA)
Luke Lu created HADOOP-9871:
---

 Summary: Fix intermittent findbug warnings in DefaultMetricsSystem
 Key: HADOOP-9871
 URL: https://issues.apache.org/jira/browse/HADOOP-9871
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Luke Lu
Assignee: Junping Du
Priority: Minor


Findbugs sometimes (not always) picks up warnings from DefaultMetricsSystem due 
to some of the fields not being transient in a serializable class 
(DefaultMetricsSystem is an enum, which is serializable). 
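A small sketch of the likely fix direction, under the assumption that the warning concerns non-transient fields on the enum (the class and fields below are illustrative, not the real DefaultMetricsSystem):

```java
import java.util.HashMap;
import java.util.Map;

// An enum is implicitly Serializable, so findbugs can flag mutable
// non-transient instance fields on it. Marking such caches transient
// is harmless: enum serialization only ever writes the constant name,
// never the instance fields.
enum DemoMetricsSystem {
    INSTANCE;

    private final transient Map<String, Object> sources = new HashMap<>();

    void register(String name, Object source) { sources.put(name, source); }
    int numSources() { return sources.size(); }
}
```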



Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Alejandro Abdelnur
Steve, this is a version issue, if you get protoc 2.5.0 in your PATH things
will work.

Apologies for the hiccups until we get all this sorted out; we had some
miscommunication on how to install protoc on the jenkins boxes and instead
of having 2.4.1 and 2.5.0 side by side we got only 2.5.0.

By tomorrow we should have things mostly sorted out.

Thanks


On Tue, Aug 13, 2013 at 3:29 PM, Steve Loughran ste...@hortonworks.com wrote:

 On 13 August 2013 13:09, Alejandro Abdelnur t...@cloudera.com wrote:

  There is no indication that protoc 2.5.0 is breaking anything.
 


 clearly then this is not a stack trace:

 [INFO]
 
 [INFO] Building Apache Hadoop Common 3.0.0-SNAPSHOT
 [INFO]
 
 Downloading:

 https://repository.apache.org/content/repositories/snapshots/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.pom
 Downloading:

 http://repository.jboss.org/nexus/content/groups/public/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.pom
 Downloading:

 http://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.pom
 Downloaded:

 http://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.pom(9
 KB at 185.9 KB/sec)
 Downloading:

 https://repository.apache.org/content/repositories/snapshots/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
 Downloading:

 http://repository.jboss.org/nexus/content/groups/public/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
 Downloading:

 http://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
 Downloaded:

 http://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar(521
 KB at 7039.9 KB/sec)
 [INFO]
 [INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-common
 ---
 [INFO] Deleting

 /Users/stevel/Projects/hadoop-trunk/hadoop-common-project/hadoop-common/target
 [INFO]
 [INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-common
 ---
 [INFO] Executing tasks

 main:
 [mkdir] Created dir:

 /Users/stevel/Projects/hadoop-trunk/hadoop-common-project/hadoop-common/target/test-dir
 [mkdir] Created dir:

 /Users/stevel/Projects/hadoop-trunk/hadoop-common-project/hadoop-common/target/test/data
 [INFO] Executed tasks
 [INFO]
 [INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @
 hadoop-common ---
 [WARNING] [protoc, --version] failed with error code 1
 [ERROR] protoc, could not get version
 [INFO]
 
 [INFO] Reactor Summary:
 [INFO]


 Assuming this is just a versioning issue, can you update the documentation
 in the wiki 

 http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/README
 to be consistent with the current protobuf requirements. I do want to
 follow the instructions not just because I am lazy, but because I want to
 manually test the installation process itself.

 Once that's done I will try to follow these instructions to get protobuf
 2.5 installed on my homebrew-managed mac.

 --
 CONFIDENTIALITY NOTICE
 NOTICE: This message is intended for the use of the individual or entity to
 which it is addressed and may contain information that is confidential,
 privileged and exempt from disclosure under applicable law. If the reader
 of this message is not the intended recipient, you are hereby notified that
 any printing, copying, dissemination, distribution, disclosure or
 forwarding of this communication is strictly prohibited. If you have
 received this communication in error, please contact the sender immediately
 and delete it from your system. Thank You.




-- 
Alejandro


Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Luke Lu
I've verified that it's only a version issue (tested on an Ubuntu 12.04 VM):
as long as you have 2.5.0 protoc, it works. BTW, the version check is a
little too strict. I had protobuf 2.5.1 (trunk) installed for 2.5 tests
and the exact-match check broke my build.
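A sketch of the laxer check being suggested: accept any patch release of the required major.minor rather than an exact match (the class and method names are hypothetical; the real check lives in Hadoop's build tooling):

```java
public class ProtocVersionCheck {
    // Hypothetical laxer check: a protoc is compatible when its
    // major.minor matches the required version, whatever the patch level.
    static boolean isCompatible(String required, String actual) {
        String[] r = required.split("\\.");
        String[] a = actual.split("\\.");
        return r[0].equals(a[0]) && r[1].equals(a[1]);
    }

    public static void main(String[] args) {
        System.out.println(isCompatible("2.5.0", "2.5.1")); // prints: true
        System.out.println(isCompatible("2.5.0", "2.4.1")); // prints: false
    }
}
```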





Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Steve Loughran
On 13 August 2013 16:20, Alejandro Abdelnur t...@cloudera.com wrote:

 Steve, this is a version issue, if you get protoc 2.5.0 in your PATH things
 will  work.


I assume that, but as the YARN docs still talk about 0.24, they need to be
updated too





[jira] [Resolved] (HADOOP-9346) Upgrading to protoc 2.5.0 fails the build

2013-08-13 Thread Harsh J (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9346?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Harsh J resolved HADOOP-9346.
-

Resolution: Duplicate

Thanks for pinging, Ravi. I'd discussed with Alejandro that this could be 
closed. Looks like we added a dupe link but failed to close it. Closing now.

 Upgrading to protoc 2.5.0 fails the build
 -

 Key: HADOOP-9346
 URL: https://issues.apache.org/jira/browse/HADOOP-9346
 Project: Hadoop Common
  Issue Type: Task
  Components: build
Affects Versions: 3.0.0
Reporter: Harsh J
Assignee: Harsh J
Priority: Minor
  Labels: protobuf
 Attachments: HADOOP-9346.patch


 Reported over the impala lists, one of the errors received is:
 {code}
 src/hadoop-common-project/hadoop-common/target/generated-sources/java/org/apache/hadoop/ha/proto/ZKFCProtocolProtos.java:[104,37]
  cannot find symbol
 symbol: class Parser
 location: package com.google.protobuf
 {code}
  Worth looking into, as we'll eventually bump our protobuf deps.
