Build failed in Jenkins: Hadoop-Common-trunk #859

2013-08-13 Thread Apache Jenkins Server
See 

Changes:

[sandy] MAPREDUCE-5454. TestDFSIO fails intermittently on JDK7 (Karthik 
Kambatla via Sandy Ryza)

[tucu] HADOOP-9848. Create a MiniKDC for use with security testing. (ywskycn 
via tucu)

[tucu] HADOOP-9845. Update protobuf to 2.5 from 2.4.x. (tucu)

[wang] Fix CHANGES.txt for HADOOP-9847

[wang] TestGlobPath symlink tests fail to cleanup properly. (cmccabe via wang)

[kihwal] HADOOP-9583. test-patch gives +1 despite build failure when running 
tests. Contributed by Jason Lowe.

--
[...truncated 55355 lines...]
Adding reference: maven.plugin.classpath
Adding reference: maven.project
Adding reference: maven.project.helper
Adding reference: maven.local.repository
[DEBUG] Initialize Maven Ant Tasks
parsing buildfile 
jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.6/maven-antrun-plugin-1.6.jar!/org/apache/maven/ant/tasks/antlib.xml
 with URI = 
jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.6/maven-antrun-plugin-1.6.jar!/org/apache/maven/ant/tasks/antlib.xml
 from a zip file
parsing buildfile 
jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar!/org/apache/tools/ant/antlib.xml
 with URI = 
jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.1/ant-1.8.1.jar!/org/apache/tools/ant/antlib.xml
 from a zip file
Class org.apache.maven.ant.tasks.AttachArtifactTask loaded from parent loader 
(parentFirst)
 +Datatype attachartifact org.apache.maven.ant.tasks.AttachArtifactTask
Class org.apache.maven.ant.tasks.DependencyFilesetsTask loaded from parent 
loader (parentFirst)
 +Datatype dependencyfilesets org.apache.maven.ant.tasks.DependencyFilesetsTask
Setting project property: test.build.dir -> 

Setting project property: test.exclude.pattern -> _
Setting project property: hadoop.assemblies.version -> 3.0.0-SNAPSHOT
Setting project property: test.exclude -> _
Setting project property: distMgmtSnapshotsId -> apache.snapshots.https
Setting project property: project.build.sourceEncoding -> UTF-8
Setting project property: java.security.egd -> file:///dev/urandom
Setting project property: distMgmtSnapshotsUrl -> 
https://repository.apache.org/content/repositories/snapshots
Setting project property: distMgmtStagingUrl -> 
https://repository.apache.org/service/local/staging/deploy/maven2
Setting project property: test.build.data -> 

Setting project property: commons-daemon.version -> 1.0.13
Setting project property: hadoop.common.build.dir -> 

Setting project property: testsThreadCount -> 4
Setting project property: maven.test.redirectTestOutputToFile -> true
Setting project property: jdiff.version -> 1.0.9
Setting project property: build.platform -> Linux-i386-32
Setting project property: distMgmtStagingName -> Apache Release Distribution 
Repository
Setting project property: project.reporting.outputEncoding -> UTF-8
Setting project property: protobuf.version -> 2.5.0
Setting project property: failIfNoTests -> false
Setting project property: distMgmtStagingId -> apache.staging.https
Setting project property: distMgmtSnapshotsName -> Apache Development Snapshot 
Repository
Setting project property: ant.file -> 

[DEBUG] Setting properties with prefix: 
Setting project property: project.groupId -> org.apache.hadoop
Setting project property: project.artifactId -> hadoop-common-project
Setting project property: project.name -> Apache Hadoop Common Project
Setting project property: project.description -> Apache Hadoop Common Project
Setting project property: project.version -> 3.0.0-SNAPSHOT
Setting project property: project.packaging -> pom
Setting project property: project.build.directory -> 

Setting project property: project.build.outputDirectory -> 

Setting project property: project.build.testOutputDirectory -> 

Setting project property: project.build.sourceDirectory -> 

Setting project property: project.build.testSourceDirectory -> 

Setting project property: localRepository ->id: local
  url: file:///hom

[jira] [Created] (HADOOP-9867) org.apache.hadoop.mapred.LineRecordReader does not handle multibyte record delimiters well

2013-08-13 Thread Kris Geusebroek (JIRA)
Kris Geusebroek created HADOOP-9867:
---

 Summary: org.apache.hadoop.mapred.LineRecordReader does not handle 
multibyte record delimiters well
 Key: HADOOP-9867
 URL: https://issues.apache.org/jira/browse/HADOOP-9867
 Project: Hadoop Common
  Issue Type: Bug
  Components: io
Affects Versions: 0.20.2
 Environment: CDH3U2 Redhat linux 5.7
Reporter: Kris Geusebroek


Defining a record delimiter of multiple bytes in a new InputFileFormat 
sometimes has the effect of skipping records from the input.

This happens when an input split boundary falls just after a record delimiter. 
The starting point for the next split is then non-zero and skipFirstLine is 
true. A seek into the file is done to start - 1, and the text up to the first 
record delimiter is ignored (on the presumption that this record was already 
handled by the previous map task). Because the record delimiter is multibyte, 
the seek brings only the last byte of the delimiter into scope, and it is not 
recognized as a full delimiter. So the text is skipped until the next 
delimiter, silently dropping an entire record.
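The failure mode can be sketched with a small, self-contained simulation 
(hypothetical helper and data names; this is not the actual LineRecordReader 
code). With a two-byte delimiter, seeking to start - 1 lands on the 
delimiter's last byte, which alone is not recognized, so the scan discards 
everything up to the next delimiter:

```java
public class DelimiterSkipDemo {

    // Return the index just past the first full delimiter at or after pos.
    static int skipPastDelimiter(byte[] data, int pos, byte[] delim) {
        for (int i = pos; i <= data.length - delim.length; i++) {
            boolean match = true;
            for (int j = 0; j < delim.length; j++) {
                if (data[i + j] != delim[j]) { match = false; break; }
            }
            if (match) return i + delim.length;
        }
        return data.length;
    }

    public static void main(String[] args) {
        byte[] delim = "#!".getBytes();                 // two-byte delimiter
        byte[] data = "rec1#!rec2#!rec3".getBytes();
        int splitStart = 6;                             // right after "rec1#!"
        // skipFirstLine behavior: seek to start - 1, discard up to a delimiter.
        int readFrom = skipPastDelimiter(data, splitStart - 1, delim);
        // The byte at start - 1 is only the tail of "#!", so the scan runs on
        // past rec2's delimiter: rec2 is silently dropped.
        System.out.println(new String(data, readFrom, data.length - readFrom));
    }
}
```

With a one-byte delimiter the same seek lands exactly on a complete 
delimiter, so only that delimiter is discarded and rec2 would be read 
normally.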


--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Created] (HADOOP-9868) Server must not advertise kerberos realm

2013-08-13 Thread Daryn Sharp (JIRA)
Daryn Sharp created HADOOP-9868:
---

 Summary: Server must not advertise kerberos realm
 Key: HADOOP-9868
 URL: https://issues.apache.org/jira/browse/HADOOP-9868
 Project: Hadoop Common
  Issue Type: Bug
  Components: ipc
Affects Versions: 3.0.0, 2.1.1-beta
Reporter: Daryn Sharp
Assignee: Daryn Sharp
Priority: Blocker


HADOOP-9789 broke kerberos authentication by making the RPC server advertise 
the kerberos service principal realm.  SASL clients and servers do not support 
specifying a realm, so it must be removed from the advertisement.
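For illustration only (this is not the HADOOP-9868 patch, and the principal 
string is a made-up example): the SASL-facing identity is just the 
service/host pair, with the realm dropped before anything is advertised:

```java
public class PrincipalParts {

    // Split "service/host@REALM" into the service and host parts SASL
    // actually uses, discarding the realm.
    static String[] saslParts(String principal) {
        String withoutRealm = principal.split("@", 2)[0];
        return withoutRealm.split("/", 2);
    }

    public static void main(String[] args) {
        String[] p = saslParts("nn/namenode.example.com@EXAMPLE.COM");
        System.out.println(p[0] + " " + p[1]);  // service and host, no realm
    }
}
```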

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Resolved] (HADOOP-9789) Support server advertised kerberos principals

2013-08-13 Thread Daryn Sharp (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9789?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daryn Sharp resolved HADOOP-9789.
-

Resolution: Fixed

Will be fixed by HADOOP-9868.

> Support server advertised kerberos principals
> -
>
> Key: HADOOP-9789
> URL: https://issues.apache.org/jira/browse/HADOOP-9789
> Project: Hadoop Common
>  Issue Type: New Feature
>  Components: ipc, security
>Affects Versions: 2.0.0-alpha, 3.0.0
>Reporter: Daryn Sharp
>Assignee: Daryn Sharp
>Priority: Critical
> Fix For: 3.0.0, 2.1.1-beta
>
> Attachments: HADOOP-9789.2.patch, HADOOP-9789.patch, 
> HADOOP-9789.patch, hadoop-ojoshi-datanode-HW10351.local.log, 
> hadoop-ojoshi-namenode-HW10351.local.log
>
>
> The RPC client currently constructs the kerberos principal based on a 
> config value, usually with an _HOST substitution.  This means the service 
> principal must match the hostname the client is using to connect.  This 
> causes problems:
> * Prevents using HA with IP failover when the servers have distinct 
> principals from the failover hostname
> * Prevents clients from being able to access a service bound to multiple 
> interfaces.  Only the interface that matches the server's principal may be 
> used.
> The client should be able to use the SASL advertised principal (HADOOP-9698), 
> with appropriate safeguards, to acquire the correct service ticket.
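The _HOST mechanism described above amounts to roughly the following (a 
simplified sketch, not Hadoop's actual SecurityUtil code; principal and 
hostname are invented examples). It makes the principal a function of 
whatever hostname the client happens to connect to:

```java
public class PrincipalSubstitution {

    // Substitute the connect-time hostname into a configured principal
    // pattern such as "nn/_HOST@EXAMPLE.COM".
    static String serverPrincipal(String configuredPattern, String connectHost) {
        return configuredPattern.replace("_HOST", connectHost);
    }

    public static void main(String[] args) {
        // Connecting via a failover or secondary-interface address yields a
        // principal the server may not actually hold -- the problem above.
        System.out.println(serverPrincipal("nn/_HOST@EXAMPLE.COM", "nn1.example.com"));
    }
}
```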

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Created] (HADOOP-9869) Configuration.getSocketAddr() should use getTrimmed()

2013-08-13 Thread Steve Loughran (JIRA)
Steve Loughran created HADOOP-9869:
--

 Summary:  Configuration.getSocketAddr() should use getTrimmed()
 Key: HADOOP-9869
 URL: https://issues.apache.org/jira/browse/HADOOP-9869
 Project: Hadoop Common
  Issue Type: Improvement
  Components: conf
Affects Versions: 3.0.0, 2.1.0-beta, 1.3.0
Reporter: Steve Loughran
Priority: Minor


YARN-1059 has shown that the hostname:port string used for the address of 
things like the RM isn't trimmed before it's parsed, leading to errors that 
aren't that obvious.

We should trim it; it's clearly not going to break any existing (valid) 
configurations.
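A minimal sketch of the failure mode (hypothetical helper, not the 
Configuration code; the address is an invented example): whitespace copied 
from an XML config makes the port unparseable unless the string is trimmed 
first:

```java
public class SocketAddrTrim {

    // Parse the port out of a "host:port" string, trimming first --
    // the fix proposed above.
    static int parsePort(String hostPort) {
        String trimmed = hostPort.trim();
        return Integer.parseInt(trimmed.substring(trimmed.indexOf(':') + 1));
    }

    public static void main(String[] args) {
        String fromConfig = " rm.example.com:8032\n";   // whitespace from the XML
        // Without the trim(), Integer.parseInt("8032\n") throws a
        // NumberFormatException whose message says nothing about whitespace.
        System.out.println(parsePort(fromConfig));
    }
}
```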

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


Re: [VOTE] Release Apache Hadoop 2.1.0-beta

2013-08-13 Thread Karthik Kambatla
Hi Arun,

Would it be possible to include YARN-1056 in the next RC? It is a
straightforward config change. I marked it as a blocker for 2.1.0.

Thanks
Karthik


On Thu, Aug 8, 2013 at 8:14 AM, Kihwal Lee  wrote:

> Another blocker, HADOOP-9850, has been committed.
>
>
> Kihwal
>
>
> 
>  From: Arun C Murthy 
> To: Daryn Sharp 
> Cc: "" ; "
> mapreduce-...@hadoop.apache.org" ; "
> yarn-...@hadoop.apache.org" ; "
> common-dev@hadoop.apache.org" 
> Sent: Thursday, August 1, 2013 1:30 PM
> Subject: Re: [VOTE] Release Apache Hadoop 2.1.0-beta
>
>
> Ok, thanks for the heads up, Daryn. I'll spin an RC2 once HADOOP-9816 gets in;
> I'd appreciate it if you could help push the fix in ASAP.
>
> Thanks again!
>
> Arun
>
> On Aug 1, 2013, at 9:38 AM, Daryn Sharp  wrote:
>
> > I broke RPC QOP for integrity and privacy options. :(  See blocker
> HADOOP-9816.  I think I understand the problem and it shouldn't be hard to
> fix.
> >
> > The bug went unnoticed because sadly there are no unit tests for the QOP
> options, even though it just involves a conf setting.
> >
> > Daryn
> >
> >
> > On Jul 29, 2013, at 5:00 PM, Arun C Murthy wrote:
> >
> >> Ok, I think we are close to rc1 now - the last of blockers should be
> committed later today… I'll try and spin RC1 tonight.
> >>
> >> thanks,
> >> Arun
> >>
> >> On Jul 21, 2013, at 12:43 AM, Devaraj Das  wrote:
> >>
> >>> I have just raised https://issues.apache.org/jira/browse/HDFS-5016 ..
> This
> >>> bug can easily be reproduced by some HBase tests. I'd like this to be
> >>> considered before we make a beta release. Have spoken about this with
> some
> >>> hdfs folks offline and I am told that it is being worked on.
> >>>
> >>> Thanks
> >>> Devaraj
> >>>
> >>>
> >>> On Wed, Jul 17, 2013 at 4:25 PM, Alejandro Abdelnur  >wrote:
> >>>
>  As I've mentioned in my previous email, if we get YARN-701 in, we
> should
>  also get in the fix for unmanaged AMs in an un-secure setup in
> 2.1.0-beta.
>  Else it is a regression of functionality that is already working.
> 
>  Because of that, to avoid further delaying the release, I'm suggesting
>  to mention in the release notes the API changes and behavior changes
>  that YARN-918 and YARN-701 will bring into the next beta or GA release.
> 
>  thx
> 
> 
>  On Wed, Jul 17, 2013 at 4:14 PM, Vinod Kumar Vavilapalli <
>  vino...@hortonworks.com> wrote:
> 
> >
> > On Jul 17, 2013, at 1:04 PM, Alejandro Abdelnur wrote:
> >
> >> * YARN-701
> >>
> >> It should be addressed before a GA release.
> >>
> >> Still, as it is this breaks unmanaged AMs and to me
> >> that would be a blocker for the beta.
> >>
> >> YARN-701 and the unmanaged AMs fix should be committed
> >> in tandem.
> >>
> >> * YARN-918
> >>
> >> It is a consequence of YARN-701 and depends on it.
> >
> >
> >
> > YARN-918 is an API change. And YARN-701 is a behaviour change. We
> need
> > both in 2.1.0.
> >
> >
> >
> >> * YARN-926
> >>
> >> It would be nice to have it addressed before GA release.
> >
> >
> > Either ways. I'd get it in sooner than later specifically when we are
> > trying to replace the old API with the new one.
> >
> > Thanks,
> > +Vino
> >
> >
> 
> >>
> >> --
> >> Arun C. Murthy
> >> Hortonworks Inc.
> >> http://hortonworks.com/
> >>
> >>
> >
>
> --
> Arun C. Murthy
> Hortonworks Inc.
> http://hortonworks.com/
>


[jira] [Created] (HADOOP-9870) Mixed configurations for JVM -Xmx in hadoop command

2013-08-13 Thread Wei Yan (JIRA)
Wei Yan created HADOOP-9870:
---

 Summary: Mixed configurations for JVM -Xmx in hadoop command
 Key: HADOOP-9870
 URL: https://issues.apache.org/jira/browse/HADOOP-9870
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Wei Yan


When we use the hadoop command to launch a class, there are two places that 
set the -Xmx configuration.

*1*. The first place is located in file 
{{hadoop-common-project/hadoop-common/src/main/bin/hadoop}}.
{code}
exec "$JAVA" $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS "$@"
{code}
Here $JAVA_HEAP_MAX is configured in hadoop-config.sh 
({{hadoop-common-project/hadoop-common/src/main/bin/hadoop-config.sh}}). The 
default value is "-Xmx1000m".

*2*. The second place is set with $HADOOP_OPTS in file 
{{hadoop-common-project/hadoop-common/src/main/bin/hadoop}}.
{code}
HADOOP_OPTS="$HADOOP_OPTS $HADOOP_CLIENT_OPTS"
{code}
Here $HADOOP_CLIENT_OPTS is set in hadoop-env.sh 
({{hadoop-common-project/hadoop-common/src/main/conf/hadoop-env.sh}})
{code}
export HADOOP_CLIENT_OPTS="-Xmx512m $HADOOP_CLIENT_OPTS"
{code}

Currently the final default java command looks like:
{code}java -Xmx1000m -Xmx512m CLASS_NAME ARGUMENTS{code}

And if users also specify -Xmx in $HADOOP_CLIENT_OPTS, there will be 
three -Xmx settings.

The hadoop setup tutorial only discusses hadoop-env.sh, and it appears that 
users are not supposed to change hadoop-config.sh.

We should make hadoop smart enough to choose the right -Xmx before launching 
the java command, instead of leaving the decision to the JVM.
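One way to read "choose the right one" (a hypothetical sketch, not existing 
Hadoop code): keep only the last -Xmx in the assembled JVM flag list, which 
is the occurrence the JVM would honor anyway under its last-one-wins rule:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class XmxDedup {

    // Drop all but the last -Xmx flag from the JVM flag list (the class
    // name and program arguments would be appended after these flags).
    static List<String> keepLastXmx(List<String> jvmFlags) {
        String lastXmx = null;
        List<String> out = new ArrayList<>();
        for (String flag : jvmFlags) {
            if (flag.startsWith("-Xmx")) {
                lastXmx = flag;        // remember, emit only the last one
            } else {
                out.add(flag);
            }
        }
        if (lastXmx != null) {
            out.add(lastXmx);
        }
        return out;
    }

    public static void main(String[] args) {
        // "-Xmx1000m -Xmx512m" collapses to a single -Xmx512m.
        System.out.println(keepLastXmx(Arrays.asList("-Xmx1000m", "-Xmx512m")));
    }
}
```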

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Alejandro Abdelnur
There is no indication that protoc 2.5.0 is breaking anything.

Hadoop-trunk builds have been failing well before the halfway point with:

---


[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test
(default-test) on project hadoop-yarn-client: ExecutionException;
nested exception is java.util.concurrent.ExecutionException:
java.lang.RuntimeException: The forked VM terminated without saying
properly goodbye. VM crash or System.exit called ? -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to
execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test
(default-test) on project hadoop-yarn-client: ExecutionException;
nested exception is java.util.concurrent.ExecutionException:
java.lang.RuntimeException: The forked VM terminated without saying
properly goodbye. VM crash or System.exit called ?

---


The Hadoop-trunk #480 build failed with a JVM abort in a testcase towards
the end of mapreduce tests.

Until then there were no failures at all.

I've increased the heap size and tried a second run; the failure came even earlier.

I've looked at Hadoop-trunk builds prior to HADOOP-9845 and they have been
failing the same way in all the retained builds.

We need to fix Hadoop-trunk builds independently of this.

Any objection to committing HADOOP-9845 to branch-2 and the 2.1.0-beta branches
to get all the other jenkins jobs working?

I'll wait till tomorrow morning before proceeding.

Thx




On Mon, Aug 12, 2013 at 8:35 PM, Alejandro Abdelnur wrote:

> Jenkins is running a full test run on trunk using protoc 2.5.0.
>
>   https://builds.apache.org/job/Hadoop-trunk/480
>
> And it seems to be going just fine.
>
> If everything looks OK, I'm planning to backport HADOOP-9845 to the
> 2.1.0-beta branch midday PST tomorrow. This will resolve all the build
> failures due to the protoc mismatch.
>
> Thanks.
>
> Alejandro
>
>
> On Mon, Aug 12, 2013 at 5:53 PM, Alejandro Abdelnur wrote:
>
>> shooting to get it in for 2.1.0.
>>
>> at the moment it is in trunk till the nightly finishes; then we'll decide
>>
>> in the meantime, you can have multiple versions installed in different dirs
>> and set the right one in the PATH
>>
>> thx
>>
>> Alejandro
>> (phone typing)
>>
>> On Aug 12, 2013, at 17:47, Konstantin Shvachko 
>> wrote:
>>
>> > Ok. After installing protobuf 2.5.0 I can compile trunk.
>> > But now I cannot compile Hadoop-2 branches. None of them.
>> > So if I switch between branches I need to reinstall protobuf?
>> >
>> > Is there a consensus about going towards the protobuf 2.5.0 upgrade in ALL
>> > versions?
>> > I did not get the definite impression there is.
>> > If not, it could be a pretty big disruption.
>> >
>> > Thanks,
>> > --Konst
>> >
>> >
>> >
>> > On Mon, Aug 12, 2013 at 3:19 PM, Alejandro Abdelnur > >wrote:
>> >
>> >> I've just committed HADOOP-9845 to trunk (only trunk at the moment).
>> >>
>> >> To build trunk now you need protoc 2.5.0 (the build will fail with a
>> >> warning if you don't have it).
>> >>
>> >> We'd propagate this to the 2.x branches once the precommit build is back
>> >> to normal and we see things are OK.
>> >>
>> >> Thanks.
>> >>
>> >>
>> >> On Mon, Aug 12, 2013 at 2:57 PM, Alejandro Abdelnur > >>> wrote:
>> >>
>> >>> About to commit HADOOP-9845 to trunk, in 5 mins. This will make trunk
>> use
>> >>> protoc 2.5.0.
>> >>>
>> >>> thx
>> >>>
>> >>>
>> >>> On Mon, Aug 12, 2013 at 11:47 AM, Giridharan Kesavan <
>> >>> gkesa...@hortonworks.com> wrote:
>> >>>
>>  I can take care of re-installing 2.4 and installing 2.5 in a
>> different
>>  location. This would fix 2.0 branch builds as well.
>>  Thoughts?
>> 
>>  -Giri
>> 
>> 
>>  On Mon, Aug 12, 2013 at 11:37 AM, Alejandro Abdelnur <
>> t...@cloudera.com
>> > wrote:
>> 
>> > Giri,
>> >
>> > first of all, thanks for installing protoc 2.5.0.
>> >
>> > I didn't know we were installing them as the only version and not
>> > driven by env/path settings.
>> >
>> > Now we have a bit of a problem: precommit builds are broken because of
>> > a mismatch between protoc (2.5.0) and the protobuf JAR (2.4.1).
>> >
>> > We have two options:
>> >
>> > 1* commit HADOOP-9845, which will bring protobuf to 2.5.0, and iron out
>> > any follow-up issues.
>> > 2* reinstall protoc 2.4.1 on the jenkins machines and have 2.4.1 and
>> > 2.5.0 coexisting
>> >
>> > My take would be to commit HADOOP-9845 in trunk, iron out any issues,
>> > and then merge it to the other branches.
>> >
>> > We need to sort this out quickly as precommits are not working.
>> >
>> > I'll wait till 3PM today for objections to option #1; if none, I'll
>> > commit it to trunk.
>> >
>> > Thanks.
>> >
>> > Alejandro
>> >
>> >
>> >
>> > On Mon, Aug 12, 2013 at 11:30 AM, Giridharan Kesavan <
>> > gkesa...@hortonworks.com> wrote:
>> >
>> >> Like I said protoc is upgraded 

[jira] [Created] (HADOOP-9871) Fix intermittent findbug warnings in DefaultMetricsSystem

2013-08-13 Thread Luke Lu (JIRA)
Luke Lu created HADOOP-9871:
---

 Summary: Fix intermittent findbug warnings in DefaultMetricsSystem
 Key: HADOOP-9871
 URL: https://issues.apache.org/jira/browse/HADOOP-9871
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Luke Lu
Assignee: Junping Du
Priority: Minor


Findbugs sometimes (not always) reports warnings for DefaultMetricsSystem 
because some of its fields are not transient in a serializable class 
(DefaultMetricsSystem is an enum, and enums are serializable).
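An illustrative reduction of the warning (not the actual DefaultMetricsSystem 
source; the enum and field are invented): enum constants are serializable, so 
Findbugs can flag non-transient fields of non-serializable types. Since enum 
state is never actually serialized (enums serialize by name), marking such 
fields transient is a harmless fix:

```java
public enum MiniMetricsSystem {
    INSTANCE;

    // Without `transient`, Findbugs may report a non-transient,
    // non-serializable field in a serializable class. Enum fields are
    // never serialized anyway, so `transient` is safe here.
    private transient StringBuilder registry = new StringBuilder();

    public StringBuilder registry() { return registry; }

    public static void main(String[] args) {
        System.out.println(MiniMetricsSystem.INSTANCE.registry() != null);
    }
}
```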

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Steve Loughran
On 13 August 2013 13:09, Alejandro Abdelnur  wrote:

> There is no indication that protoc 2.5.0 is breaking anything.
>


clearly then this is not a stack trace:

INFO]

[INFO] Building Apache Hadoop Common 3.0.0-SNAPSHOT
[INFO]

Downloading:
https://repository.apache.org/content/repositories/snapshots/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.pom
Downloading:
http://repository.jboss.org/nexus/content/groups/public/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.pom
Downloading:
http://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.pom
Downloaded:
http://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.pom(9
KB at 185.9 KB/sec)
Downloading:
https://repository.apache.org/content/repositories/snapshots/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
Downloading:
http://repository.jboss.org/nexus/content/groups/public/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
Downloading:
http://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
Downloaded:
http://repo.maven.apache.org/maven2/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar(521
KB at 7039.9 KB/sec)
[INFO]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-common
---
[INFO] Deleting
/Users/stevel/Projects/hadoop-trunk/hadoop-common-project/hadoop-common/target
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-common ---
[INFO] Executing tasks

main:
[mkdir] Created dir:
/Users/stevel/Projects/hadoop-trunk/hadoop-common-project/hadoop-common/target/test-dir
[mkdir] Created dir:
/Users/stevel/Projects/hadoop-trunk/hadoop-common-project/hadoop-common/target/test/data
[INFO] Executed tasks
[INFO]
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @
hadoop-common ---
[WARNING] [protoc, --version] failed with error code 1
[ERROR] protoc, could not get version
[INFO]

[INFO] Reactor Summary:
[INFO]


Assuming this is just a versioning issue, can you update the documentation
in the wiki &
http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/README
to be consistent with the current protobuf requirements? I do want to
follow the instructions not just because I am lazy, but because I want to
manually test the installation process itself.

Once that's done I will try to follow these instructions to get protobuf
2.5 installed on my homebrew-managed mac.

-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.


Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Alejandro Abdelnur
Steve, this is a version issue; if you get protoc 2.5.0 in your PATH, things
will work.

Apologies for the hiccups until we get all this sorted out; we had some
miscommunication on how to install protoc on the jenkins boxes, and instead
of having 2.4.1 and 2.5.0 side by side we got only 2.5.0.

By tomorrow we should have things mostly sorted out.

Thanks


On Tue, Aug 13, 2013 at 3:29 PM, Steve Loughran wrote:

> On 13 August 2013 13:09, Alejandro Abdelnur  wrote:
>
> > There is no indication that protoc 2.5.0 is breaking anything.
> >
>
>
> clearly then this is not a stack trace:
>
> [...quoted build log trimmed...]
>
>
> Assuming this is just a versioning issue, can you update the documentation
> in the wiki &
>
> http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/README to
> be consistent with the current protobuf requirements. I do want to
> follow the instructions not just because I am lazy, but because I want to
> manually test the installation process itself
>
> Once that's done I will try to follow these instructions to get protobuf
> 2.5 installed on my homebrew-managed mac.
>



-- 
Alejandro


Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Luke Lu
I've verified that it's only a version issue (tested on an Ubuntu 12.04 VM):
as long as you have protoc 2.5.0, it works. BTW, the version check is a
little "too" strict. I had protobuf 2.5.1 (trunk) installed for 2.5 tests,
and the exact-match check broke my build.
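A looser check could compare only major.minor (a hypothetical sketch; the 
plugin's actual check is the exact comparison described above), which would 
accept a 2.5.1 protoc while still rejecting 2.4.x:

```java
public class ProtocVersionCheck {

    // Accept any protoc whose major.minor matches the required version,
    // rather than requiring an exact version string.
    static boolean compatible(String required, String actual) {
        String[] r = required.split("\\.");
        String[] a = actual.split("\\.");
        return r.length >= 2 && a.length >= 2
                && r[0].equals(a[0]) && r[1].equals(a[1]);
    }

    public static void main(String[] args) {
        System.out.println(compatible("2.5.0", "2.5.1"));  // 2.5.x is fine
        System.out.println(compatible("2.5.0", "2.4.1"));  // 2.4.x is not
    }
}
```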


On Tue, Aug 13, 2013 at 3:29 PM, Steve Loughran wrote:

> On 13 August 2013 13:09, Alejandro Abdelnur  wrote:
>
> > There is no indication that protoc 2.5.0 is breaking anything.
> >
>
>
> clearly then this is not a stack trace:
>
> [...quoted build log trimmed...]
>
>
> Assuming this is just a versioning issue, can you update the documentation
> in the wiki &
>
> http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/README to
> be consistent with the current protobuf requirements. I do want to
> follow the instructions not just because I am lazy, but because I want to
> manually test the installation process itself
>
> Once that's done I will try to follow these instructions to get protobuf
> 2.5 installed on my homebrew-managed mac.
>
> --
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity to
> which it is addressed and may contain information that is confidential,
> privileged and exempt from disclosure under applicable law. If the reader
> of this message is not the intended recipient, you are hereby notified that
> any printing, copying, dissemination, distribution, disclosure or
> forwarding of this communication is strictly prohibited. If you have
> received this communication in error, please contact the sender immediately
> and delete it from your system. Thank You.
>


Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Steve Loughran
On 13 August 2013 16:20, Alejandro Abdelnur  wrote:

> Steve, this is a version issue, if you get protoc 2.5.0 in your PATH things
> will  work.
>

I assume so, but as the YARN docs still talk about 0.24, they need to be
updated too.


>
> Apologies for the hiccups until we get all this sorted out; we had some
> miscommunication on how to install protoc on the jenkins boxes, and instead
> of having 2.4.1 and 2.5.0 side by side we got only 2.5.0.
>
> By tomorrow we should have things mostly sorted out.
>
> Thanks
>
>
> On Tue, Aug 13, 2013 at 3:29 PM, Steve Loughran  >wrote:
>
> > On 13 August 2013 13:09, Alejandro Abdelnur  wrote:
> >
> > > There is no indication that protoc 2.5.0 is breaking anything.
> > >
> >
> >
> > clearly then this is not a stack trace:
> >
> > [...quoted build log trimmed...]
> >
> >
> > Assuming this is just a versioning issue, can you update the documentation
> > in the wiki &
> > http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-yarn-project/hadoop-yarn/README
> > to be consistent with the current protobuf requirements. I do want to
> > follow the instructions not just because I am lazy, but because I want to
> > manually test the installation process itself
> >
> > Once that's done I will try to follow these instructions to get protobuf
> > 2.5 installed on my homebrew-managed mac.
> >
>
>
>
> --
> Alejandro
>
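The `[protoc, --version] failed with error code 1` line in the quoted log is the hadoop-maven-plugins protoc goal probing the compiler before generating sources. A minimal sketch of that kind of probe, in Python rather than the plugin's actual Java, with hypothetical function names:

```python
import re
import subprocess

def parse_version(output):
    """Extract a dotted version such as '2.5.0' from output like 'libprotoc 2.5.0'."""
    m = re.search(r"(\d+(?:\.\d+)+)", output)
    return m.group(1) if m else None

def protoc_version(protoc="protoc"):
    """Run `protoc --version` and return the version string, or None if the
    binary is missing or exits non-zero (the "could not get version" case)."""
    try:
        out = subprocess.run([protoc, "--version"], capture_output=True,
                             text=True, check=True).stdout
    except (OSError, subprocess.CalledProcessError):
        return None
    return parse_version(out)
```

With only 2.4.1 on the PATH the probe returns a version the build rejects; with no working protoc at all it returns None, which is the failure mode shown in the log above.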



Re: [UPDATE] Upgrade to protobuf 2.5.0 for the 2.1.0 release, HADOOP-9845

2013-08-13 Thread Alejandro Abdelnur
Yep, will take care of that. Also, Colin suggested offline that we should
improve the protoc plugin to pick up the path from an environment variable, if
present, to simplify things for folks who need to build versions of Hadoop
against different versions of protoc. I'll work on this too.

thx
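The env-var lookup being proposed could be sketched as below; the variable name `HADOOP_PROTOC_PATH` is illustrative here, not a confirmed part of the plugin:

```python
import os
import shutil

def resolve_protoc(env_var="HADOOP_PROTOC_PATH"):
    """Prefer an explicitly configured protoc binary; otherwise fall back to
    whichever `protoc` is first on the PATH."""
    configured = os.environ.get(env_var)
    if configured:
        return configured
    return shutil.which("protoc") or "protoc"
```

This lets a developer keep 2.4.1 and 2.5.0 installed side by side and select one per build without editing the PATH.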


Re: confirm unsubscribe from common-dev@hadoop.apache.org

2013-08-13 Thread syates

Quoting common-dev-h...@hadoop.apache.org:


Hi! This is the ezmlm program. I'm managing the
common-dev@hadoop.apache.org mailing list.

I'm working for my owner, who can be reached
at common-dev-ow...@hadoop.apache.org.

To confirm that you would like

   sya...@stevendyates.com

removed from the common-dev mailing list, please send a short reply
to this address:


common-dev-uc.1376436889.aopejedillbfiiahoneg-syates=stevendyates@hadoop.apache.org


Usually, this happens when you just hit the "reply" button.
If this does not work, simply copy the address and paste it into
the "To:" field of a new message.

or click here:

mailto:common-dev-uc.1376436889.aopejedillbfiiahoneg-syates=stevendyates@hadoop.apache.org

I haven't checked whether your address is currently on the mailing list.
To see what address you used to subscribe, look at the messages you are
receiving from the mailing list. Each message has your address hidden
inside its return path; for example, m...@xdd.ff.com receives messages
with return path:  
-mary=xdd.ff@hadoop.apache.org.


Some mail programs are broken and cannot handle long addresses. If you
cannot reply to this request, instead send a message to
 and put the entire address  
listed above

into the "Subject:" line.


--- Administrative commands for the common-dev list ---

I can handle administrative requests automatically. Please
do not send them to the list address! Instead, send
your message to the correct command address:

To subscribe to the list, send a message to:
   

To remove your address from the list, send a message to:
   

Send mail to the following for info and FAQ for this list:
   
   

Similar addresses exist for the digest list:
   
   

To get messages 123 through 145 (a maximum of 100 per request), mail:
   

To get an index with subject and author for messages 123-456 , mail:
   

They are always returned as sets of 100, max 2000 per request,
so you'll actually get 100-499.

To receive all messages with the same subject as message 12345,
send a short message to:
   

The messages should contain one line or word of text to avoid being
treated as sp@m, but I will ignore their content.
Only the ADDRESS you send to is important.

You can start a subscription for an alternate address,
for example "john@host.domain", just add a hyphen and your
address (with '=' instead of '@') after the command word:


To stop subscription for this address, mail:


In both cases, I'll send a confirmation message to that address. When
you receive it, simply reply to it to complete your subscription.

If despite following these instructions, you do not get the
desired results, please contact my owner at
common-dev-ow...@hadoop.apache.org. Please be patient, my owner is a
lot slower than I am ;-)


[jira] [Resolved] (HADOOP-9346) Upgrading to protoc 2.5.0 fails the build

2013-08-13 Thread Harsh J (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9346?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Harsh J resolved HADOOP-9346.
-

Resolution: Duplicate

Thanks for pinging Ravi. I'd discussed with Alejandro that this could be 
closed. Looks like we added a dupe link but failed to close. Closing now.

> Upgrading to protoc 2.5.0 fails the build
> -
>
> Key: HADOOP-9346
> URL: https://issues.apache.org/jira/browse/HADOOP-9346
> Project: Hadoop Common
>  Issue Type: Task
>  Components: build
>Affects Versions: 3.0.0
>Reporter: Harsh J
>Assignee: Harsh J
>Priority: Minor
>  Labels: protobuf
> Attachments: HADOOP-9346.patch
>
>
> Reported over the impala lists, one of the errors received is:
> {code}
> src/hadoop-common-project/hadoop-common/target/generated-sources/java/org/apache/hadoop/ha/proto/ZKFCProtocolProtos.java:[104,37]
>  cannot find symbol
> symbol: class Parser
> location: package com.google.protobuf
> {code}
> Worth looking into as we'll eventually someday bump our protobuf deps.
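For context, the `Parser` symbol in this error exists only in protobuf-java 2.5.0 and later, so sources generated by protoc 2.5.0 cannot compile against a 2.4.x jar. A toy illustration of that compatibility rule (versions hard-coded for illustration; not how the build actually checks):

```python
def as_tuple(version):
    """'2.5.0' -> (2, 5, 0) for numeric comparison."""
    return tuple(int(part) for part in version.split("."))

def generated_code_compiles(protoc_version, library_version):
    """protoc >= 2.5.0 emits references to com.google.protobuf.Parser,
    which only protobuf-java >= 2.5.0 provides; older output has no such need."""
    if as_tuple(protoc_version) >= (2, 5, 0):
        return as_tuple(library_version) >= (2, 5, 0)
    return True
```

This is why the protoc binary and the protobuf-java dependency must be upgraded together, as HADOOP-9845 did.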

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


Re: [VOTE] Release Apache Hadoop 2.0.6-alpha

2013-08-13 Thread Konstantin Shvachko
+1
Verified checksums, signatures.
Checked release notes.
Built the sources and ran tests.
Started a small cluster.
Tried hadoop commands, ran a few jobs.

Thanks,
--Konst
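"Verified checksums" above is the standard first step when checking a release candidate. A minimal sketch of one such check; the file path and expected digest are whatever the RC publishes, and SHA-256 is chosen here purely for illustration:

```python
import hashlib

def sha256_matches(path, expected_hex):
    """Stream the file and compare its SHA-256 digest to the published value."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex.lower()
```

Signatures would additionally be verified with `gpg --verify` against the release manager's public key.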



On Sat, Aug 10, 2013 at 5:46 PM, Konstantin Boudnik  wrote:

> All,
>
> I have created a release candidate (rc0) for hadoop-2.0.6-alpha that I
> would
> like to release.
>
> This is a stabilization release that includes fixes for a couple of issues
> as outlined on the security list.
>
> The RC is available at:
> http://people.apache.org/~cos/hadoop-2.0.6-alpha-rc0/
> The RC tag in svn is here:
> http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.0.6-alpha-rc0
>
> The maven artifacts are available via repository.apache.org.
>
> Please try the release bits and vote; the vote will run for the usual 7
> days.
>
> Thanks for your voting
>   Cos
>
>