[jira] [Created] (HADOOP-10986) hadoop tarball is twice as big as prev. version and 6 times as big unpacked

2014-08-20 Thread André Kelpe (JIRA)
André Kelpe created HADOOP-10986:


 Summary: hadoop tarball is twice as big as prev. version and 6 
times as big unpacked
 Key: HADOOP-10986
 URL: https://issues.apache.org/jira/browse/HADOOP-10986
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.5.0
Reporter: André Kelpe


I noticed that the binary tarball for 2.5.0 is almost 300MB, while 2.4.1 is 
only 132MB. Unpacking the latest tarball gives me 1.8 GB of stuff, with the 
majority in the "share" directory.
 
{code}
$ cd hadoop-2.4.1
$ du -sh *
364K    bin
356K    etc
100K    include
2,3M    lib
128K    libexec
24K     LICENSE.txt
12K     NOTICE.txt
12K     README.txt
336K    sbin
280M    share
{code}

{code}
$ cd hadoop-2.5.0
$ du -sh *
512K    bin
332K    etc
100K    include
4,6M    lib
128K    libexec
336K    sbin
1,8G    share
{code}
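To see which subtree dominates an unpacked tree like this, the directory sizes can be ranked. A self-contained sketch using hypothetical demo paths (not the actual tarball contents):

```shell
# Create a small hypothetical layout, then rank entries by size in KB,
# largest last -- the same approach locates what dominates share/ for real
mkdir -p demo/share/doc demo/lib
head -c 40960 /dev/zero > demo/share/doc/api.bin   # ~40 KB file
head -c 4096  /dev/zero > demo/lib/native.bin      # ~4 KB file
du -sk demo/* | sort -n
rm -rf demo
```

Running `du -sk share/* | sort -n` inside the real 2.5.0 tree would show which share/ subdirectory accounts for most of the 1,8G.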

I also saw some warnings from tar while unpacking:

{code}
$ tar xf hadoop-2.5.0.tar.gz 
tar: Ignoring unknown extended header keyword `SCHILY.dev'
tar: Ignoring unknown extended header keyword `SCHILY.ino'
tar: Ignoring unknown extended header keyword `SCHILY.nlink'
[...the same three warnings repeat for further archive members...]
{code}
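The SCHILY.* keywords are pax extended headers written by star/bsdtar-style archivers; GNU tar does not understand them and skips them with these warnings. A minimal sketch (file names hypothetical, `--format=gnu` is a GNU tar option) showing that an archive created explicitly in GNU format extracts without such noise:

```shell
# Create and extract an archive explicitly in GNU tar format; GNU-format
# archives carry no SCHILY.* pax extended headers, so extraction is silent
printf 'hello\n' > sample.txt
tar --format=gnu -czf sample.tar.gz sample.txt
rm sample.txt
tar -xzf sample.tar.gz     # no "Ignoring unknown extended header" warnings
cat sample.txt
rm -f sample.txt sample.tar.gz
```

Building the release tarball with GNU (or plain ustar) format would presumably make the warnings go away for downstream users.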



--
This message was sent by Atlassian JIRA
(v6.2#6252)


Build failed in Jenkins: Hadoop-Common-0.23-Build #1047

2014-08-20 Thread Apache Jenkins Server
See 

--
[...truncated 8263 lines...]
Running org.apache.hadoop.fs.TestGlobPattern
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.09 sec
Running org.apache.hadoop.fs.TestFcLocalFsUtil
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.58 sec
Running org.apache.hadoop.fs.TestLocalFileSystemPermission
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.53 sec
Running org.apache.hadoop.fs.TestDFVariations
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.131 sec
Running org.apache.hadoop.fs.permission.TestFsPermission
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.472 sec
Running org.apache.hadoop.fs.TestTrash
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.111 sec
Running org.apache.hadoop.fs.TestFileStatus
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.195 sec
Running org.apache.hadoop.fs.TestChecksumFileSystem
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.837 sec
Running org.apache.hadoop.fs.shell.TestPathExceptions
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.091 sec
Running org.apache.hadoop.fs.shell.TestCommandFactory
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.177 sec
Running org.apache.hadoop.fs.shell.TestPathData
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.772 sec
Running org.apache.hadoop.fs.shell.TestCopy
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.73 sec
Running org.apache.hadoop.fs.TestLocalFileSystem
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.96 sec
Running org.apache.hadoop.fs.TestFileContextDeleteOnExit
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.531 sec
Running org.apache.hadoop.fs.TestPath
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.853 sec
Running org.apache.hadoop.fs.TestLocalDirAllocator
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.994 sec
Running org.apache.hadoop.fs.TestFileSystemTokens
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.499 sec
Running org.apache.hadoop.fs.TestFileSystemCaching
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.89 sec
Running org.apache.hadoop.fs.TestFsShellCopy
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.354 sec
Running org.apache.hadoop.fs.TestListFiles
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.544 sec
Running org.apache.hadoop.fs.TestHardLink
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.256 sec
Running org.apache.hadoop.fs.TestLocalFSFileContextMainOperations
Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.259 sec
Running org.apache.hadoop.fs.TestLocal_S3FileContextURI
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.351 sec
Running org.apache.hadoop.fs.TestFsShellReturnCode
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.618 sec
Running org.apache.hadoop.fs.TestS3_LocalFileContextURI
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.344 sec
Running org.apache.hadoop.fs.TestFileUtil
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.684 sec
Running org.apache.hadoop.fs.s3native.TestInMemoryNativeS3FileSystemContract
Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.088 sec
Running org.apache.hadoop.fs.TestLocalFSFileContextSymlink
Tests run: 61, Failures: 0, Errors: 0, Skipped: 3, Time elapsed: 2.688 sec
Running org.apache.hadoop.fs.TestFsShell
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.282 sec
Running org.apache.hadoop.fs.TestBlockLocation
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.055 sec
Running org.apache.hadoop.fs.TestTruncatedInputBug
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.465 sec
Running org.apache.hadoop.fs.TestCommandFormat
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.106 sec
Running org.apache.hadoop.fs.TestLocalFSFileContextCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.785 sec
Running org.apache.hadoop.fs.TestFSMainOperationsLocalFileSystem
Tests run: 49, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.672 sec
Running org.apache.hadoop.fs.s3.TestINode
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.057 sec
Running org.apache.hadoop.fs.s3.TestS3Credentials
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.128 sec
Running org.apache.hadoop.fs.s3.TestS3FileSystem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.154 sec
Running org.apache.hadoop.fs.s3.TestInMemoryS3FileSystemContract
Tests run: 29, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.893 sec
Running org.apache.hadoop.fs.TestHarFileSystem
Tests run: 3, Failures: 0, Errors: 0, Skipped: 

[jira] [Created] (HADOOP-10988) Community build Apache Hadoop 2.5 fails on Ubuntu 14.04

2014-08-20 Thread Amir Sanjar (JIRA)
Amir Sanjar created HADOOP-10988:


 Summary: Community build Apache Hadoop 2.5 fails on Ubuntu 14.04
 Key: HADOOP-10988
 URL: https://issues.apache.org/jira/browse/HADOOP-10988
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 2.4.1, 2.5.0, 2.2.0
 Environment: x86_64, Ubuntu 14.04, OpenJDK 7
hadoop 2.5 tar file from:  
http://apache.mirrors.pair.com/hadoop/common/hadoop-2.5.0/
Reporter: Amir Sanjar
 Fix For: 2.5.0


Executing any MapReduce application (e.g. Pi) using the community build of 
Hadoop from http://apache.mirrors.pair.com/hadoop/common/hadoop-2.5.0/ fails 
with the error message below. Rebuilding from source on an Ubuntu system with 
the flags "clean -Pnative" fixes the problem.

OpenJDK 64-Bit Server VM warning: You have loaded library 
/home/ubuntu/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have 
disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', 
or link it with '-z noexecstack'.
14/08/19 21:24:54 WARN util.NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
java.net.ConnectException: Call From node1.maas/127.0.1.1 to localhost:9000 
failed on connection exception: java.net.ConnectException: Connection refused; 
For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
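The "Connection refused" from localhost:9000 means no process is listening on the NameNode RPC port, typically because the NameNode was never started or fs.defaultFS points at the wrong host. A quick check, assuming `nc` is installed (port as in the report):

```shell
# Probe the NameNode RPC port; -z only tests whether a TCP connection opens.
# "down" usually means the NameNode is not running, matching the
# ConnectException above.
if command -v nc >/dev/null 2>&1; then
  nc -z localhost 9000 2>/dev/null && echo "port 9000: up" || echo "port 9000: down"
fi
```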



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Resolved] (HADOOP-10150) Hadoop cryptographic file system

2014-08-20 Thread Andrew Wang (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10150?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Wang resolved HADOOP-10150.
--

Resolution: Fixed

I've committed this to trunk as part of merging fs-encryption. Thanks for all 
the work from all contributors here, especially [~hitliuyi]!

> Hadoop cryptographic file system
> 
>
> Key: HADOOP-10150
> URL: https://issues.apache.org/jira/browse/HADOOP-10150
> Project: Hadoop Common
>  Issue Type: New Feature
>  Components: security
>Affects Versions: 3.0.0
>Reporter: Yi Liu
>Assignee: Yi Liu
>  Labels: rhino
> Fix For: 3.0.0
>
> Attachments: CryptographicFileSystem.patch, HADOOP cryptographic file 
> system-V2.docx, HADOOP cryptographic file system.pdf, 
> HDFSDataAtRestEncryptionAlternatives.pdf, 
> HDFSDataatRestEncryptionAttackVectors.pdf, 
> HDFSDataatRestEncryptionProposal.pdf, cfs.patch, extended information based 
> on INode feature.patch
>
>
> There is an increasing need for securing data when Hadoop customers use 
> various upper layer applications, such as Map-Reduce, Hive, Pig, HBase and so 
> on.
> HADOOP CFS (HADOOP Cryptographic File System) is used to secure data, based 
> on HADOOP “FilterFileSystem” decorating DFS or other file systems, and 
> transparent to upper layer applications. It’s configurable, scalable and fast.
> High level requirements:
> 1. Transparent to, and requiring no modification of, upper layer 
> applications.
> 2. “Seek” and “PositionedReadable” are supported for the CFS input stream 
> if the wrapped file system supports them.
> 3. Very high performance for encryption and decryption; they will not 
> become a bottleneck.
> 4. Can decorate HDFS and all other file systems in Hadoop without modifying 
> the existing structure of the file system, such as the namenode and datanode 
> structure when the wrapped file system is HDFS.
> 5. Admins can configure encryption policies, such as which directories will 
> be encrypted.
> 6. A robust key management framework.
> 7. Support for Pread and append operations if the wrapped file system 
> supports them.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Resolved] (HADOOP-10697) KMS DelegationToken support

2014-08-20 Thread Alejandro Abdelnur (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10697?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alejandro Abdelnur resolved HADOOP-10697.
-

Resolution: Duplicate

dup of HADOOP-10770

> KMS DelegationToken support
> ---
>
> Key: HADOOP-10697
> URL: https://issues.apache.org/jira/browse/HADOOP-10697
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: security
>Affects Versions: 3.0.0
>Reporter: Alejandro Abdelnur
>Assignee: Alejandro Abdelnur
>
> Add DelegationToken support to KMS as per discussion in HDFS-6134.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Created] (HADOOP-10989) Conditionally check the return value of getgrouplist on FreeBSD.

2014-08-20 Thread Chris Nauroth (JIRA)
Chris Nauroth created HADOOP-10989:
--

 Summary: Conditionally check the return value of getgrouplist on 
FreeBSD.
 Key: HADOOP-10989
 URL: https://issues.apache.org/jira/browse/HADOOP-10989
 Project: Hadoop Common
  Issue Type: Bug
  Components: native
Affects Versions: 3.0.0, 2.6.0
Reporter: Chris Nauroth
Assignee: Chris Nauroth


HADOOP-10781 corrected the handling of the return value from {{getgrouplist}} 
to work on FreeBSD.  However, it also regressed fixes that had been put in 
place to work around issues with {{getgrouplist}} on Linux.  This issue will 
restore that behavior, but still retain compatibility with FreeBSD.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Created] (HADOOP-10990) Add missed NFSv3 request and response classes

2014-08-20 Thread Brandon Li (JIRA)
Brandon Li created HADOOP-10990:
---

 Summary: Add missed NFSv3 request and response classes
 Key: HADOOP-10990
 URL: https://issues.apache.org/jira/browse/HADOOP-10990
 Project: Hadoop Common
  Issue Type: Improvement
  Components: nfs
Reporter: Brandon Li
Assignee: Brandon Li


Two RPC calls were missing from the original NFS implementation: LINK and MKNOD. 
This JIRA tracks the effort of adding the missing RPC calls.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


Re: Apache Hadoop 2.5.0 published tarballs are missing some txt files

2014-08-20 Thread Karthik Kambatla
Thanks for the suggestions, gents.

Given we are doing 2.5.1, I would like to get in HADOOP-10956 that fixes
the script. Is that alright with everyone?


On Tue, Aug 19, 2014 at 11:13 PM, Alejandro Abdelnur 
wrote:

> agree with Steve, lets do a 2.5.1 just with those txt files.
>
>
> On Tue, Aug 19, 2014 at 2:29 PM, Steve Loughran 
> wrote:
>
> > On 19 August 2014 18:35, Arun Murthy  wrote:
> >
> > > I suggest we do a 2.5.1 (with potentially other bug fixes) rather than
> > fix
> > > existing tarballs.
> > >
> >
> >
> > do we already have enough last-minute fixes to trigger a rerelease? With
> > all the testing that entails?
> >
> > A simple 2.5.1 rebuild with the same hadoop source tree would be a lot
> > easier to push out
> >
> >
> > >
> > > thanks,
> > > Arun
> > >
> > >
> > > On Mon, Aug 18, 2014 at 12:42 PM, Karthik Kambatla  >
> > > wrote:
> > >
> > > > Hi devs
> > > >
> > > > Tsuyoshi just brought it to my notice that the published tarballs
> don't
> > > > have LICENSE, NOTICE and README at the top-level. Instead, they are
> > only
> > > > under common, hdfs, etc.
> > > >
> > > > Now that we have already announced the release and the
> > jars/functionality
> > > > doesn't change, I propose we just update the tarballs with ones that
> > > > includes those files? I just untar-ed the published tarballs and
> copied
> > > > LICENSE, NOTICE and README from under common to the top directory and
> > > > tar-ed them back again.
> > > >
> > > > The updated tarballs are at:
> > > http://people.apache.org/~kasha/hadoop-2.5.0/
> > > > . Can someone please verify the signatures?
> > > >
> > > > If you would prefer an alternate action, please suggest.
> > > >
> > > > Thanks
> > > > Karthik
> > > >
> > > > PS: HADOOP-10956 should include the fix for these files also.
> > > >
> > >
> > >
> > >
> > > --
> > >
> > > --
> > > Arun C. Murthy
> > > Hortonworks Inc.
> > > http://hortonworks.com/
> > >
> > > --
> > > CONFIDENTIALITY NOTICE
> > > NOTICE: This message is intended for the use of the individual or
> entity
> > to
> > > which it is addressed and may contain information that is confidential,
> > > privileged and exempt from disclosure under applicable law. If the
> reader
> > > of this message is not the intended recipient, you are hereby notified
> > that
> > > any printing, copying, dissemination, distribution, disclosure or
> > > forwarding of this communication is strictly prohibited. If you have
> > > received this communication in error, please contact the sender
> > immediately
> > > and delete it from your system. Thank You.
> > >
> >
> >
>
>
>
> --
> Alejandro
>
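The repack described above (copy LICENSE, NOTICE and README from under common to the top level, then re-tar) can be sketched as a self-contained demo; all paths here are hypothetical stand-ins, not the real release layout:

```shell
# Build a fake tarball layout whose LICENSE.txt lives only under a common/
# subtree, then copy it to the top level and re-tar, mirroring the repack
mkdir -p pkg/share/doc/hadoop/common
echo "Apache License 2.0" > pkg/share/doc/hadoop/common/LICENSE.txt
cp pkg/share/doc/hadoop/common/LICENSE.txt pkg/LICENSE.txt
tar -czf pkg.tar.gz pkg
tar -tzf pkg.tar.gz | grep '^pkg/LICENSE.txt$'   # confirms top-level copy
rm -rf pkg pkg.tar.gz
```

Note that re-tarring changes the archive checksum, which is why the signatures need to be regenerated and verified afterwards.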


Re: Apache Hadoop 2.5.0 published tarballs are missing some txt files

2014-08-20 Thread Arun Murthy
+1.

Thanks Karthik.

Arun


On Wed, Aug 20, 2014 at 4:25 PM, Karthik Kambatla 
wrote:

> Thanks for the suggestions, gents.
>
> Given we are doing 2.5.1, I would like to get in HADOOP-10956 that fixes
> the script. Is that alright with everyone?
>
>
> On Tue, Aug 19, 2014 at 11:13 PM, Alejandro Abdelnur 
> wrote:
>
> > agree with Steve, lets do a 2.5.1 just with those txt files.
> >
> >
> > On Tue, Aug 19, 2014 at 2:29 PM, Steve Loughran 
> > wrote:
> >
> > > On 19 August 2014 18:35, Arun Murthy  wrote:
> > >
> > > > I suggest we do a 2.5.1 (with potentially other bug fixes) rather
> than
> > > fix
> > > > existing tarballs.
> > > >
> > >
> > >
> > > do we already have enough last-minute fixes to trigger a rerelease?
> With
> > > all the testing that entails?
> > >
> > > A simple 2.5.1 rebuild with the same hadoop source tree would be a lot
> > > easier to push out
> > >
> > >
> > > >
> > > > thanks,
> > > > Arun
> > > >
> > > >
> > > > On Mon, Aug 18, 2014 at 12:42 PM, Karthik Kambatla <
> ka...@cloudera.com
> > >
> > > > wrote:
> > > >
> > > > > Hi devs
> > > > >
> > > > > Tsuyoshi just brought it to my notice that the published tarballs
> > don't
> > > > > have LICENSE, NOTICE and README at the top-level. Instead, they are
> > > only
> > > > > under common, hdfs, etc.
> > > > >
> > > > > Now that we have already announced the release and the
> > > jars/functionality
> > > > > doesn't change, I propose we just update the tarballs with ones
> that
> > > > > includes those files? I just untar-ed the published tarballs and
> > copied
> > > > > LICENSE, NOTICE and README from under common to the top directory
> and
> > > > > tar-ed them back again.
> > > > >
> > > > > The updated tarballs are at:
> > > > http://people.apache.org/~kasha/hadoop-2.5.0/
> > > > > . Can someone please verify the signatures?
> > > > >
> > > > > If you would prefer an alternate action, please suggest.
> > > > >
> > > > > Thanks
> > > > > Karthik
> > > > >
> > > > > PS: HADOOP-10956 should include the fix for these files also.
> > > > >
> > > >
> > > >
> > > >
> > > > --
> > > >
> > > > --
> > > > Arun C. Murthy
> > > > Hortonworks Inc.
> > > > http://hortonworks.com/
> > > >
> > > >
> > >
> > >
> >
> >
> >
> > --
> > Alejandro
> >
>



-- 

--
Arun C. Murthy
Hortonworks Inc.
http://hortonworks.com/



Re: Apache Hadoop 2.5.0 published tarballs are missing some txt files

2014-08-20 Thread Allen Wittenauer


2.5.1 should fix HADOOP-10986 as well.


On Aug 20, 2014, at 4:29 PM, Arun Murthy  wrote:

> +1.
> 
> Thanks Karthik.
> 
> Arun
> 
> 
> On Wed, Aug 20, 2014 at 4:25 PM, Karthik Kambatla 
> wrote:
> 
>> Thanks for the suggestions, gents.
>> 
>> Given we are doing 2.5.1, I would like to get in HADOOP-10956 that fixes
>> the script. Is that alright with everyone?
>> 
>> 
>> On Tue, Aug 19, 2014 at 11:13 PM, Alejandro Abdelnur 
>> wrote:
>> 
>>> agree with Steve, lets do a 2.5.1 just with those txt files.
>>> 
>>> 
>>> On Tue, Aug 19, 2014 at 2:29 PM, Steve Loughran 
>>> wrote:
>>> 
 On 19 August 2014 18:35, Arun Murthy  wrote:
 
> I suggest we do a 2.5.1 (with potentially other bug fixes) rather
>> than
 fix
> existing tarballs.
> 
 
 
 do we already have enough last-minute fixes to trigger a rerelease?
>> With
 all the testing that entails?
 
 A simple 2.5.1 rebuild with the same hadoop source tree would be a lot
 easier to push out
 
 
> 
> thanks,
> Arun
> 
> 
> On Mon, Aug 18, 2014 at 12:42 PM, Karthik Kambatla <
>> ka...@cloudera.com
 
> wrote:
> 
>> Hi devs
>> 
>> Tsuyoshi just brought it to my notice that the published tarballs
>>> don't
>> have LICENSE, NOTICE and README at the top-level. Instead, they are
 only
>> under common, hdfs, etc.
>> 
>> Now that we have already announced the release and the
 jars/functionality
>> doesn't change, I propose we just update the tarballs with ones
>> that
>> includes those files? I just untar-ed the published tarballs and
>>> copied
>> LICENSE, NOTICE and README from under common to the top directory
>> and
>> tar-ed them back again.
>> 
>> The updated tarballs are at:
> http://people.apache.org/~kasha/hadoop-2.5.0/
>> . Can someone please verify the signatures?
>> 
>> If you would prefer an alternate action, please suggest.
>> 
>> Thanks
>> Karthik
>> 
>> PS: HADOOP-10956 should include the fix for these files also.
>> 
> 
> 
> 
> --
> 
> --
> Arun C. Murthy
> Hortonworks Inc.
> http://hortonworks.com/
> 
> 
 
 
>>> 
>>> 
>>> 
>>> --
>>> Alejandro
>>> 
>> 
> 
> 
> 
> -- 
> 
> --
> Arun C. Murthy
> Hortonworks Inc.
> http://hortonworks.com/
> 