[jira] [Created] (HADOOP-11295) RPC Reader thread can't be shut down if RPCCallQueue is full

2014-11-10 Thread Ming Ma (JIRA)
Ming Ma created HADOOP-11295:


 Summary: RPC Reader thread can't be shut down if RPCCallQueue is full
 Key: HADOOP-11295
 URL: https://issues.apache.org/jira/browse/HADOOP-11295
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Ming Ma


If the RPC server is asked to stop while the RPCCallQueue is full, {{reader.join()}} 
will just wait there, because:

1. The reader thread is blocked on {{callQueue.put(call);}}.
2. When the RPC server is asked to stop, it interrupts all handler threads, so no 
thread is left to drain the callQueue (see the sketch below).
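
A minimal, self-contained model of the hang, assuming a hypothetical reader/handler 
pair and a shutdown order mirroring the description above (this is not the actual 
org.apache.hadoop.ipc.Server code):

{code}
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical model only: the reader blocks in callQueue.put() while stop()
// interrupts just the handler and then joins the reader, so join() never returns.
public class ReaderShutdownHang {
  private final BlockingQueue<String> callQueue = new ArrayBlockingQueue<>(1);

  private final Thread handler = new Thread(() -> {
    try {
      while (true) {
        callQueue.take();        // drains the queue until interrupted
      }
    } catch (InterruptedException e) {
      // handler exits on interrupt; nothing drains the queue any more
    }
  });

  private final Thread reader = new Thread(() -> {
    try {
      while (true) {
        callQueue.put("call");   // blocks once the queue is full
      }
    } catch (InterruptedException e) {
      // only reached if the reader itself is interrupted
    }
  });

  public void start() {
    handler.start();
    reader.start();
  }

  // Mirrors the shutdown order in the report: interrupt the handlers, then join
  // the reader without interrupting it or draining the queue, so join() hangs.
  public void stop() throws InterruptedException {
    handler.interrupt();
    reader.join();
  }
}
{code}

In this model, either interrupting the reader before the join or draining the queue 
during shutdown would let {{reader.join()}} return; the actual fix in this JIRA may 
take a different approach.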



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Resolved] (HADOOP-11290) Typo on web page http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-common/NativeLibraries.html

2014-11-10 Thread Akira AJISAKA (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11290?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Akira AJISAKA resolved HADOOP-11290.

Resolution: Duplicate

> Typo on web page 
> http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
> 
>
> Key: HADOOP-11290
> URL: https://issues.apache.org/jira/browse/HADOOP-11290
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: documentation
>Affects Versions: 2.3.0
>Reporter: Jason Pyeron
>Priority: Minor
>
> Once you installed the prerequisite packages use the standard hadoop pom.xml 
> file and pass along the native flag to build the native hadoop library:
>$ mvn package -Pdist,native -Dskiptests -Dtar
> -Dskiptests
> should be 
> -DskipTests



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Re: [VOTE] Release Apache Hadoop 2.6.0

2014-11-10 Thread Arun Murthy
Duh!

$ chmod a+r *

Please try now. Thanks!

Arun


On Mon, Nov 10, 2014 at 7:06 PM, Tsuyoshi OZAWA 
wrote:

> Hi Arun,
>
> Could you confirm the link and permission to the files is correct? I
> got a following error:
>
>
> Forbidden
> You don't have permission to access
> /~acmurthy/hadoop-2.6.0-rc0/hadoop-2.6.0-src.tar.gz on this server.
>
> On Tue, Nov 11, 2014 at 11:52 AM, Arun C Murthy 
> wrote:
> > Folks,
> >
> > I've created a release candidate (rc0) for hadoop-2.6.0 that I would
> like to see released.
> >
> > The RC is available at:
> http://people.apache.org/~acmurthy/hadoop-2.6.0-rc0
> > The RC tag in git is: release-2.6.0-rc0
> >
> > The maven artifacts are available via repository.apache.org at
> https://repository.apache.org/content/repositories/orgapachehadoop-1012.
> >
> > Please try the release and vote; the vote will run for the usual 5 days.
> >
> > thanks,
> > Arun
> >
> >
> > --
> > CONFIDENTIALITY NOTICE
> > NOTICE: This message is intended for the use of the individual or entity
> to
> > which it is addressed and may contain information that is confidential,
> > privileged and exempt from disclosure under applicable law. If the reader
> > of this message is not the intended recipient, you are hereby notified
> that
> > any printing, copying, dissemination, distribution, disclosure or
> > forwarding of this communication is strictly prohibited. If you have
> > received this communication in error, please contact the sender
> immediately
> > and delete it from your system. Thank You.
>
>
>
> --
> - Tsuyoshi
>



-- 

--
Arun C. Murthy
Hortonworks Inc.
http://hortonworks.com/

-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.


Re: Hadoop maven packaging does not work on JAVA 1.8?

2014-11-10 Thread Ted Yu
Have created Jenkins jobs for common, hdfs and mapreduce components against
Java8.

FYI

On Mon, Nov 10, 2014 at 4:24 PM, Ted Yu  wrote:

> Created Hadoop-Yarn-trunk-Java8 and triggered a build.
>
> Can create Jenkins builds for other projects later.
>
> Cheers
>
> On Mon, Nov 10, 2014 at 1:26 PM, Andrew Wang 
> wrote:
>
>> Good idea, we should probably have such a build anyway.
>>
>> Thanks,
>> Andrew
>>
>> On Mon, Nov 10, 2014 at 1:24 PM, Ted Yu  wrote:
>>
>> > Should there be a Jenkins job building trunk branch against Java 1.8
>> after
>> > the fix goes in ?
>> >
>> > That way we can easily see any regression.
>> >
>> > Cheers
>> >
>> > On Mon, Nov 10, 2014 at 12:54 PM, Chen He  wrote:
>> >
>> > > Invite Andrew Purtell to HADOOP-11292, My fix is just disable the
>> > "doclint"
>> > > in hadoop project. Then, we can still keep current docs without
>> change.
>> > >
>> > > On Mon, Nov 10, 2014 at 12:51 PM, Andrew Wang <
>> andrew.w...@cloudera.com>
>> > > wrote:
>> > >
>> > > > I think Andrew Purtell had some patches to clean up javadoc errors
>> for
>> > > > JDK8, might be worth asking him before diving in yourself.
>> > > >
>> > > > On Mon, Nov 10, 2014 at 12:04 PM, Chen He 
>> wrote:
>> > > >
>> > > > > Thanks, Ted Yu. I will create a JIRA for it. I find a way to fix
>> it.
>> > > > >
>> > > > > On Mon, Nov 10, 2014 at 11:50 AM, Ted Yu 
>> > wrote:
>> > > > >
>> > > > > > I can reproduce this.
>> > > > > >
>> > > > > > Tried what was suggested here:
>> > > > > >
>> > > > > >
>> > > > >
>> > > >
>> > >
>> >
>> http://stackoverflow.com/questions/15886209/maven-is-not-working-in-java-8-when-javadoc-tags-are-incomplete
>> > > > > >
>> > > > > > Though it doesn't seem to work.
>> > > > > >
>> > > > > > On Mon, Nov 10, 2014 at 11:32 AM, Chen He 
>> > wrote:
>> > > > > >
>> > > > > > > "mvn package -Pdist -Dtar -DskipTests" reports following error
>> > > based
>> > > > on
>> > > > > > > latest trunk:
>> > > > > > >
>> > > > > > > [INFO] BUILD FAILURE
>> > > > > > >
>> > > > > > > [INFO]
>> > > > > > >
>> > > > >
>> > >
>> 
>> > > > > > >
>> > > > > > > [INFO] Total time: 11.010 s
>> > > > > > >
>> > > > > > > [INFO] Finished at: 2014-11-10T11:23:49-08:00
>> > > > > > >
>> > > > > > > [INFO] Final Memory: 51M/555M
>> > > > > > >
>> > > > > > > [INFO]
>> > > > > > >
>> > > > >
>> > >
>> 
>> > > > > > >
>> > > > > > > [ERROR] Failed to execute goal
>> > > > > > > org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar
>> > > > > (module-javadocs)
>> > > > > > > on project hadoop-maven-plugins: MavenReportException: Error
>> > while
>> > > > > > creating
>> > > > > > > archive:
>> > > > > > >
>> > > > > > > [ERROR] Exit code: 1 -
>> > > > > > >
>> > > > > > >
>> > > > > >
>> > > > >
>> > > >
>> > >
>> >
>> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
>> > > > > > > error: unknown tag: String
>> > > > > > >
>> > > > > > > [ERROR] * @param command List containing command and
>> all
>> > > > > > arguments
>> > > > > > >
>> > > > > > > [ERROR] ^
>> > > > > > >
>> > > > > > > [ERROR]
>> > > > > > >
>> > > > > > >
>> > > > > >
>> > > > >
>> > > >
>> > >
>> >
>> ./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
>> > > > > > > error: unknown tag: String
>> > > > > > >
>> > > > > > > [ERROR] * @param output List in/out parameter to
>> receive
>> > > > > command
>> > > > > > > output
>> > > > > > >
>> > > > > > > [ERROR] ^
>> > > > > > >
>> > > > > > > [ERROR]
>> > > > > > >
>> > > > > > >
>> > > > > >
>> > > > >
>> > > >
>> > >
>> >
>> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
>> > > > > > > error: unknown tag: File
>> > > > > > >
>> > > > > > > [ERROR] * @return List containing every element of the
>> > > FileSet
>> > > > > as a
>> > > > > > > File
>> > > > > > >
>> > > > > > > [ERROR] ^
>> > > > > > >
>> > > > > > > [ERROR]
>> > > > > > >
>> > > > > > > [ERROR] Command line was:
>> > > > > > >
>> > > > > >
>> > > > >
>> > > >
>> > >
>> >
>> /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc
>> > > > > > > -J-Dhttp.proxySet=true -J-Dhttp.proxyHost=
>> > www-proxy.us.oracle.com
>> > > > > > > -J-Dhttp.proxyPort=80 @options @packages
>> > > > > > >
>> > > > > > > [ERROR]
>> > > > > > >
>> > > > > > > [ERROR] Refer to the generated Javadoc files in
>> > > > > > > './hadoop/hadoop/hadoop-maven-plugins/target' dir.
>> > > > > > >
>> > > > > > > [ERROR] -> [Help 1]
>> > > > > > >
>> > > > > > > [ERROR]
>> > > > > > >
>> > > > > > > [ERROR] To see the full stack trace of the errors, re-run
>> Maven
>> > > with
>> > > > > the
>> > > > > > -e
>> > > > > > > switch.
>> > > > > > >
>> > > > > > > [ERROR] Re-run Maven using the -X switch to enable full debug
>> > > > logging.
>> > > > > 

Re: [VOTE] Release Apache Hadoop 2.6.0

2014-11-10 Thread Tsuyoshi OZAWA
Hi Arun,

Could you confirm that the link and permissions on the files are correct? I
got the following error:


Forbidden
You don't have permission to access
/~acmurthy/hadoop-2.6.0-rc0/hadoop-2.6.0-src.tar.gz on this server.

On Tue, Nov 11, 2014 at 11:52 AM, Arun C Murthy  wrote:
> Folks,
>
> I've created a release candidate (rc0) for hadoop-2.6.0 that I would like to 
> see released.
>
> The RC is available at: http://people.apache.org/~acmurthy/hadoop-2.6.0-rc0
> The RC tag in git is: release-2.6.0-rc0
>
> The maven artifacts are available via repository.apache.org at 
> https://repository.apache.org/content/repositories/orgapachehadoop-1012.
>
> Please try the release and vote; the vote will run for the usual 5 days.
>
> thanks,
> Arun
>
>
> --
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity to
> which it is addressed and may contain information that is confidential,
> privileged and exempt from disclosure under applicable law. If the reader
> of this message is not the intended recipient, you are hereby notified that
> any printing, copying, dissemination, distribution, disclosure or
> forwarding of this communication is strictly prohibited. If you have
> received this communication in error, please contact the sender immediately
> and delete it from your system. Thank You.



-- 
- Tsuyoshi


[VOTE] Release Apache Hadoop 2.6.0

2014-11-10 Thread Arun C Murthy
Folks,

I've created a release candidate (rc0) for hadoop-2.6.0 that I would like to 
see released.

The RC is available at: http://people.apache.org/~acmurthy/hadoop-2.6.0-rc0
The RC tag in git is: release-2.6.0-rc0

The maven artifacts are available via repository.apache.org at 
https://repository.apache.org/content/repositories/orgapachehadoop-1012.

Please try the release and vote; the vote will run for the usual 5 days.

thanks,
Arun


-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.


Re: Hadoop maven packaging does not work on JAVA 1.8?

2014-11-10 Thread Ted Yu
Created Hadoop-Yarn-trunk-Java8 and triggered a build.

Can create Jenkins builds for other projects later.

Cheers

On Mon, Nov 10, 2014 at 1:26 PM, Andrew Wang 
wrote:

> Good idea, we should probably have such a build anyway.
>
> Thanks,
> Andrew
>
> On Mon, Nov 10, 2014 at 1:24 PM, Ted Yu  wrote:
>
> > Should there be a Jenkins job building trunk branch against Java 1.8
> after
> > the fix goes in ?
> >
> > That way we can easily see any regression.
> >
> > Cheers
> >
> > On Mon, Nov 10, 2014 at 12:54 PM, Chen He  wrote:
> >
> > > Invite Andrew Purtell to HADOOP-11292, My fix is just disable the
> > "doclint"
> > > in hadoop project. Then, we can still keep current docs without change.
> > >
> > > On Mon, Nov 10, 2014 at 12:51 PM, Andrew Wang <
> andrew.w...@cloudera.com>
> > > wrote:
> > >
> > > > I think Andrew Purtell had some patches to clean up javadoc errors
> for
> > > > JDK8, might be worth asking him before diving in yourself.
> > > >
> > > > On Mon, Nov 10, 2014 at 12:04 PM, Chen He  wrote:
> > > >
> > > > > Thanks, Ted Yu. I will create a JIRA for it. I find a way to fix
> it.
> > > > >
> > > > > On Mon, Nov 10, 2014 at 11:50 AM, Ted Yu 
> > wrote:
> > > > >
> > > > > > I can reproduce this.
> > > > > >
> > > > > > Tried what was suggested here:
> > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
> http://stackoverflow.com/questions/15886209/maven-is-not-working-in-java-8-when-javadoc-tags-are-incomplete
> > > > > >
> > > > > > Though it doesn't seem to work.
> > > > > >
> > > > > > On Mon, Nov 10, 2014 at 11:32 AM, Chen He 
> > wrote:
> > > > > >
> > > > > > > "mvn package -Pdist -Dtar -DskipTests" reports following error
> > > based
> > > > on
> > > > > > > latest trunk:
> > > > > > >
> > > > > > > [INFO] BUILD FAILURE
> > > > > > >
> > > > > > > [INFO]
> > > > > > >
> > > > >
> > >
> 
> > > > > > >
> > > > > > > [INFO] Total time: 11.010 s
> > > > > > >
> > > > > > > [INFO] Finished at: 2014-11-10T11:23:49-08:00
> > > > > > >
> > > > > > > [INFO] Final Memory: 51M/555M
> > > > > > >
> > > > > > > [INFO]
> > > > > > >
> > > > >
> > >
> 
> > > > > > >
> > > > > > > [ERROR] Failed to execute goal
> > > > > > > org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar
> > > > > (module-javadocs)
> > > > > > > on project hadoop-maven-plugins: MavenReportException: Error
> > while
> > > > > > creating
> > > > > > > archive:
> > > > > > >
> > > > > > > [ERROR] Exit code: 1 -
> > > > > > >
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
> > > > > > > error: unknown tag: String
> > > > > > >
> > > > > > > [ERROR] * @param command List containing command and
> all
> > > > > > arguments
> > > > > > >
> > > > > > > [ERROR] ^
> > > > > > >
> > > > > > > [ERROR]
> > > > > > >
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
> ./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
> > > > > > > error: unknown tag: String
> > > > > > >
> > > > > > > [ERROR] * @param output List in/out parameter to
> receive
> > > > > command
> > > > > > > output
> > > > > > >
> > > > > > > [ERROR] ^
> > > > > > >
> > > > > > > [ERROR]
> > > > > > >
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
> > > > > > > error: unknown tag: File
> > > > > > >
> > > > > > > [ERROR] * @return List containing every element of the
> > > FileSet
> > > > > as a
> > > > > > > File
> > > > > > >
> > > > > > > [ERROR] ^
> > > > > > >
> > > > > > > [ERROR]
> > > > > > >
> > > > > > > [ERROR] Command line was:
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
> /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc
> > > > > > > -J-Dhttp.proxySet=true -J-Dhttp.proxyHost=
> > www-proxy.us.oracle.com
> > > > > > > -J-Dhttp.proxyPort=80 @options @packages
> > > > > > >
> > > > > > > [ERROR]
> > > > > > >
> > > > > > > [ERROR] Refer to the generated Javadoc files in
> > > > > > > './hadoop/hadoop/hadoop-maven-plugins/target' dir.
> > > > > > >
> > > > > > > [ERROR] -> [Help 1]
> > > > > > >
> > > > > > > [ERROR]
> > > > > > >
> > > > > > > [ERROR] To see the full stack trace of the errors, re-run Maven
> > > with
> > > > > the
> > > > > > -e
> > > > > > > switch.
> > > > > > >
> > > > > > > [ERROR] Re-run Maven using the -X switch to enable full debug
> > > > logging.
> > > > > > >
> > > > > > > [ERROR]
> > > > > > >
> > > > > > > [ERROR] For more information about the errors and possible
> > > solutions,
> > > > > > > please read the following articles:
> > > > > > >
> > > > > > > [ERROR] [Help 1]
> > > > > > >
> > > > >
> > >
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> >

[jira] [Created] (HADOOP-11294) Nfs3FileAttributes should not change the values of nlink and size in the constructor

2014-11-10 Thread Brandon Li (JIRA)
Brandon Li created HADOOP-11294:
---

 Summary: Nfs3FileAttributes should not change the values of nlink 
and size in the constructor 
 Key: HADOOP-11294
 URL: https://issues.apache.org/jira/browse/HADOOP-11294
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Brandon Li
Assignee: Brandon Li
Priority: Minor


Instead, it should just take the values passed in.
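
For illustration only, a rough sketch of the intended behavior; the names and 
layout are hypothetical and the real class has more attributes:

{code}
// Hypothetical sketch only, not the actual Nfs3FileAttributes code.
public class Nfs3FileAttributesSketch {
  private final int nlink;
  private final long size;

  // The constructor should simply store the values passed in, rather than
  // replacing them with values computed inside the constructor.
  public Nfs3FileAttributesSketch(int nlink, long size) {
    this.nlink = nlink;
    this.size = size;
  }

  public int getNlink() { return nlink; }
  public long getSize() { return size; }
}
{code}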



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Re: Hadoop maven packaging does not work on JAVA 1.8?

2014-11-10 Thread Andrew Wang
Good idea, we should probably have such a build anyway.

Thanks,
Andrew

On Mon, Nov 10, 2014 at 1:24 PM, Ted Yu  wrote:

> Should there be a Jenkins job building trunk branch against Java 1.8 after
> the fix goes in ?
>
> That way we can easily see any regression.
>
> Cheers
>
> On Mon, Nov 10, 2014 at 12:54 PM, Chen He  wrote:
>
> > Invite Andrew Purtell to HADOOP-11292, My fix is just disable the
> "doclint"
> > in hadoop project. Then, we can still keep current docs without change.
> >
> > On Mon, Nov 10, 2014 at 12:51 PM, Andrew Wang 
> > wrote:
> >
> > > I think Andrew Purtell had some patches to clean up javadoc errors for
> > > JDK8, might be worth asking him before diving in yourself.
> > >
> > > On Mon, Nov 10, 2014 at 12:04 PM, Chen He  wrote:
> > >
> > > > Thanks, Ted Yu. I will create a JIRA for it. I find a way to fix it.
> > > >
> > > > On Mon, Nov 10, 2014 at 11:50 AM, Ted Yu 
> wrote:
> > > >
> > > > > I can reproduce this.
> > > > >
> > > > > Tried what was suggested here:
> > > > >
> > > > >
> > > >
> > >
> >
> http://stackoverflow.com/questions/15886209/maven-is-not-working-in-java-8-when-javadoc-tags-are-incomplete
> > > > >
> > > > > Though it doesn't seem to work.
> > > > >
> > > > > On Mon, Nov 10, 2014 at 11:32 AM, Chen He 
> wrote:
> > > > >
> > > > > > "mvn package -Pdist -Dtar -DskipTests" reports following error
> > based
> > > on
> > > > > > latest trunk:
> > > > > >
> > > > > > [INFO] BUILD FAILURE
> > > > > >
> > > > > > [INFO]
> > > > > >
> > > >
> > 
> > > > > >
> > > > > > [INFO] Total time: 11.010 s
> > > > > >
> > > > > > [INFO] Finished at: 2014-11-10T11:23:49-08:00
> > > > > >
> > > > > > [INFO] Final Memory: 51M/555M
> > > > > >
> > > > > > [INFO]
> > > > > >
> > > >
> > 
> > > > > >
> > > > > > [ERROR] Failed to execute goal
> > > > > > org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar
> > > > (module-javadocs)
> > > > > > on project hadoop-maven-plugins: MavenReportException: Error
> while
> > > > > creating
> > > > > > archive:
> > > > > >
> > > > > > [ERROR] Exit code: 1 -
> > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
> > > > > > error: unknown tag: String
> > > > > >
> > > > > > [ERROR] * @param command List containing command and all
> > > > > arguments
> > > > > >
> > > > > > [ERROR] ^
> > > > > >
> > > > > > [ERROR]
> > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
> ./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
> > > > > > error: unknown tag: String
> > > > > >
> > > > > > [ERROR] * @param output List in/out parameter to receive
> > > > command
> > > > > > output
> > > > > >
> > > > > > [ERROR] ^
> > > > > >
> > > > > > [ERROR]
> > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
> > > > > > error: unknown tag: File
> > > > > >
> > > > > > [ERROR] * @return List containing every element of the
> > FileSet
> > > > as a
> > > > > > File
> > > > > >
> > > > > > [ERROR] ^
> > > > > >
> > > > > > [ERROR]
> > > > > >
> > > > > > [ERROR] Command line was:
> > > > > >
> > > > >
> > > >
> > >
> >
> /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc
> > > > > > -J-Dhttp.proxySet=true -J-Dhttp.proxyHost=
> www-proxy.us.oracle.com
> > > > > > -J-Dhttp.proxyPort=80 @options @packages
> > > > > >
> > > > > > [ERROR]
> > > > > >
> > > > > > [ERROR] Refer to the generated Javadoc files in
> > > > > > './hadoop/hadoop/hadoop-maven-plugins/target' dir.
> > > > > >
> > > > > > [ERROR] -> [Help 1]
> > > > > >
> > > > > > [ERROR]
> > > > > >
> > > > > > [ERROR] To see the full stack trace of the errors, re-run Maven
> > with
> > > > the
> > > > > -e
> > > > > > switch.
> > > > > >
> > > > > > [ERROR] Re-run Maven using the -X switch to enable full debug
> > > logging.
> > > > > >
> > > > > > [ERROR]
> > > > > >
> > > > > > [ERROR] For more information about the errors and possible
> > solutions,
> > > > > > please read the following articles:
> > > > > >
> > > > > > [ERROR] [Help 1]
> > > > > >
> > > >
> > http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> > > > > >
> > > > > > [ERROR]
> > > > > >
> > > > > > [ERROR] After correcting the problems, you can resume the build
> > with
> > > > the
> > > > > > command
> > > > > >
> > > > > > [ERROR]   mvn  -rf :hadoop-maven-plugins
> > > > > >
> > > > >
> > > >
> > >
> >
>


Re: Hadoop maven packaging does not work on JAVA 1.8?

2014-11-10 Thread Ted Yu
Should there be a Jenkins job building the trunk branch against Java 1.8 after
the fix goes in?

That way we can easily see any regression.

Cheers

On Mon, Nov 10, 2014 at 12:54 PM, Chen He  wrote:

> Invite Andrew Purtell to HADOOP-11292, My fix is just disable the "doclint"
> in hadoop project. Then, we can still keep current docs without change.
>
> On Mon, Nov 10, 2014 at 12:51 PM, Andrew Wang 
> wrote:
>
> > I think Andrew Purtell had some patches to clean up javadoc errors for
> > JDK8, might be worth asking him before diving in yourself.
> >
> > On Mon, Nov 10, 2014 at 12:04 PM, Chen He  wrote:
> >
> > > Thanks, Ted Yu. I will create a JIRA for it. I find a way to fix it.
> > >
> > > On Mon, Nov 10, 2014 at 11:50 AM, Ted Yu  wrote:
> > >
> > > > I can reproduce this.
> > > >
> > > > Tried what was suggested here:
> > > >
> > > >
> > >
> >
> http://stackoverflow.com/questions/15886209/maven-is-not-working-in-java-8-when-javadoc-tags-are-incomplete
> > > >
> > > > Though it doesn't seem to work.
> > > >
> > > > On Mon, Nov 10, 2014 at 11:32 AM, Chen He  wrote:
> > > >
> > > > > "mvn package -Pdist -Dtar -DskipTests" reports following error
> based
> > on
> > > > > latest trunk:
> > > > >
> > > > > [INFO] BUILD FAILURE
> > > > >
> > > > > [INFO]
> > > > >
> > >
> 
> > > > >
> > > > > [INFO] Total time: 11.010 s
> > > > >
> > > > > [INFO] Finished at: 2014-11-10T11:23:49-08:00
> > > > >
> > > > > [INFO] Final Memory: 51M/555M
> > > > >
> > > > > [INFO]
> > > > >
> > >
> 
> > > > >
> > > > > [ERROR] Failed to execute goal
> > > > > org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar
> > > (module-javadocs)
> > > > > on project hadoop-maven-plugins: MavenReportException: Error while
> > > > creating
> > > > > archive:
> > > > >
> > > > > [ERROR] Exit code: 1 -
> > > > >
> > > > >
> > > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
> > > > > error: unknown tag: String
> > > > >
> > > > > [ERROR] * @param command List containing command and all
> > > > arguments
> > > > >
> > > > > [ERROR] ^
> > > > >
> > > > > [ERROR]
> > > > >
> > > > >
> > > >
> > >
> >
> ./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
> > > > > error: unknown tag: String
> > > > >
> > > > > [ERROR] * @param output List in/out parameter to receive
> > > command
> > > > > output
> > > > >
> > > > > [ERROR] ^
> > > > >
> > > > > [ERROR]
> > > > >
> > > > >
> > > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
> > > > > error: unknown tag: File
> > > > >
> > > > > [ERROR] * @return List containing every element of the
> FileSet
> > > as a
> > > > > File
> > > > >
> > > > > [ERROR] ^
> > > > >
> > > > > [ERROR]
> > > > >
> > > > > [ERROR] Command line was:
> > > > >
> > > >
> > >
> >
> /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc
> > > > > -J-Dhttp.proxySet=true -J-Dhttp.proxyHost=www-proxy.us.oracle.com
> > > > > -J-Dhttp.proxyPort=80 @options @packages
> > > > >
> > > > > [ERROR]
> > > > >
> > > > > [ERROR] Refer to the generated Javadoc files in
> > > > > './hadoop/hadoop/hadoop-maven-plugins/target' dir.
> > > > >
> > > > > [ERROR] -> [Help 1]
> > > > >
> > > > > [ERROR]
> > > > >
> > > > > [ERROR] To see the full stack trace of the errors, re-run Maven
> with
> > > the
> > > > -e
> > > > > switch.
> > > > >
> > > > > [ERROR] Re-run Maven using the -X switch to enable full debug
> > logging.
> > > > >
> > > > > [ERROR]
> > > > >
> > > > > [ERROR] For more information about the errors and possible
> solutions,
> > > > > please read the following articles:
> > > > >
> > > > > [ERROR] [Help 1]
> > > > >
> > >
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> > > > >
> > > > > [ERROR]
> > > > >
> > > > > [ERROR] After correcting the problems, you can resume the build
> with
> > > the
> > > > > command
> > > > >
> > > > > [ERROR]   mvn  -rf :hadoop-maven-plugins
> > > > >
> > > >
> > >
> >
>


Re: Hadoop maven packaging does not work on JAVA 1.8?

2014-11-10 Thread Chen He
I invited Andrew Purtell to HADOOP-11292. My fix just disables "doclint" in the
hadoop project, so we can still keep the current docs without change.

On Mon, Nov 10, 2014 at 12:51 PM, Andrew Wang 
wrote:

> I think Andrew Purtell had some patches to clean up javadoc errors for
> JDK8, might be worth asking him before diving in yourself.
>
> On Mon, Nov 10, 2014 at 12:04 PM, Chen He  wrote:
>
> > Thanks, Ted Yu. I will create a JIRA for it. I find a way to fix it.
> >
> > On Mon, Nov 10, 2014 at 11:50 AM, Ted Yu  wrote:
> >
> > > I can reproduce this.
> > >
> > > Tried what was suggested here:
> > >
> > >
> >
> http://stackoverflow.com/questions/15886209/maven-is-not-working-in-java-8-when-javadoc-tags-are-incomplete
> > >
> > > Though it doesn't seem to work.
> > >
> > > On Mon, Nov 10, 2014 at 11:32 AM, Chen He  wrote:
> > >
> > > > "mvn package -Pdist -Dtar -DskipTests" reports following error based
> on
> > > > latest trunk:
> > > >
> > > > [INFO] BUILD FAILURE
> > > >
> > > > [INFO]
> > > >
> > 
> > > >
> > > > [INFO] Total time: 11.010 s
> > > >
> > > > [INFO] Finished at: 2014-11-10T11:23:49-08:00
> > > >
> > > > [INFO] Final Memory: 51M/555M
> > > >
> > > > [INFO]
> > > >
> > 
> > > >
> > > > [ERROR] Failed to execute goal
> > > > org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar
> > (module-javadocs)
> > > > on project hadoop-maven-plugins: MavenReportException: Error while
> > > creating
> > > > archive:
> > > >
> > > > [ERROR] Exit code: 1 -
> > > >
> > > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
> > > > error: unknown tag: String
> > > >
> > > > [ERROR] * @param command List containing command and all
> > > arguments
> > > >
> > > > [ERROR] ^
> > > >
> > > > [ERROR]
> > > >
> > > >
> > >
> >
> ./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
> > > > error: unknown tag: String
> > > >
> > > > [ERROR] * @param output List in/out parameter to receive
> > command
> > > > output
> > > >
> > > > [ERROR] ^
> > > >
> > > > [ERROR]
> > > >
> > > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
> > > > error: unknown tag: File
> > > >
> > > > [ERROR] * @return List containing every element of the FileSet
> > as a
> > > > File
> > > >
> > > > [ERROR] ^
> > > >
> > > > [ERROR]
> > > >
> > > > [ERROR] Command line was:
> > > >
> > >
> >
> /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc
> > > > -J-Dhttp.proxySet=true -J-Dhttp.proxyHost=www-proxy.us.oracle.com
> > > > -J-Dhttp.proxyPort=80 @options @packages
> > > >
> > > > [ERROR]
> > > >
> > > > [ERROR] Refer to the generated Javadoc files in
> > > > './hadoop/hadoop/hadoop-maven-plugins/target' dir.
> > > >
> > > > [ERROR] -> [Help 1]
> > > >
> > > > [ERROR]
> > > >
> > > > [ERROR] To see the full stack trace of the errors, re-run Maven with
> > the
> > > -e
> > > > switch.
> > > >
> > > > [ERROR] Re-run Maven using the -X switch to enable full debug
> logging.
> > > >
> > > > [ERROR]
> > > >
> > > > [ERROR] For more information about the errors and possible solutions,
> > > > please read the following articles:
> > > >
> > > > [ERROR] [Help 1]
> > > >
> > http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> > > >
> > > > [ERROR]
> > > >
> > > > [ERROR] After correcting the problems, you can resume the build with
> > the
> > > > command
> > > >
> > > > [ERROR]   mvn  -rf :hadoop-maven-plugins
> > > >
> > >
> >
>


Re: Hadoop maven packaging does not work on JAVA 1.8?

2014-11-10 Thread Andrew Wang
I think Andrew Purtell had some patches to clean up javadoc errors for
JDK8; it might be worth asking him before diving in yourself.
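
For context, JDK 8 runs javadoc with doclint enabled, and doclint rejects
unescaped generics such as List<String> in @param tags (that is what produces
the "error: unknown tag: String" messages in the log below). A rough sketch of
the kind of cleanup such patches make, on a purely hypothetical method rather
than the actual Exec.java change:

import java.util.List;

public class JavadocGenericsExample {
  // Before: doclint reads the raw generic as an HTML tag and fails, e.g.
  //   @param command List<String> containing command and all arguments
  //
  // After: wrap the generic in {@code ...} so javadoc treats it literally.

  /**
   * Runs the given command (illustrative stub only).
   *
   * @param command {@code List<String>} containing the command and all arguments
   * @param output  {@code List<String>} in/out parameter that receives the output
   * @return the command's exit code
   */
  public int run(List<String> command, List<String> output) {
    return 0; // stub
  }
}

The alternative discussed in HADOOP-11292 is to disable doclint (javadoc's
-Xdoclint:none option) so the existing comments keep building unchanged.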

On Mon, Nov 10, 2014 at 12:04 PM, Chen He  wrote:

> Thanks, Ted Yu. I will create a JIRA for it. I find a way to fix it.
>
> On Mon, Nov 10, 2014 at 11:50 AM, Ted Yu  wrote:
>
> > I can reproduce this.
> >
> > Tried what was suggested here:
> >
> >
> http://stackoverflow.com/questions/15886209/maven-is-not-working-in-java-8-when-javadoc-tags-are-incomplete
> >
> > Though it doesn't seem to work.
> >
> > On Mon, Nov 10, 2014 at 11:32 AM, Chen He  wrote:
> >
> > > "mvn package -Pdist -Dtar -DskipTests" reports following error based on
> > > latest trunk:
> > >
> > > [INFO] BUILD FAILURE
> > >
> > > [INFO]
> > >
> 
> > >
> > > [INFO] Total time: 11.010 s
> > >
> > > [INFO] Finished at: 2014-11-10T11:23:49-08:00
> > >
> > > [INFO] Final Memory: 51M/555M
> > >
> > > [INFO]
> > >
> 
> > >
> > > [ERROR] Failed to execute goal
> > > org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar
> (module-javadocs)
> > > on project hadoop-maven-plugins: MavenReportException: Error while
> > creating
> > > archive:
> > >
> > > [ERROR] Exit code: 1 -
> > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
> > > error: unknown tag: String
> > >
> > > [ERROR] * @param command List containing command and all
> > arguments
> > >
> > > [ERROR] ^
> > >
> > > [ERROR]
> > >
> > >
> >
> ./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
> > > error: unknown tag: String
> > >
> > > [ERROR] * @param output List in/out parameter to receive
> command
> > > output
> > >
> > > [ERROR] ^
> > >
> > > [ERROR]
> > >
> > >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
> > > error: unknown tag: File
> > >
> > > [ERROR] * @return List containing every element of the FileSet
> as a
> > > File
> > >
> > > [ERROR] ^
> > >
> > > [ERROR]
> > >
> > > [ERROR] Command line was:
> > >
> >
> /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc
> > > -J-Dhttp.proxySet=true -J-Dhttp.proxyHost=www-proxy.us.oracle.com
> > > -J-Dhttp.proxyPort=80 @options @packages
> > >
> > > [ERROR]
> > >
> > > [ERROR] Refer to the generated Javadoc files in
> > > './hadoop/hadoop/hadoop-maven-plugins/target' dir.
> > >
> > > [ERROR] -> [Help 1]
> > >
> > > [ERROR]
> > >
> > > [ERROR] To see the full stack trace of the errors, re-run Maven with
> the
> > -e
> > > switch.
> > >
> > > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> > >
> > > [ERROR]
> > >
> > > [ERROR] For more information about the errors and possible solutions,
> > > please read the following articles:
> > >
> > > [ERROR] [Help 1]
> > >
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> > >
> > > [ERROR]
> > >
> > > [ERROR] After correcting the problems, you can resume the build with
> the
> > > command
> > >
> > > [ERROR]   mvn  -rf :hadoop-maven-plugins
> > >
> >
>


[jira] [Created] (HADOOP-11293) Factor OSType out from Shell

2014-11-10 Thread Yongjun Zhang (JIRA)
Yongjun Zhang created HADOOP-11293:
--

 Summary: Factor OSType out from Shell
 Key: HADOOP-11293
 URL: https://issues.apache.org/jira/browse/HADOOP-11293
 Project: Hadoop Common
  Issue Type: Improvement
  Components: util
Reporter: Yongjun Zhang
Assignee: Yongjun Zhang


Currently the code that detects the OS type is located in Shell.java. Code that 
needs to check the OS type has to refer to Shell, even if nothing else from Shell 
is needed.

I am proposing to factor OSType out into its own class, to make the OS type easier 
to access and the dependency cleaner.
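
A rough sketch of what the factored-out type could look like; the names and the 
detection logic are illustrative only and may differ from the eventual patch:

{code}
// Illustrative sketch of a standalone OSType, detected from the os.name property.
public enum OSType {
  OS_TYPE_LINUX,
  OS_TYPE_WIN,
  OS_TYPE_SOLARIS,
  OS_TYPE_MAC,
  OS_TYPE_FREEBSD,
  OS_TYPE_OTHER;

  private static final String OS_NAME = System.getProperty("os.name");

  /** Returns the OS type of the current JVM, based on the os.name property. */
  public static OSType current() {
    if (OS_NAME.startsWith("Windows")) {
      return OS_TYPE_WIN;
    } else if (OS_NAME.contains("SunOS") || OS_NAME.contains("Solaris")) {
      return OS_TYPE_SOLARIS;
    } else if (OS_NAME.contains("Mac")) {
      return OS_TYPE_MAC;
    } else if (OS_NAME.contains("FreeBSD")) {
      return OS_TYPE_FREEBSD;
    } else if (OS_NAME.startsWith("Linux")) {
      return OS_TYPE_LINUX;
    }
    return OS_TYPE_OTHER;
  }
}
{code}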
 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (HADOOP-11292) "mvn package" reports error when using Java 1.8

2014-11-10 Thread Chen He (JIRA)
Chen He created HADOOP-11292:


 Summary: "mvm package" reports error when using Java 1.8 
 Key: HADOOP-11292
 URL: https://issues.apache.org/jira/browse/HADOOP-11292
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Chen He
Assignee: Chen He


"mvn package -Pdist -Dtar -DskipTests" reports the following error based on the 
latest trunk:

[INFO] BUILD FAILURE

[INFO] 

[INFO] Total time: 11.010 s

[INFO] Finished at: 2014-11-10T11:23:49-08:00

[INFO] Final Memory: 51M/555M

[INFO] 

[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar (module-javadocs) on 
project hadoop-maven-plugins: MavenReportException: Error while creating 
archive:

[ERROR] Exit code: 1 - 
./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
 error: unknown tag: String

[ERROR] * @param command List containing command and all arguments

[ERROR] ^

[ERROR] 
./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
 error: unknown tag: String

[ERROR] * @param output List in/out parameter to receive command output

[ERROR] ^

[ERROR] 
./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
 error: unknown tag: File

[ERROR] * @return List containing every element of the FileSet as a File

[ERROR] ^

[ERROR] 

[ERROR] Command line was: 
/Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc 
-J-Dhttp.proxySet=true -J-Dhttp.proxyHost=www-proxy.us.oracle.com 
-J-Dhttp.proxyPort=80 @options @packages

[ERROR] 

[ERROR] Refer to the generated Javadoc files in 
'./hadoop/hadoop/hadoop-maven-plugins/target' dir.

[ERROR] -> [Help 1]

[ERROR] 

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.

[ERROR] Re-run Maven using the -X switch to enable full debug logging.

[ERROR] 

[ERROR] For more information about the errors and possible solutions, please 
read the following articles:

[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

[ERROR] 

[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn  -rf :hadoop-maven-plugins



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Re: Hadoop maven packaging does not work on JAVA 1.8?

2014-11-10 Thread Chen He
Thanks, Ted Yu. I will create a JIRA for it. I have found a way to fix it.

On Mon, Nov 10, 2014 at 11:50 AM, Ted Yu  wrote:

> I can reproduce this.
>
> Tried what was suggested here:
>
> http://stackoverflow.com/questions/15886209/maven-is-not-working-in-java-8-when-javadoc-tags-are-incomplete
>
> Though it doesn't seem to work.
>
> On Mon, Nov 10, 2014 at 11:32 AM, Chen He  wrote:
>
> > "mvn package -Pdist -Dtar -DskipTests" reports following error based on
> > latest trunk:
> >
> > [INFO] BUILD FAILURE
> >
> > [INFO]
> > 
> >
> > [INFO] Total time: 11.010 s
> >
> > [INFO] Finished at: 2014-11-10T11:23:49-08:00
> >
> > [INFO] Final Memory: 51M/555M
> >
> > [INFO]
> > 
> >
> > [ERROR] Failed to execute goal
> > org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar (module-javadocs)
> > on project hadoop-maven-plugins: MavenReportException: Error while
> creating
> > archive:
> >
> > [ERROR] Exit code: 1 -
> >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
> > error: unknown tag: String
> >
> > [ERROR] * @param command List containing command and all
> arguments
> >
> > [ERROR] ^
> >
> > [ERROR]
> >
> >
> ./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
> > error: unknown tag: String
> >
> > [ERROR] * @param output List in/out parameter to receive command
> > output
> >
> > [ERROR] ^
> >
> > [ERROR]
> >
> >
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
> > error: unknown tag: File
> >
> > [ERROR] * @return List containing every element of the FileSet as a
> > File
> >
> > [ERROR] ^
> >
> > [ERROR]
> >
> > [ERROR] Command line was:
> >
> /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc
> > -J-Dhttp.proxySet=true -J-Dhttp.proxyHost=www-proxy.us.oracle.com
> > -J-Dhttp.proxyPort=80 @options @packages
> >
> > [ERROR]
> >
> > [ERROR] Refer to the generated Javadoc files in
> > './hadoop/hadoop/hadoop-maven-plugins/target' dir.
> >
> > [ERROR] -> [Help 1]
> >
> > [ERROR]
> >
> > [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e
> > switch.
> >
> > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >
> > [ERROR]
> >
> > [ERROR] For more information about the errors and possible solutions,
> > please read the following articles:
> >
> > [ERROR] [Help 1]
> > http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> >
> > [ERROR]
> >
> > [ERROR] After correcting the problems, you can resume the build with the
> > command
> >
> > [ERROR]   mvn  -rf :hadoop-maven-plugins
> >
>


[jira] [Created] (HADOOP-11291) Log the cause of SASL connection failures

2014-11-10 Thread Stephen Chu (JIRA)
Stephen Chu created HADOOP-11291:


 Summary: Log the cause of SASL connection failures
 Key: HADOOP-11291
 URL: https://issues.apache.org/jira/browse/HADOOP-11291
 Project: Hadoop Common
  Issue Type: Improvement
  Components: security
Affects Versions: 2.5.0
Reporter: Stephen Chu
Assignee: Stephen Chu
Priority: Minor


{{UGI#doAs}} will no longer log a PriviledgedActionException unless 
LOG.isDebugEnabled() == true. HADOOP-10015 made this change because it was 
decided that users calling {{UGI#doAs}} should be responsible for logging the 
error when catching an exception. Also, the log was confusing in certain 
situations (see more details in HADOOP-10015).

However, as Daryn noted, this log message was very helpful in cases of 
debugging security issues.

As an example, we used to see this in the DN logs before HADOOP-10015:
{code}
2014-10-20 11:28:02,112 WARN org.apache.hadoop.security.UserGroupInformation: 
PriviledgedActionException as:hdfs/hosta@realm.com (auth:KERBEROS) 
cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by 
GSSException: No valid credentials provided (Mechanism level: Generic error 
(description in e-text) (60) - NO PREAUTH)]
2014-10-20 11:28:02,112 WARN org.apache.hadoop.ipc.Client: Couldn't setup 
connection for hdfs/hosta@realm.com to hostB.com/101.01.010:8022
2014-10-20 11:28:02,112 WARN org.apache.hadoop.security.UserGroupInformation: 
PriviledgedActionException as:hdfs/hosta@realm.com (auth:KERBEROS) 
cause:java.io.IOException: Couldn't setup connection for 
hdfs/hosta@realm.com to hostB.com/101.01.010:8022
{code}

After the fix went in and the DN was upgraded, it only logs:
{code}
2014-10-20 14:11:40,712 WARN org.apache.hadoop.ipc.Client: Couldn't setup 
connection for hdfs/hosta@realm.com to hostB.com/101.01.010:8022
2014-10-20 14:11:40,713 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: 
Problem connecting to server: hostB.com/101.01.010:8022
{code}

It'd be good to add more logging information about the cause of a SASL 
connection failure.
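
A minimal sketch of the kind of logging this could add, as a hypothetical helper 
(the actual patch may log in a different place or at a different level):

{code}
import java.io.IOException;
import javax.security.sasl.SaslException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

// Hypothetical helper, for illustration only: log the SASL failure together with
// its cause before wrapping it, so the underlying GSSException text (e.g. "No
// valid credentials provided ...") is visible at WARN level without debug logging.
public class SaslFailureLogger {
  private static final Log LOG = LogFactory.getLog(SaslFailureLogger.class);

  public static IOException logAndWrap(String user, String server, SaslException cause) {
    String msg = "Couldn't setup connection for " + user + " to " + server;
    LOG.warn(msg, cause);   // passing the cause keeps the full chain in the log
    return new IOException(msg, cause);
  }
}
{code}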

Thanks to [~qwertymaniac] for reporting this.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Re: Hadoop maven packaging does not work on JAVA 1.8?

2014-11-10 Thread Ted Yu
I can reproduce this.

I tried what was suggested here:
http://stackoverflow.com/questions/15886209/maven-is-not-working-in-java-8-when-javadoc-tags-are-incomplete

It doesn't seem to work, though.

On Mon, Nov 10, 2014 at 11:32 AM, Chen He  wrote:

> "mvn package -Pdist -Dtar -DskipTests" reports following error based on
> latest trunk:
>
> [INFO] BUILD FAILURE
>
> [INFO]
> 
>
> [INFO] Total time: 11.010 s
>
> [INFO] Finished at: 2014-11-10T11:23:49-08:00
>
> [INFO] Final Memory: 51M/555M
>
> [INFO]
> 
>
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar (module-javadocs)
> on project hadoop-maven-plugins: MavenReportException: Error while creating
> archive:
>
> [ERROR] Exit code: 1 -
>
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
> error: unknown tag: String
>
> [ERROR] * @param command List containing command and all arguments
>
> [ERROR] ^
>
> [ERROR]
>
> ./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
> error: unknown tag: String
>
> [ERROR] * @param output List in/out parameter to receive command
> output
>
> [ERROR] ^
>
> [ERROR]
>
> ./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
> error: unknown tag: File
>
> [ERROR] * @return List containing every element of the FileSet as a
> File
>
> [ERROR] ^
>
> [ERROR]
>
> [ERROR] Command line was:
> /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc
> -J-Dhttp.proxySet=true -J-Dhttp.proxyHost=www-proxy.us.oracle.com
> -J-Dhttp.proxyPort=80 @options @packages
>
> [ERROR]
>
> [ERROR] Refer to the generated Javadoc files in
> './hadoop/hadoop/hadoop-maven-plugins/target' dir.
>
> [ERROR] -> [Help 1]
>
> [ERROR]
>
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e
> switch.
>
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>
> [ERROR]
>
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
>
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>
> [ERROR]
>
> [ERROR] After correcting the problems, you can resume the build with the
> command
>
> [ERROR]   mvn  -rf :hadoop-maven-plugins
>


Hadoop maven packaging does not work on JAVA 1.8?

2014-11-10 Thread Chen He
"mvn package -Pdist -Dtar -DskipTests" reports following error based on
latest trunk:

[INFO] BUILD FAILURE

[INFO]


[INFO] Total time: 11.010 s

[INFO] Finished at: 2014-11-10T11:23:49-08:00

[INFO] Final Memory: 51M/555M

[INFO]


[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar (module-javadocs)
on project hadoop-maven-plugins: MavenReportException: Error while creating
archive:

[ERROR] Exit code: 1 -
./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:45:
error: unknown tag: String

[ERROR] * @param command List containing command and all arguments

[ERROR] ^

[ERROR]
./develop/hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/Exec.java:46:
error: unknown tag: String

[ERROR] * @param output List in/out parameter to receive command
output

[ERROR] ^

[ERROR]
./hadoop/hadoop/hadoop-maven-plugins/src/main/java/org/apache/hadoop/maven/plugin/util/FileSetUtils.java:50:
error: unknown tag: File

[ERROR] * @return List containing every element of the FileSet as a
File

[ERROR] ^

[ERROR]

[ERROR] Command line was:
/Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/bin/javadoc
-J-Dhttp.proxySet=true -J-Dhttp.proxyHost=www-proxy.us.oracle.com
-J-Dhttp.proxyPort=80 @options @packages

[ERROR]

[ERROR] Refer to the generated Javadoc files in
'./hadoop/hadoop/hadoop-maven-plugins/target' dir.

[ERROR] -> [Help 1]

[ERROR]

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.

[ERROR] Re-run Maven using the -X switch to enable full debug logging.

[ERROR]

[ERROR] For more information about the errors and possible solutions,
please read the following articles:

[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

[ERROR]

[ERROR] After correcting the problems, you can resume the build with the
command

[ERROR]   mvn  -rf :hadoop-maven-plugins


[jira] [Created] (HADOOP-11290) Typo on web page http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-common/NativeLibraries.html

2014-11-10 Thread Jason Pyeron (JIRA)
Jason Pyeron created HADOOP-11290:
-

 Summary: Typo on web page 
http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
 Key: HADOOP-11290
 URL: https://issues.apache.org/jira/browse/HADOOP-11290
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation
Affects Versions: 2.3.0
Reporter: Jason Pyeron
Priority: Minor


Once you installed the prerequisite packages use the standard hadoop pom.xml 
file and pass along the native flag to build the native hadoop library:

   $ mvn package -Pdist,native -Dskiptests -Dtar


-Dskiptests

should be 

-DskipTests



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Re: Guava

2014-11-10 Thread Colin McCabe
I'm usually an advocate for getting rid of unnecessary dependencies
(cough, jetty, cough), but a lot of the things in Guava are really
useful.

Immutable collections, BiMap, Multisets, Arrays#asList, the stuff for
writing hashCode() and equals(), String#Joiner, the list goes on.  We
particularly use the Cache/CacheBuilder stuff a lot in HDFS to get
maps with LRU eviction without writing a lot of boilerplate.  The QJM
stuff uses ListenableFuture a lot, although perhaps we could come up
with our own equivalent for that.
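
For example, the kind of usage meant here is roughly the following illustrative
standalone sketch (not a specific HDFS class), assuming Guava 11+ on the classpath:

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import java.util.concurrent.TimeUnit;

// Illustrative only: a size-bounded cache with LRU-style eviction and idle
// expiry in a handful of lines, the boilerplate-free pattern referred to above.
public class BoundedCacheSketch {
  private final Cache<String, Long> cache = CacheBuilder.newBuilder()
      .maximumSize(1024)                       // evict least-recently-used entries
      .expireAfterAccess(10, TimeUnit.MINUTES) // drop entries idle for 10 minutes
      .build();

  public Long get(String key) {
    return cache.getIfPresent(key);
  }

  public void put(String key, Long value) {
    cache.put(key, value);
  }
}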

On Mon, Nov 10, 2014 at 9:26 AM, Alejandro Abdelnur  wrote:
> IMO we should:
>
> 1* have a clean and thin client API JAR (which does not drag any 3rd party
> dependencies, or a well defined small set -i.e. slf4j & log4j-)
> 2* have a client implementation that uses a classloader to isolate client
> impl 3rd party deps from app dependencies.
>
> #2 can be done using a stock URLClassLoader (i would just subclass it to
> forbid packages in the API JAR and exposed 3rd parties to be loaded from
> the app JAR)
>
> #1 is the tricky thing as our current API modules don't have a clean
> API/impl separation.
>
> thx
> PS: If folks are interested in pursing this, I can put together a prototype
> of how  #2 would work (I don't think it will be more than 200 lines of code)

Absolutely, I agree that we should not be using Guava types in public
APIs.  Guava has not been very responsible with backwards
compatibility, that much is clear.

A client / server jar separation is an interesting idea.  But then we
still have to get rid of Guava and other library deps in the client
jars.  I think it would be more work than it seems.  For example, the
HDFS client uses Guava Cache a lot, so we'd have to write our own
version of this.

Can't we just shade this stuff?  Has anyone tried shading Hadoop's Guava?

best,
Colin


>
>
> On Mon, Nov 10, 2014 at 5:18 AM, Steve Loughran 
> wrote:
>
>> Yes, Guava is a constant pain; there's lots of open JIRAs related to it, as
>> its the one we can't seamlessly upgrade. Not unless we do our own fork and
>> reinsert the missing classes.
>>
>> The most common uses in the code are
>>
>> @VisibleForTesting (easily replicated)
>> and the Precondition.check() operations
>>
>> The latter is also easily swapped out, and we could even add the check they
>> forgot:
>> Preconditions.checkArgNotNull(argname, arg)
>>
>>
>> These are easy; its the more complex data structures that matter more.
>>
>> I think for Hadoop 2.7 & java 7 we need to look at this problem and do
>> something. Even if we continue to ship Guava 11 so that the HBase team
>> don't send any (more) death threats, we can/should rework Hadoop to build
>> and run against Guava 16+ too. That's needed to fix some of the recent java
>> 7/8+ changes.
>>
>> -Everything in v11 dropped from v16 MUST  to be implemented with our own
>> versions.
>> -anything tagged as deprecated in 11+ SHOULD be replaced by newer stuff,
>> wherever possible.
>>
>> I think for 2.7+ we should add some new profiles to the POM, for Java 8 and
>> 9 alongside the new baseline java 7. For those later versions we could
>> perhaps mandate Guava 16.
>>
>>
>>
>> On 10 November 2014 00:42, Arun C Murthy  wrote:
>>
>> > … has been a constant pain w.r.t compatibility etc.
>> >
>> > Should we consider adopting a policy to not use guava in
>> Common/HDFS/YARN?
>> >
>> > MR doesn't matter too much since it's application-side issue, it does
>> hurt
>> > end-users though since they still might want a newer guava-version, but
>> at
>> > least they can modify MR.
>> >
>> > Thoughts?
>> >
>> > thanks,
>> > Arun
>> >
>> >
>> > --
>> > CONFIDENTIALITY NOTICE
>> > NOTICE: This message is intended for the use of the individual or entity
>> to
>> > which it is addressed and may contain information that is confidential,
>> > privileged and exempt from disclosure under applicable law. If the reader
>> > of this message is not the intended recipient, you are hereby notified
>> that
>> > any printing, copying, dissemination, distribution, disclosure or
>> > forwarding of this communication is strictly prohibited. If you have
>> > received this communication in error, please contact the sender
>> immediately
>> > and delete it from your system. Thank You.
>> >
>>
>> --
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity to
>> which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.
>>


Re: Guava

2014-11-10 Thread Sangjin Lee
FYI, we have an existing ApplicationClassLoader implementation that is used
to isolate client/task classes from the rest. If we're going down the route
of classloader isolation on this, it would be good to come up with a
coherent strategy regarding both of these.

As a more practical step, I like the idea of isolating usage of guava that
breaks with guava 16 and later. I assume (but I haven't looked into it)
that it's fairly straightforward to isolate them and fix them. That work
could be done at any time without any version upgrades or impacting users.

On Mon, Nov 10, 2014 at 9:26 AM, Alejandro Abdelnur 
wrote:

> IMO we should:
>
> 1* have a clean and thin client API JAR (which does not drag any 3rd party
> dependencies, or a well defined small set -i.e. slf4j & log4j-)
> 2* have a client implementation that uses a classloader to isolate client
> impl 3rd party deps from app dependencies.
>
> #2 can be done using a stock URLClassLoader (i would just subclass it to
> forbid packages in the API JAR and exposed 3rd parties to be loaded from
> the app JAR)
>
> #1 is the tricky thing as our current API modules don't have a clean
> API/impl separation.
>
> thx
> PS: If folks are interested in pursing this, I can put together a prototype
> of how  #2 would work (I don't think it will be more than 200 lines of
> code)
>
>
> On Mon, Nov 10, 2014 at 5:18 AM, Steve Loughran 
> wrote:
>
> > Yes, Guava is a constant pain; there's lots of open JIRAs related to it,
> as
> > its the one we can't seamlessly upgrade. Not unless we do our own fork
> and
> > reinsert the missing classes.
> >
> > The most common uses in the code are
> >
> > @VisibleForTesting (easily replicated)
> > and the Precondition.check() operations
> >
> > The latter is also easily swapped out, and we could even add the check
> they
> > forgot:
> > Preconditions.checkArgNotNull(argname, arg)
> >
> >
> > These are easy; its the more complex data structures that matter more.
> >
> > I think for Hadoop 2.7 & java 7 we need to look at this problem and do
> > something. Even if we continue to ship Guava 11 so that the HBase team
> > don't send any (more) death threats, we can/should rework Hadoop to build
> > and run against Guava 16+ too. That's needed to fix some of the recent
> java
> > 7/8+ changes.
> >
> > -Everything in v11 dropped from v16 MUST  to be implemented with our own
> > versions.
> > -anything tagged as deprecated in 11+ SHOULD be replaced by newer stuff,
> > wherever possible.
> >
> > I think for 2.7+ we should add some new profiles to the POM, for Java 8
> and
> > 9 alongside the new baseline java 7. For those later versions we could
> > perhaps mandate Guava 16.
> >
> >
> >
> > On 10 November 2014 00:42, Arun C Murthy  wrote:
> >
> > > ... has been a constant pain w.r.t compatibility etc.
> > >
> > > Should we consider adopting a policy to not use guava in
> > Common/HDFS/YARN?
> > >
> > > MR doesn't matter too much since it's application-side issue, it does
> > hurt
> > > end-users though since they still might want a newer guava-version, but
> > at
> > > least they can modify MR.
> > >
> > > Thoughts?
> > >
> > > thanks,
> > > Arun
> > >
> > >
> > > --
> > > CONFIDENTIALITY NOTICE
> > > NOTICE: This message is intended for the use of the individual or
> entity
> > to
> > > which it is addressed and may contain information that is confidential,
> > > privileged and exempt from disclosure under applicable law. If the
> reader
> > > of this message is not the intended recipient, you are hereby notified
> > that
> > > any printing, copying, dissemination, distribution, disclosure or
> > > forwarding of this communication is strictly prohibited. If you have
> > > received this communication in error, please contact the sender
> > immediately
> > > and delete it from your system. Thank You.
> > >
> >
> > --
> > CONFIDENTIALITY NOTICE
> > NOTICE: This message is intended for the use of the individual or entity
> to
> > which it is addressed and may contain information that is confidential,
> > privileged and exempt from disclosure under applicable law. If the reader
> > of this message is not the intended recipient, you are hereby notified
> that
> > any printing, copying, dissemination, distribution, disclosure or
> > forwarding of this communication is strictly prohibited. If you have
> > received this communication in error, please contact the sender
> immediately
> > and delete it from your system. Thank You.
> >
>


Re: Guava

2014-11-10 Thread Alejandro Abdelnur
IMO we should:

1* have a clean and thin client API JAR (which does not drag any 3rd party
dependencies, or a well defined small set -i.e. slf4j & log4j-)
2* have a client implementation that uses a classloader to isolate client
impl 3rd party deps from app dependencies.

#2 can be done using a stock URLClassLoader (i would just subclass it to
forbid packages in the API JAR and exposed 3rd parties to be loaded from
the app JAR)

#1 is the tricky thing as our current API modules don't have a clean
API/impl separation.

thx
PS: If folks are interested in pursuing this, I can put together a prototype
of how #2 would work (I don't think it will be more than 200 lines of code); a
rough sketch of the idea follows below.
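
A rough sketch of #2 under the assumptions above; the package prefixes are
placeholders and this is not the prototype itself:

import java.net.URL;
import java.net.URLClassLoader;

// Rough sketch: load the app's classes from its own jars first, but always
// delegate the API packages and the allowed 3rd-party packages to the parent
// (the Hadoop client impl classpath), so the impl's other dependencies never
// leak into the application.
public class IsolatingClientClassLoader extends URLClassLoader {

  // Placeholder prefixes; a real implementation would make these configurable.
  private static final String[] PARENT_FIRST = {
      "java.", "javax.", "org.apache.hadoop.", "org.slf4j.", "org.apache.log4j."
  };

  public IsolatingClientClassLoader(URL[] appJars, ClassLoader parent) {
    super(appJars, parent);
  }

  private static boolean isParentFirst(String name) {
    for (String prefix : PARENT_FIRST) {
      if (name.startsWith(prefix)) {
        return true;
      }
    }
    return false;
  }

  @Override
  protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
    if (isParentFirst(name)) {
      return super.loadClass(name, resolve);    // normal parent-first delegation
    }
    synchronized (getClassLoadingLock(name)) {
      Class<?> c = findLoadedClass(name);
      if (c == null) {
        try {
          c = findClass(name);                  // app jars first
        } catch (ClassNotFoundException e) {
          c = super.loadClass(name, resolve);   // fall back to the parent
        }
      }
      if (resolve) {
        resolveClass(c);
      }
      return c;
    }
  }
}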


On Mon, Nov 10, 2014 at 5:18 AM, Steve Loughran 
wrote:

> Yes, Guava is a constant pain; there's lots of open JIRAs related to it, as
> its the one we can't seamlessly upgrade. Not unless we do our own fork and
> reinsert the missing classes.
>
> The most common uses in the code are
>
> @VisibleForTesting (easily replicated)
> and the Precondition.check() operations
>
> The latter is also easily swapped out, and we could even add the check they
> forgot:
> Preconditions.checkArgNotNull(argname, arg)
>
>
> These are easy; its the more complex data structures that matter more.
>
> I think for Hadoop 2.7 & java 7 we need to look at this problem and do
> something. Even if we continue to ship Guava 11 so that the HBase team
> don't send any (more) death threats, we can/should rework Hadoop to build
> and run against Guava 16+ too. That's needed to fix some of the recent java
> 7/8+ changes.
>
> -Everything in v11 dropped from v16 MUST be implemented with our own
> versions.
> -Anything tagged as deprecated in 11+ SHOULD be replaced by newer stuff,
> wherever possible.
>
> I think for 2.7+ we should add some new profiles to the POM, for Java 8 and
> 9 alongside the new baseline java 7. For those later versions we could
> perhaps mandate Guava 16.
>
>
>
> On 10 November 2014 00:42, Arun C Murthy  wrote:
>
> > … has been a constant pain w.r.t compatibility etc.
> >
> > Should we consider adopting a policy to not use guava in
> Common/HDFS/YARN?
> >
> > MR doesn't matter too much since it's an application-side issue, it does
> > hurt end-users though since they still might want a newer guava-version,
> > but at least they can modify MR.
> >
> > Thoughts?
> >
> > thanks,
> > Arun
> >
> >
> > --
> > CONFIDENTIALITY NOTICE
> > NOTICE: This message is intended for the use of the individual or entity
> to
> > which it is addressed and may contain information that is confidential,
> > privileged and exempt from disclosure under applicable law. If the reader
> > of this message is not the intended recipient, you are hereby notified
> that
> > any printing, copying, dissemination, distribution, disclosure or
> > forwarding of this communication is strictly prohibited. If you have
> > received this communication in error, please contact the sender
> immediately
> > and delete it from your system. Thank You.
> >
>
> --
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity to
> which it is addressed and may contain information that is confidential,
> privileged and exempt from disclosure under applicable law. If the reader
> of this message is not the intended recipient, you are hereby notified that
> any printing, copying, dissemination, distribution, disclosure or
> forwarding of this communication is strictly prohibited. If you have
> received this communication in error, please contact the sender immediately
> and delete it from your system. Thank You.
>


[jira] [Created] (HADOOP-11289) Fix typo in RpcInfo log message

2014-11-10 Thread Charles Lamb (JIRA)
Charles Lamb created HADOOP-11289:
-

 Summary: Fix typo in RpcInfo log message
 Key: HADOOP-11289
 URL: https://issues.apache.org/jira/browse/HADOOP-11289
 Project: Hadoop Common
  Issue Type: Bug
  Components: net
Affects Versions: 2.7.0
Reporter: Charles Lamb
Assignee: Charles Lamb
Priority: Trivial


From RpcUtil.java:

LOG.info("Malfromed RPC request from " + e.getRemoteAddress());

s/Malfromed/malformed/



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Re: [jira] [Resolved] (HADOOP-11259) Hadoop /Common directory is missing from all download mirrors I checked

2014-11-10 Thread Juan Pedro Danculovic
unsubscribe

On Tue Nov 04 2014 at 16:20:36, Steve Loughran (JIRA) ()
wrote:

>
>  [ https://issues.apache.org/jira/browse/HADOOP-11259?page=
> com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
>
> Steve Loughran resolved HADOOP-11259.
> -
> Resolution: Fixed
>
> > Hadoop /Common  directory is missing from all download mirrors I checked
> > 
> >
> > Key: HADOOP-11259
> > URL: https://issues.apache.org/jira/browse/HADOOP-11259
> > Project: Hadoop Common
> >  Issue Type: Bug
> >Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.5.0, 2.4.1, 2.5.1
> >Reporter: Nick Kanellos
> >Assignee: Steve Loughran
> >Priority: Blocker
> >
> > I checked several download mirrors.  They all seem to be missing the
> /common folder. The only thing I see there is .../dist/hadoop/chukwa/.
> This is a blocker since I can't download Hadoop at all.
>
>
>
> --
> This message was sent by Atlassian JIRA
> (v6.3.4#6332)
>


[jira] [Resolved] (HADOOP-11288) yarn.resourcemanager.scheduler.class wrongly set in yarn-default.xml documentation

2014-11-10 Thread Jason Lowe (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11288?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jason Lowe resolved HADOOP-11288.
-
Resolution: Invalid

The CapacityScheduler is very much supported and is actively being developed.
Its setting as the default scheduler is intentional; see YARN-137.

> yarn.resourcemanager.scheduler.class wrongly set in yarn-default.xml 
> documentation
> --
>
> Key: HADOOP-11288
> URL: https://issues.apache.org/jira/browse/HADOOP-11288
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 2.5.0
>Reporter: DeepakVohra
>
> The yarn.resourcemanager.scheduler.class property is wrongly set to 
> org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler.
>  CapacityScheduler is not even supported. Should be 
> org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (HADOOP-11288) yarn.resourcemanager.scheduler.class wrongly set in yarn-default.xml documentation

2014-11-10 Thread DeepakVohra (JIRA)
DeepakVohra created HADOOP-11288:


 Summary: yarn.resourcemanager.scheduler.class wrongly set in 
yarn-default.xml documentation
 Key: HADOOP-11288
 URL: https://issues.apache.org/jira/browse/HADOOP-11288
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.5.0
Reporter: DeepakVohra


The yarn.resourcemanager.scheduler.class property is wrongly set to 
org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler.
 CapacityScheduler is not even supported. Should be 
org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


Re: Guava

2014-11-10 Thread Steve Loughran
Yes, Guava is a constant pain; there's lots of open JIRAs related to it, as
it's the one we can't seamlessly upgrade. Not unless we do our own fork and
reinsert the missing classes.

The most common uses in the code are

@VisibleForTesting (easily replicated)
and the Preconditions.check*() operations

The latter is also easily swapped out, and we could even add the check they
forgot:
Preconditions.checkArgNotNull(argname, arg)
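
For illustration only, here is a rough sketch of how small such home-grown
replacements could be; the package name and exact signatures below are
assumptions, not an agreed design (@VisibleForTesting itself would just be a
trivial marker @interface, omitted here):

package org.apache.hadoop.util;

/** Hypothetical Hadoop-owned stand-in for Guava's Preconditions helpers. */
public final class Preconditions {

  private Preconditions() {
    // static utility class, not instantiable
  }

  /** Same contract as Guava's Preconditions.checkArgument(boolean, Object). */
  public static void checkArgument(boolean expression, Object errorMessage) {
    if (!expression) {
      throw new IllegalArgumentException(String.valueOf(errorMessage));
    }
  }

  /** The check Guava forgot: fail with the argument's name if it is null. */
  public static <T> T checkArgNotNull(String argName, T arg) {
    if (arg == null) {
      throw new IllegalArgumentException(
          "Argument '" + argName + "' must not be null");
    }
    return arg;
  }
}

A caller would then write, e.g., this.conf = Preconditions.checkArgNotNull("conf", conf);
instead of the Guava checkNotNull() variants.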


These are easy; it's the more complex data structures that matter more.

I think for Hadoop 2.7 & java 7 we need to look at this problem and do
something. Even if we continue to ship Guava 11 so that the HBase team
don't send any (more) death threats, we can/should rework Hadoop to build
and run against Guava 16+ too. That's needed to fix some of the recent java
7/8+ changes.

-Everything in v11 dropped from v16 MUST be implemented with our own
versions.
-Anything tagged as deprecated in 11+ SHOULD be replaced by newer stuff,
wherever possible.

I think for 2.7+ we should add some new profiles to the POM, for Java 8 and
9 alongside the new baseline java 7. For those later versions we could
perhaps mandate Guava 16.



On 10 November 2014 00:42, Arun C Murthy  wrote:

> … has been a constant pain w.r.t compatibility etc.
>
> Should we consider adopting a policy to not use guava in Common/HDFS/YARN?
>
> MR doesn't matter too much since it's an application-side issue, it does hurt
> end-users though since they still might want a newer guava-version, but at
> least they can modify MR.
>
> Thoughts?
>
> thanks,
> Arun
>
>
> --
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity to
> which it is addressed and may contain information that is confidential,
> privileged and exempt from disclosure under applicable law. If the reader
> of this message is not the intended recipient, you are hereby notified that
> any printing, copying, dissemination, distribution, disclosure or
> forwarding of this communication is strictly prohibited. If you have
> received this communication in error, please contact the sender immediately
> and delete it from your system. Thank You.
>

-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.