[ https://issues.apache.org/jira/browse/HDFS-14394?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16808172#comment-16808172 ]

Todd Lipcon commented on HDFS-14394:
------------------------------------

I'm pretty certain that depending on GNU extensions to the C language does not 
imply anything about the licensing of your code. See the Apache license FAQ 
here: https://apache.org/legal/resolved.html#prohibited

bq. For example, using a GPL'ed tool during the build is OK, however including 
GPL'ed source code is not.

Note that if language/toolchain licensing were viral to projects built in that 
language, all of our use of Java would be problematic as well :)

Note also that clang supports the -std=gnu99 mode, so even turning this on 
doesn't imply use of gcc or any other GPL tool. Generally I think the gnu99 
dialect is very well supported. In fact, as of gcc 5.1 the default is 
-std=gnu11, and icc is also fine with this as of icc 17.0.0.

I don't see any reason to add -pedantic-errors in this patch either -- this is 
just fixing an issue compiling on older compilers to match the standard we are 
_already using_ on newer compilers. If we want to be stricter about our C 
standard adherence, let's do that separately.

On the subject of passing this flag to the C++ code, I agree that it won't have 
any effect, because CMAKE_C_FLAGS does not affect C++ compilation.
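As a sketch of how the flag could be scoped to C only (the variable and property names are standard CMake; the `hdfs` target name and the exact location in the Hadoop build files are assumptions, not taken from the patch):

```cmake
# Appending to CMAKE_C_FLAGS affects only C compilation;
# C++ files are driven by CMAKE_CXX_FLAGS and are unaffected.
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -std=gnu99")

# Alternatively, with CMake >= 3.1, the standard can be requested
# per target instead of via raw flags:
# set_property(TARGET hdfs PROPERTY C_STANDARD 99)
# set_property(TARGET hdfs PROPERTY C_EXTENSIONS ON)  # gnu99 rather than c99
```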

> Add -std=c99 / -std=gnu99 to libhdfs compile flags
> --------------------------------------------------
>
>                 Key: HDFS-14394
>                 URL: https://issues.apache.org/jira/browse/HDFS-14394
>             Project: Hadoop HDFS
>          Issue Type: Task
>          Components: hdfs-client, libhdfs, native
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>            Priority: Major
>         Attachments: HDFS-14394.001.patch
>
>
> libhdfs compilation currently does not enforce a minimum required C version. 
> As of today, the libhdfs build on Hadoop QA works, but when built on a 
> machine with an outdated gcc / cc version where C89 is the default, 
> compilation fails due to errors such as:
> {code}
> /build/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfs/jclasses.c:106:5:
>  error: ‘for’ loop initial declarations are only allowed in C99 mode
> for (int i = 0; i < numCachedClasses; i++) {
> ^
> /build/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfs/jclasses.c:106:5:
>  note: use option -std=c99 or -std=gnu99 to compile your code
> {code}
> We should add the -std=c99 / -std=gnu99 flags to libhdfs compilation so that 
> we can enforce C99 as the minimum required version.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
