[ https://issues.apache.org/jira/browse/HADOOP-16311?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16888027#comment-16888027 ]

AiBe Gee edited comment on HADOOP-16311 at 7/18/19 8:24 PM:
------------------------------------------------------------

[~tonyharvey]

Sorry for the late reply; I hadn't checked the e-mail address I used for my Jira 
registration until today.

I set this to Resolved because I learned it is a duplicate of:
 https://issues.apache.org/jira/browse/YARN-8498
 and not because I managed to resolve it.

Unfortunately, the patches presented in:
 https://issues.apache.org/jira/browse/YARN-8498
 do not work, so I went with Hadoop 2.9.2, which I was able to build 
successfully on a Pi 2B.
 Same for HBase (1.4.9), Hive (2.3.5) and Phoenix (4.14.2).

The latest versions of the packages above don't seem to work on ARM: the Maven 
build scripts download some x86 binaries and the builds fail. See:
 https://issues.apache.org/jira/browse/HADOOP-16309
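
If you want to check whether a fetched tool is the wrong architecture, something like this helps (just a sanity check I'd suggest, not from my original notes; the search paths are assumptions about where Maven drops binaries):
 # look for protoc binaries pulled in by the build and print their architecture
 find ~/.m2 /kit -type f -name 'protoc*' -exec file {} \; 2>/dev/null
 # "Intel 80386" or "x86-64" in the output means an x86 binary sneaked in; on the Pi you want ARM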

Off-topic, but just to help you, these are my notes for the Hadoop 2.9.2 build on 
a Raspberry Pi 2 (ARMv7) using Slackware Linux 14.2:
 - protobuf 2.5.0 is required!
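 For reference, building protobuf 2.5.0 from source usually goes like this (a sketch, not from my original notes; the download URL and install prefix are assumptions, use whatever mirror you trust):
 wget https://github.com/protocolbuffers/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz
 tar -xzpf protobuf-2.5.0.tar.gz
 cd protobuf-2.5.0/
 ./configure --prefix=/usr/local
 make -j 3
 make install
 protoc --version    # must report "libprotoc 2.5.0"; Hadoop 2.9.2 insists on exactly 2.5.0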
 wget https://github.com/apache/hadoop/archive/rel/release-2.9.2.tar.gz
tar -xzpf release-2.9.2.tar.gz
cd hadoop-rel-release-2.9.2/
 - swap - using an external HDD (here /dev/sda1; adjust to your setup)
swapoff /dev/whatever-partition-is-actually-the-swap
mkswap /dev/sda1
swapon /dev/sda1
echo 1 > /proc/sys/vm/swappiness
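 If repartitioning the HDD is not an option, a swap file works too (a variant I'd suggest, not from my original notes; /mnt/hdd is a placeholder mount point):
 dd if=/dev/zero of=/mnt/hdd/swapfile bs=1M count=2048   # 2 GiB, adjust to taste
 chmod 600 /mnt/hdd/swapfile
 mkswap /mnt/hdd/swapfile
 swapon /mnt/hdd/swapfile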
 - Environment:
export PATH="$PATH:/opt/java/bin"
export M2_HOME=/opt/apache-maven-3.6.1
export PATH="$PATH:$M2_HOME/bin"
export JAVA_HOME=/opt/java
export ARCH=arm
export CFLAGS="-march=armv7-a -mtune=cortex-a7 -mfpu=neon-vfpv4 -mvectorize-with-neon-quad -mfloat-abi=hard"
export CXXFLAGS="-march=armv7-a -mtune=cortex-a7 -mfpu=neon-vfpv4 -mvectorize-with-neon-quad -mfloat-abi=hard"
export CPPFLAGS="-march=armv7-a -mtune=cortex-a7 -mfpu=neon-vfpv4 -mvectorize-with-neon-quad -mfloat-abi=hard"
export MAKEFLAGS="-j 3"
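 Before kicking off a multi-hour build, it's worth confirming the toolchain accepts those flags (a quick check I'd add, not part of the original notes):
 echo 'int main(void){return 0;}' > /tmp/flagtest.c
 gcc $CFLAGS /tmp/flagtest.c -o /tmp/flagtest && echo "CFLAGS accepted"
 g++ $CXXFLAGS -x c++ /tmp/flagtest.c -o /tmp/flagtest-cxx && echo "CXXFLAGS accepted"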
 - need to patch pom.xml - add to pom.xml:
 <plugin>
   <groupId>org.apache.maven.plugins</groupId>
   <artifactId>maven-surefire-plugin</artifactId>
   <version>3.0.0-M3</version>
   <configuration>
     <useSystemClassLoader>false</useSystemClassLoader>
   </configuration>
 </plugin>
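
 To verify the surefire override is actually picked up, you can inspect the effective POM (a suggestion, not part of my original notes):
 mvn help:effective-pom | grep -A 5 maven-surefire-plugin
 # should show version 3.0.0-M3 and useSystemClassLoader=false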

 - this is needed too - only on ARM:
 https://issues.apache.org/jira/browse/HADOOP-9320 (patch v2.8.patch)
 cd /kit/hadoop-rel-release-2.9.2/hadoop-common-project/hadoop-common/
 wget https://patch-diff.githubusercontent.com/raw/apache/hadoop/pull/224.patch
 patch < 224.patch
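 If you are unsure the hunks still apply against 2.9.2, a dry run first avoids half-patched files (my habit, not from the original notes; drop -p1 if the paths in the patch need no stripping):
 patch -p1 --dry-run < 224.patch
 patch -p1 < 224.patch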
 - this patch too:
 https://issues.apache.org/jira/browse/HADOOP-14597
 cd /kit/hadoop-rel-release-2.9.2/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/
 (download the HADOOP-14597.04.patch attachment from the issue page - the inline link got mangled by Jira)
 patch < HADOOP-14597.04.patch
 cd /kit/hadoop-rel-release-2.9.2/hadoop-tools/hadoop-pipes/src/main/native/pipes/impl/
 (download the HADOOP-14597.04.patch attachment from the issue page)
 patch < HADOOP-14597.04.patch

Build:
 cd /kit/hadoop-rel-release-2.9.2/
 nohup mvn package -Pdist,native,docs -DskipTests -Dtar 2>&1 | tee hadoop-2-9-2-build.log
 cp /kit/hadoop-rel-release-2.9.2/hadoop-dist/target/hadoop-2.9.2.tar.gz /kit/
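
 After the build it's worth checking that the native bits really came out as ARM (a verification step I'd suggest; the exact path inside the dist tree is from memory):
 file /kit/hadoop-rel-release-2.9.2/hadoop-dist/target/hadoop-2.9.2/lib/native/libhadoop.so.1.0.0
 # expect "ELF 32-bit LSB shared object, ARM"; once the tarball is unpacked and configured,
 # "hadoop checknative -a" will list which native codecs actually load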

Hope it helps.

P.S. Edit - still profoundly horrified/disgusted by how Jira works - the 
autoformatting is worse than Redmond's Word! I have edited my post several times 
and some links to patches are still broken; sorry, I lost patience 
correcting all the automated crap.
A pity that such a great project like Hadoop uses this impossible tool ...



> Hadoop build failure - natively on ARM (armv7) - oom_listener_main.c issues
> ---------------------------------------------------------------------------
>
>                 Key: HADOOP-16311
>                 URL: https://issues.apache.org/jira/browse/HADOOP-16311
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 3.2.0
>         Environment: Linux ARM (armv7)
>            Reporter: AiBe Gee
>            Priority: Major
>
> After failing to build Hadoop 3.3.0 - latest git snapshot, described in:
> https://issues.apache.org/jira/browse/HADOOP-16309
> I went on and tried my luck with the 3.2.0 release, only to find out that I'm 
> unable to build this version too. Environment is the same as described in:
> https://issues.apache.org/jira/browse/HADOOP-16309
> Just the source archive is different:
> https://archive.apache.org/dist/hadoop/common/hadoop-3.2.0/hadoop-3.2.0-src.tar.gz
> Hadoop build failure log snippet:
> [WARNING] make[2]: Leaving directory 
> '/kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/native'
> [WARNING] [ 44%] Built target container
> [WARNING] [ 47%] Linking CXX static library libgtest.a
> [WARNING] /usr/bin/cmake -P CMakeFiles/gtest.dir/cmake_clean_target.cmake
> [WARNING] /usr/bin/cmake -E cmake_link_script CMakeFiles/gtest.dir/link.txt 
> --verbose=1
> [WARNING] /usr/bin/ar qc libgtest.a  
> CMakeFiles/gtest.dir/kit/hadoop-3.2.0-src/hadoop-common-project/hadoop-common/src/main/native/gtest/gtest-all.cc.o
> [WARNING] /usr/bin/ranlib libgtest.a
> [WARNING] make[2]: Leaving directory 
> '/kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/native'
> [WARNING] [ 47%] Built target gtest
> [WARNING] make[1]: Leaving directory 
> '/kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/native'
> [WARNING] /usr/bin/ld: 
> CMakeFiles/oom-listener.dir/main/native/oom-listener/impl/oom_listener_main.c.o:
>  in function `main':
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/impl/oom_listener_main.c:89:
>  undefined reference to `cleanup'
> [WARNING] collect2: error: ld returned 1 exit status
> [WARNING] make[2]: *** [CMakeFiles/oom-listener.dir/build.make:99: 
> target/usr/local/bin/oom-listener] Error 1
> [WARNING] make[1]: *** [CMakeFiles/Makefile2:222: 
> CMakeFiles/oom-listener.dir/all] Error 2
> [WARNING] make[1]: *** Waiting for unfinished jobs....
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/modules/gpu/gpu-module.c:
>  In function ‘handle_gpu_request’:
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/modules/gpu/gpu-module.c:196:9:
>  warning: ‘strncpy’ specified bound 128 equals destination size 
> [-Wstringop-truncation]
> [WARNING]   196 |         strncpy(container_id, optarg, MAX_CONTAINER_ID_LEN);
> [WARNING]       |         ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/modules/fpga/fpga-module.c:
>  In function ‘handle_fpga_request’:
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/modules/fpga/fpga-module.c:196:9:
>  warning: ‘strncpy’ specified bound 128 equals destination size 
> [-Wstringop-truncation]
> [WARNING]   196 |         strncpy(container_id, optarg, MAX_CONTAINER_ID_LEN);
> [WARNING]       |         ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c:
>  In function ‘normalize_mount’:
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c:1041:9:
>  warning: ‘strncpy’ output truncated before terminating nul copying as many 
> bytes from a string as its length [-Wstringop-truncation]
> [WARNING]  1041 |         strncpy(ret_ptr, real_mount, len);
> [WARNING]       |         ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c:1034:20:
>  note: length computed here
> [WARNING]  1034 |       size_t len = strlen(real_mount);
> [WARNING]       |                    ^~~~~~~~~~~~~~~~~~
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c:
>  In function ‘check_trusted_image’:
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c:128:9:
>  warning: ‘strncpy’ output truncated before terminating nul copying as many 
> bytes from a string as its length [-Wstringop-truncation]
> [WARNING]   128 |         strncpy(registry_ptr, privileged_registry[i], len);
> [WARNING]       |         ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/utils/docker-util.c:125:17:
>  note: length computed here
> [WARNING]   125 |       int len = strlen(privileged_registry[i]);
> [WARNING]       |                 ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/container-executor.c:
>  In function ‘mount_cgroup’:
> [WARNING] 
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/container-executor.c:2445:19:
>  warning: ‘stpncpy’ output truncated before terminating nul copying as many 
> bytes from a string as its length [-Wstringop-truncation]
> [WARNING]  2445 |       char *buf = stpncpy(hier_path, mount_path, 
> strlen(mount_path));
> [WARNING]       |                   
> ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> [WARNING] make: *** [Makefile:84: all] Error 2
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Reactor Summary for Apache Hadoop Main 3.2.0:
> [INFO]
> [INFO] Apache Hadoop Main ................................. SUCCESS [  9.998 
> s]
> ...
> [INFO] Apache Hadoop YARN Server Common ................... SUCCESS [03:56 
> min]
> [INFO] Apache Hadoop YARN NodeManager ..................... FAILURE [04:08 
> min]
> [INFO] Apache Hadoop YARN Web Proxy ....................... SKIPPED
> ...
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Total time:  03:03 h
> [INFO] Finished at: 2019-05-13T05:06:13+03:00
> [INFO] 
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal 
> org.apache.hadoop:hadoop-maven-plugins:3.2.0:cmake-compile (cmake-compile) on 
> project hadoop-yarn-server-nodemanager: make failed with error code 2 -> 
> [Help 1]
> I did some investigation and went for a manual compilation of the affected 
> source tree:
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/native#
>  make -j 4 V=1
> Scanning dependencies of target oom-listener
> [  5%] Built target gtest
> [ 38%] Built target container
> [ 41%] Building C object 
> CMakeFiles/oom-listener.dir/main/native/oom-listener/impl/oom_listener_main.c.o
> [ 44%] Building CXX object 
> CMakeFiles/test-oom-listener.dir/main/native/oom-listener/test/oom_listener_test_main.cc.o
> [ 50%] Built target container-executor
> [ 52%] Building CXX object 
> CMakeFiles/cetest.dir/main/native/container-executor/test/utils/test_docker_util.cc.o
> [ 58%] Built target test-container-executor
> [ 61%] Linking C executable target/usr/local/bin/oom-listener
> /usr/bin/ld: 
> CMakeFiles/oom-listener.dir/main/native/oom-listener/impl/oom_listener_main.c.o:
>  in function `main':
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/impl/oom_listener_main.c:89:
>  undefined reference to `cleanup'
> collect2: error: ld returned 1 exit status
> make[2]: *** [CMakeFiles/oom-listener.dir/build.make:99: 
> target/usr/local/bin/oom-listener] Error 1
> make[1]: *** [CMakeFiles/Makefile2:222: CMakeFiles/oom-listener.dir/all] 
> Error 2
> make[1]: *** Waiting for unfinished jobs....
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:
>  In member function ‘virtual void OOMListenerTest_test_oom_Test::TestBody()’:
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:256:7:
>  error: ‘__WAIT_STATUS’ was not declared in this scope; did you mean 
> ‘ADJ_STATUS’?
>   256 |       __WAIT_STATUS mem_hog_status = {};
>       |       ^~~~~~~~~~~~~
>       |       ADJ_STATUS
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:257:30:
>  error: ‘mem_hog_status’ was not declared in this scope
>   257 |       __pid_t exited0 = wait(mem_hog_status);
>       |                              ^~~~~~~~~~~~~~
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:275:20:
>  error: expected ‘;’ before ‘oom_listener_status’
>   275 |       __WAIT_STATUS oom_listener_status = {};
>       |                    ^~~~~~~~~~~~~~~~~~~~
>       |                    ;
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:276:30:
>  error: ‘oom_listener_status’ was not declared in this scope; did you mean 
> ‘oom_listener’?
>   276 |       __pid_t exited1 = wait(oom_listener_status);
>       |                              ^~~~~~~~~~~~~~~~~~~
>       |                              oom_listener
> make[2]: *** [CMakeFiles/test-oom-listener.dir/build.make:76: 
> CMakeFiles/test-oom-listener.dir/main/native/oom-listener/test/oom_listener_test_main.cc.o]
>  Error 1
> make[1]: *** [CMakeFiles/Makefile2:73: CMakeFiles/test-oom-listener.dir/all] 
> Error 2
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/test/utils/test_docker_util.cc:
>  In member function ‘char* ContainerExecutor::TestDockerUtil::flatten(args*)’:
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/test/utils/test_docker_util.cc:107:20
>   warning: ‘char* strncpy(char*, const char*, size_t)’ output truncated 
> before terminating nul copying 1 byte from a string of the same length 
> [-Wstringop-truncation]
>   107 |             strncpy(buffer + current_len, " ", 1);
>       |             ~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> make[2]: *** wait: No child processes.  Stop.
> make[2]: *** Waiting for unfinished jobs....
> Then, at line 89 in
> hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/impl/oom_listener_main.c
> I commented out the call:
> /*  cleanup(&descriptors); */
> I tried a manual compilation again and ran into more errors:
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/native#
>  make -j 4 V=1
> Scanning dependencies of target oom-listener
> [  5%] Built target gtest
> [  8%] Building C object 
> CMakeFiles/oom-listener.dir/main/native/oom-listener/impl/oom_listener_main.c.o
> [ 41%] Built target container
> [ 44%] Building CXX object 
> CMakeFiles/test-oom-listener.dir/main/native/oom-listener/test/oom_listener_test_main.cc.o
> [ 50%] Built target container-executor
> [ 52%] Building CXX object 
> CMakeFiles/cetest.dir/main/native/container-executor/test/utils/test_docker_util.cc.o
> [ 58%] Built target test-container-executor
> [ 61%] Linking C executable target/usr/local/bin/oom-listener
> [ 63%] Built target oom-listener
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:
>  In member function ‘virtual void OOMListenerTest_test_oom_Test::TestBody()’:
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:256:7:
>  error: ‘__WAIT_STATUS’ was not declared in this scope; did you mean 
> ‘ADJ_STATUS’?
>   256 |       __WAIT_STATUS mem_hog_status = {};
>       |       ^~~~~~~~~~~~~
>       |       ADJ_STATUS
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:257:30:
>  error: ‘mem_hog_status’ was not declared in this scope
>   257 |       __pid_t exited0 = wait(mem_hog_status);
>       |                              ^~~~~~~~~~~~~~
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:275:20:
>  error: expected ‘;’ before ‘oom_listener_status’
>   275 |       __WAIT_STATUS oom_listener_status = {};
>       |                    ^~~~~~~~~~~~~~~~~~~~
>       |                    ;
> /kit/hadoop-3.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/oom-listener/test/oom_listener_test_main.cc:276:30:
>  error: ‘oom_listener_status’ was not declared in this scope; did you mean 
> ‘oom_listener’?
>   276 |       __pid_t exited1 = wait(oom_listener_status);
>       |                              ^~~~~~~~~~~~~~~~~~~
>       |                              oom_listener
> make[2]: *** [CMakeFiles/test-oom-listener.dir/build.make:76: 
> CMakeFiles/test-oom-listener.dir/main/native/oom-listener/test/oom_listener_test_main.cc.o]
>  Error 1
> make[1]: *** [CMakeFiles/Makefile2:73: CMakeFiles/test-oom-listener.dir/all] 
> Error 2
> make[1]: *** Waiting for unfinished jobs....


