This is an automated email from the ASF dual-hosted git repository.

hexiaoqiao pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/hadoop.git


The following commit(s) were added to refs/heads/trunk by this push:
     new 809ae58e711c HADOOP-18982. Fix doc about loading native libraries. (#6281). Contributed by Shuyan Zhang.
809ae58e711c is described below

commit 809ae58e711c23ea9d6d7fc43b925c09328c52fe
Author: zhangshuyan <81411509+zhangshuy...@users.noreply.github.com>
AuthorDate: Wed Dec 6 21:24:14 2023 +0800

    HADOOP-18982. Fix doc about loading native libraries. (#6281). Contributed by Shuyan Zhang.
    
    Signed-off-by: He Xiaoqiao <hexiaoq...@apache.org>
---
 .../hadoop-common/src/site/markdown/NativeLibraries.md.vm         | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/hadoop-common-project/hadoop-common/src/site/markdown/NativeLibraries.md.vm b/hadoop-common-project/hadoop-common/src/site/markdown/NativeLibraries.md.vm
index 1e62e94394f9..9756c42340da 100644
--- a/hadoop-common-project/hadoop-common/src/site/markdown/NativeLibraries.md.vm
+++ b/hadoop-common-project/hadoop-common/src/site/markdown/NativeLibraries.md.vm
@@ -126,10 +126,10 @@ Native Shared Libraries
 
 You can load any native shared library using DistributedCache for distributing and symlinking the library files.
 
-This example shows you how to distribute a shared library, mylib.so, and load it from a MapReduce task.
+This example shows you how to distribute a shared library in Unix-like systems, mylib.so, and load it from a MapReduce task.
 
-1.  First copy the library to the HDFS: `bin/hadoop fs -copyFromLocal mylib.so.1 /libraries/mylib.so.1`
-2.  The job launching program should contain the following: `DistributedCache.createSymlink(conf);` `DistributedCache.addCacheFile("hdfs://host:port/libraries/mylib.so.1#mylib.so", conf);`
-3.  The MapReduce task can contain: `System.loadLibrary("mylib.so");`
+1.  First copy the library to the HDFS: `bin/hadoop fs -copyFromLocal libmyexample.so.1 /libraries/libmyexample.so.1`
+2.  The job launching program should contain the following: `DistributedCache.createSymlink(conf);` `DistributedCache.addCacheFile("hdfs://host:port/libraries/libmyexample.so.1#libmyexample.so", conf);`
+3.  The MapReduce task can contain: `System.loadLibrary("myexample");`
 
 Note: If you downloaded or built the native hadoop library, you don’t need to use DistributedCache to make the library available to your MapReduce tasks.
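
The substance of this patch is the last hunk line: `System.loadLibrary` takes the bare library name ("myexample"), not a file name ("mylib.so"), because the JVM maps the name to the platform-specific file itself. A minimal JDK-only sketch of that mapping (no Hadoop dependency; the class name is illustrative), using `System.mapLibraryName`:

```java
// Illustrates why the doc fix changes System.loadLibrary("mylib.so")
// to System.loadLibrary("myexample"): loadLibrary expects the bare
// library name, and the JVM adds the platform prefix/suffix itself.
// System.mapLibraryName exposes that mapping without loading anything.
public class LibNameDemo {
    public static void main(String[] args) {
        // On Linux this prints "libmyexample.so" -- the file name the
        // DistributedCache symlink must match; on macOS it would be
        // "libmyexample.dylib", on Windows "myexample.dll".
        System.out.println(System.mapLibraryName("myexample"));
    }
}
```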


---------------------------------------------------------------------
To unsubscribe, e-mail: common-commits-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-commits-h...@hadoop.apache.org
