294
votes

I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following error:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I'm running Hadoop 2.2.0.

Doing a search online brought up this link: http://balanceandbreath.blogspot.ca/2013/01/utilnativecodeloader-unable-to-load.html

However, the contents of the /native/ directory in Hadoop 2.x appear to be different, so I am not sure what to do.

I've also added these two environment variables in hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"

export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"

Any ideas?

23
For searchability: this problem also applies at least to Hadoop 2.4.0 and Hadoop 2.4.1, and probably other versions. – Greg Dubicki
Documentation for how to use native libraries is at hadoop.apache.org/docs/current/hadoop-project-dist/… – James Moore

23 Answers

246
votes

I assume you're running Hadoop on 64-bit CentOS. The reason you see that warning is that the native Hadoop library $HADOOP_HOME/lib/native/libhadoop.so.1.0.0 was actually compiled for 32-bit.

Anyway, it's just a warning, and it won't impact Hadoop's functionality.

If you do want to eliminate this warning, download the Hadoop source code and recompile libhadoop.so.1.0.0 on a 64-bit system, then replace the 32-bit one.

Steps on how to recompile the source code on Ubuntu are included here:
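
In case that link goes stale, here is a minimal sketch of the usual rebuild, assuming a 64-bit host with a JDK, Maven, CMake, protobuf 2.5, and zlib/openssl development headers already installed (the version and paths are illustrative):

wget https://archive.apache.org/dist/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
tar xf hadoop-2.2.0-src.tar.gz
cd hadoop-2.2.0-src
# -Pdist,native compiles libhadoop.so for the current (64-bit) platform
mvn package -Pdist,native -DskipTests -Dtar
# replace the bundled 32-bit libraries with the freshly built ones
cp hadoop-dist/target/hadoop-2.2.0/lib/native/* $HADOOP_HOME/lib/native/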

Good luck.

164
votes

Just append the word native to your HADOOP_OPTS like this:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"

PS: Thanks to Searene
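
To confirm the native library actually loads afterwards, recent Hadoop 2.x releases ship a checknative tool; this is just a sanity check, not a required step:

hadoop checknative -a
# lists each native component (hadoop, zlib, snappy, ...) and whether it loaded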

57
votes

The answer depends... I just installed Hadoop 2.6 from the tarball on 64-bit CentOS 6.6. The Hadoop install did indeed come with a prebuilt 64-bit native library. For my install, it is here:

/opt/hadoop/lib/native/libhadoop.so.1.0.0

And I know it is 64-bit:

[hadoop@VMWHADTEST01 native]$ ldd libhadoop.so.1.0.0
./libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
linux-vdso.so.1 =>  (0x00007fff43510000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f9be553a000)
libc.so.6 => /lib64/libc.so.6 (0x00007f9be51a5000)
/lib64/ld-linux-x86-64.so.2 (0x00007f9be5966000)

Unfortunately, I stupidly overlooked the answer right there staring me in the face, as I was focused on "Is this library 32 or 64 bit?":

`GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)

So, lesson learned. Anyway, the rest at least led me to being able to suppress the warning. I continued and did everything recommended in the other answers to provide the library path via the HADOOP_OPTS environment variable, to no avail. So I looked at the source code. The module that generates the error gives you the hint (util.NativeCodeLoader):

15/06/18 18:59:23 WARN util.NativeCodeLoader: Unable to load native-hadoop    library for your platform... using builtin-java classes where applicable

So, off to here to see what it does:

http://grepcode.com/file/repo1.maven.org/maven2/com.ning/metrics.action/0.2.6/org/apache/hadoop/util/NativeCodeLoader.java/

Ah, there is some debug-level logging - let's turn that on and see if we get some additional help. This is done by adding the following line to the $HADOOP_CONF_DIR/log4j.properties file:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG

Then I ran a command that generates the original warning, like stop-dfs.sh, and got this goodie:

15/06/18 19:05:19 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /opt/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

And the answer is revealed in this snippet of the debug message (the same thing that the previous ldd command 'tried' to tell me):

`GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

What version of GLIBC do I have? Here's a simple trick to find out:

[hadoop@VMWHADTEST01 hadoop]$ ldd --version
ldd (GNU libc) 2.12
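
If you want to see every GLIBC version the library requires, not just the one the error names, here is a quick check (assuming binutils is installed):

objdump -T /opt/hadoop/lib/native/libhadoop.so.1.0.0 | grep -o 'GLIBC_[0-9.]*' | sort -uV
# the highest version listed must not exceed your system glibc (2.12 here)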

So, I can't update my OS's glibc to 2.14. The only solutions are to build the native libraries from source on my OS, or to suppress the warning and just ignore it for now. I opted to just suppress the annoying warning for now (but do plan to build from source in the future), by using the same logging option we used to get the debug message, except now at ERROR level.

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR

I hope this helps others see that a big benefit of open source software is that you can figure this stuff out if you take some simple logical steps.

30
votes

I had the same issue. It's solved by adding the following lines to .bashrc:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
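
Then reload your shell configuration so the variables take effect:

source ~/.bashrc
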
24
votes

In my case, after I built Hadoop on my 64-bit Linux Mint OS, I replaced the native library in hadoop/lib. The problem still persisted. Then I figured out that Hadoop was pointing to hadoop/lib, not to hadoop/lib/native. So I just moved all the content from the native library to its parent, and the warning was simply gone.
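
A minimal sketch of that move, assuming a standard $HADOOP_HOME layout:

cd $HADOOP_HOME/lib/native && sudo mv * ../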

17
votes

This would also work:

export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native
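
To make that setting persist across sessions, you could append it to your shell profile (the path is this answer's; adjust it to your install):

echo 'export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native' >> ~/.bashrc
source ~/.bashrc
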
14
votes

After continuous research, as suggested by Koti, I resolved the issue.

hduser@ubuntu:~$ cd /usr/local/hadoop

hduser@ubuntu:/usr/local/hadoop$ ls

bin  include  libexec      logs        README.txt  share
etc  lib      LICENSE.txt  NOTICE.txt  sbin

hduser@ubuntu:/usr/local/hadoop$ cd lib

hduser@ubuntu:/usr/local/hadoop/lib$ ls
native

hduser@ubuntu:/usr/local/hadoop/lib$ cd native/

hduser@ubuntu:/usr/local/hadoop/lib/native$ ls

libhadoop.a       libhadoop.so        libhadooputils.a  libhdfs.so
libhadooppipes.a  libhadoop.so.1.0.0  libhdfs.a         libhdfs.so.0.0.0

hduser@ubuntu:/usr/local/hadoop/lib/native$ sudo mv * ../

Cheers

14
votes

export JAVA_HOME=/home/hadoop/software/java/jdk1.7.0_80
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"
13
votes

For those on OS X with Hadoop installed via Homebrew, follow these steps, replacing the path and Hadoop version where appropriate:

wget http://www.eu.apache.org/dist/hadoop/common/hadoop-2.7.1/hadoop-2.7.1-src.tar.gz
tar xvf hadoop-2.7.1-src.tar.gz
cd hadoop-2.7.1-src
mvn package -Pdist,native -DskipTests -Dtar
# the native build output lands under hadoop-dist/target
mv hadoop-dist/target/hadoop-2.7.1/lib /usr/local/Cellar/hadoop/2.7.1/

then update hadoop-env.sh with

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc= -Djava.library.path=/usr/local/Cellar/hadoop/2.7.1/lib/native"
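
Then restart HDFS so the change takes effect (assuming the standard scripts are on your PATH):

stop-dfs.sh && start-dfs.sh
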
9
votes

@zhutoulala -- FWIW your links worked for me with Hadoop 2.4.0, with one exception: I had to tell Maven not to build the javadocs. I also used the patch in the first link for 2.4.0 and it worked fine. Here's the Maven command I had to issue:

mvn package -Dmaven.javadoc.skip=true -Pdist,native -DskipTests -Dtar

After building this and moving the libraries, don't forget to update hadoop-env.sh :)

Thought this might help someone who ran into the same roadblocks as me.

6
votes

Move your compiled native library files to the $HADOOP_HOME/lib folder.

Then set your environment variables by editing the .bashrc file:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib  
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"

Make sure your compiled native library files are in the $HADOOP_HOME/lib folder.

It should work.

3
votes

export HADOOP_HOME=/home/hadoop/hadoop-2.4.1  
export PATH=$HADOOP_HOME/bin:$PATH  
export HADOOP_PREFIX=$HADOOP_HOME  
export HADOOP_COMMON_HOME=$HADOOP_PREFIX  
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_PREFIX/lib/native  
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop  
export HADOOP_HDFS_HOME=$HADOOP_PREFIX  
export HADOOP_MAPRED_HOME=$HADOOP_PREFIX  
export HADOOP_YARN_HOME=$HADOOP_PREFIX  
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
3
votes

This line right here:

export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH

from KunBetter's answer worked for me. Just append it to your .bashrc file and reload its contents:

$ source ~/.bashrc
2
votes

This line right here:

export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH

from KunBetter's answer is where the money is.

2
votes

I had the same problem with JDK 6. I changed the JDK to JDK 8 and the problem was solved. Try using JDK 8!
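
For example, point JAVA_HOME at a JDK 8 install in hadoop-env.sh (the path below is illustrative; adjust it to wherever your JDK 8 lives):

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64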

2
votes

In addition to @zhutoulala's accepted answer, here is an update to make it work with the latest stable version to date (2.8) on ARMHF platforms (Raspberry Pi 3 Model B). First, I can confirm that you must recompile the native libraries for ARM; the other answers here, based on setting environment variables, won't work. As indicated in the Hadoop documentation, the pre-built native libraries are 32-bit.

The high-level steps given in the first link (http://www.ercoppa.org/posts/how-to-compile-apache-hadoop-on-ubuntu-linux.html) are correct. At http://www.instructables.com/id/Native-Hadoop-260-Build-on-Pi/ you will find more details specific to the Raspberry Pi, but not for Hadoop version 2.8.

Here are my instructions for Hadoop 2.8:

  • There is still no protobuf package on the latest Raspbian, so you must compile it yourself, and the version must be exactly protobuf 2.5 (https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz).
  • The CMake file patching method must be changed, and the files to patch are not the same. Unfortunately, there is no accepted patch on JIRA specific to 2.8. At this URL (https://issues.apache.org/jira/browse/HADOOP-9320) you must copy and paste Andreas Muttscheller's proposed patch on your namenode:

    :hadoop-2.8.0-src/hadoop-common-project/hadoop-common $ touch HADOOP-9320-v2.8.patch
    :hadoop-2.8.0-src/hadoop-common-project/hadoop-common $ vim HADOOP-9320-v2.8.patch
    #copy and paste proposed patch given here : https://issues.apache.org/jira/browse/HADOOP-9320?focusedCommentId=16018862&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-16018862
    :hadoop-2.8.0-src/hadoop-common-project/hadoop-common $ patch < HADOOP-9320-v2.8.patch
    patching file HadoopCommon.cmake
    patching file HadoopJNI.cmake
    :hadoop-2.8.0-src/hadoop-common-project/hadoop-common $ cd ../..
    :hadoop-2.8.0-src $ sudo mvn package -Pdist,native -DskipTests -Dtar
    

Once the build is successful:

    :hadoop-2.8.0-src/hadoop-dist/target/hadoop-2.8.0/lib/native $ tar -cvf nativelibs.tar *

Then replace the content of the lib/native directory of your Hadoop install with the content of this archive. The warning message when running Hadoop should disappear.
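
A minimal sketch of that last step, assuming nativelibs.tar has been copied onto the target machine (the tar path is illustrative):

cd $HADOOP_HOME/lib/native
tar -xvf /path/to/nativelibs.tar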

1
votes

I'm not using CentOS. Here is what I have in Ubuntu 16.04.2: hadoop-2.7.3, jdk1.8.0_121. start-dfs.sh and stop-dfs.sh run successfully without error:

# JAVA env
#
export JAVA_HOME=/j01/sys/jdk
export JRE_HOME=/j01/sys/jdk/jre

export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:${PATH}:.

# HADOOP env
#
export HADOOP_HOME=/j01/srv/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

Replace /j01/sys/jdk and /j01/srv/hadoop with your installation paths.

I also did the following one-time setup on Ubuntu, which eliminates the need to enter a password multiple times when running start-dfs.sh:

sudo apt install openssh-server openssh-client
ssh-keygen -t rsa
ssh-copy-id user@localhost

Replace user with your username.

1
votes

Basically, it is not an error; it's a warning in the Hadoop cluster. Here we just update the environment variables:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib"
export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native"

0
votes

Verified remedy from earlier postings:

1) Checked that the libhadoop.so.1.0.0 shipped with the Hadoop distribution was compiled for my machine architecture, which is x86_64:

[nova]:file /opt/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0
/opt/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=3a80422c78d708c9a1666c1a8edd23676ed77dbb, not stripped

2) Added -Djava.library.path=<path> to HADOOP_OPTS in hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.library.path=/opt/hadoop-2.6.0/lib/native"

This indeed made the annoying warning disappear.

0
votes

First: you can check your glibc version. CentOS traditionally provides conservative (older) software versions, which means packages such as glibc and protobuf are old.

ldd --version
ldd /opt/hadoop/lib/native/libhadoop.so.1.0.0

Compare the version of your current glibc with the glibc version the library needs.

Second: if the version of your current glibc is old, you can update it. (Download Glibc)

If the version of your current glibc is right, you can append the word native to your HADOOP_OPTS:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
0
votes

The native Hadoop library is supported on *nix platforms only. The library does not work with Cygwin or the Mac OS X platform.

Refs: https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html

If you are using Windows or Mac OS X, you need to change your platform to *nix.

0
votes

This answer is a mix of @chromeeagle's analysis and this link (Nan-Xiao).

For those for whom the other solutions simply won't work, please follow these steps:

  1. Edit the file $HADOOP_HOME/etc/hadoop/log4j.properties (credits to @chromeeagle). Add the line at the end:

    log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG

  2. Launch your Spark/PySpark shell. You will see additional log information about the native library not loading. In my case I had the following error:

    Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path

  3. To fix this specific problem, add the Hadoop native library path to the LD_LIBRARY_PATH environment variable in your user's profile:

    export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH"

Hope this helps. I had this issue in a couple of Hadoop installations; it worked on both.

-1
votes

For installing Hadoop, it is so much easier to install the free version from Cloudera. It comes with a nice GUI that makes it simple to add nodes; there is no compiling or stuffing around with dependencies, and it comes with things like Hive, Pig, etc.

http://www.cloudera.com/content/support/en/downloads.html

The steps are:

1) Download
2) Run it
3) Go to the web GUI (1.2.3.4:7180)
4) Add extra nodes in the web GUI (do NOT install the Cloudera software on other nodes, it does it all for you)
5) Within the web GUI go to Home, click Hue and Hue Web UI. This gives you access to Hive, Pig, Sqoop, etc.