Fixing the "Unable to load native-hadoop library" warning when YARN loads the native library
After installing the official Hadoop 2.1.0-beta release, every hadoop command prints a warning like this:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Set the logger level to see the actual cause:
export HADOOP_ROOT_LOGGER=DEBUG,console
...
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
report: Failed on local exception: java.io.IOException: Connection reset by peer; Host Details : local host is: "VM_160_34_centos/127.0.0.1"; destination host is: "Master":;
The debug output contains "wrong ELFCLASS32": could the .so being loaded have been built for the wrong word size?
Check with the file command:
file libhadoop.so.1.0.0
hadoop@VM_160_34_centos:/usr/local/hadoop-2.4.0/lib/native> file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped
Sure enough: Intel 80386, a 32-bit build, while my Hadoop environment runs on a 64-bit OS.
It turns out the pre-built Hadoop releases downloaded from the Apache mirrors all ship 32-bit native libraries. To get 64-bit support you have to recompile Hadoop yourself, which is rather painful, considering that almost every production environment runs a 64-bit OS.
A passage in the official YARN/Hadoop documentation about the native library confirms this:
“The pre-built 32-bit i386-Linux native hadoop library is available as part of the hadoop distribution and is located in the lib/native directory”
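To confirm the mismatch on your own machine, a quick check like the following should be enough (a minimal sketch; the installation path is the one used above, and hadoop checknative is available in recent 2.x releases):
uname -m                                                      # OS architecture, e.g. x86_64
file /usr/local/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0    # architecture the shipped library was built for
hadoop checknative -a                                         # lists which native components actually loaded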
Solution: recompile Hadoop
The fix is to rebuild the Hadoop distribution from source:
Set up the build environment
1. Required packages
yum install svn
yum install autoconf automake libtool cmake
yum install ncurses-devel
yum install openssl-devel
yum install gcc*
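A quick sanity check that the toolchain is actually in place before going further (nothing Hadoop-specific, just verifying what yum installed):
gcc --version
make --version
cmake --version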
2. Install Maven
Download and unpack it:
wget -c http://mirrors.hust.edu.cn/apache/maven/maven-3/3.2.3/binaries/apache-maven-3.2.3-bin.tar.gz
tar -zxvf apache-maven-3.2.3-bin.tar.gz -C /usr/local/
Add /usr/local/apache-maven-3.2.3/bin to the PATH:
root@VM_160_34_centos:~/tools> vi /etc/profile.d/maven-development.sh
export M2_HOME=/usr/local/apache-maven-3.2.3
export PATH=$PATH:$M2_HOME/bin
root@VM_160_34_centos:~/tools> source /etc/profile
Test Maven:
root@VM_160_34_centos:/usr/local/apache-maven-3.2.3> mvn -version
Apache Maven 3.2.3 (33f8c3e1027c3ddde99d3cdebad2656a31e8fdf4)
Maven home: /usr/local/apache-maven-3.2.3
Java version: 1.7.0_55, vendor: Oracle Corporation
Java home: /usr/local/java/jdk1.7.0_55/jre
Default locale: en_US, platform encoding: ANSI_X3.4-1968
OS name: "linux", version: "2.6.32-220.el6.x86_64", arch: "amd64", family: "unix"
3. Install protobuf
Without protobuf installed, the build cannot complete; the failure looks like this:
[INFO] --- hadoop-maven-plugins:2.4.0:protoc (compile-protoc) @ hadoop-common ---
[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory
[ERROR] stdout: []
......
[INFO] Apache Hadoop Main ................................ SUCCESS [.672s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [.682s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [.921s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [.676s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [.590s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [.172s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [.123s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [.170s]
[INFO] Apache Hadoop Common .............................. FAILURE [.224s]
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
Installing protobuf:
Download:
root@VM_160_34_centos:~/tools> wget -c https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
Unpack it and enter the directory:
root@VM_160_34_centos:~/tools> tar -xvzf protobuf-2.5.0.tar.gz
root@VM_160_34_centos:~/tools> cd protobuf-2.5.0
Then run the following commands in order (with the default prefix, protoc is installed under /usr/local/bin and its shared library under /usr/local/lib):
./configure
make
make check
make install
Test the installation:
root@VM_160_34_centos:~/tools/release-2.4.0> protoc --version
protoc: error while loading shared libraries: libprotobuf.so.: cannot open shared object file: No such file or directory
This fails because /usr/local/lib is not on the dynamic linker's search path. The fix is to add it to LD_LIBRARY_PATH (note the escaped $ so the variable reference is written literally into the profile script rather than being expanded by the here-document):
root@VM_160_34_centos:~/tools/release-2.4.0> cat >> /etc/profile.d/protoc-development.sh << end
> export LD_LIBRARY_PATH=\$LD_LIBRARY_PATH:/usr/local/lib
> end
root@VM_160_34_centos:~/tools/release-2.4.0> source /etc/profile
Test again:
root@VM_160_34_centos:~/tools/release-2.4.0> protoc --version
libprotoc 2.5.0
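An alternative to exporting LD_LIBRARY_PATH (a common variant, not what was used above; the file name under /etc/ld.so.conf.d/ is arbitrary) is to register /usr/local/lib with the dynamic linker cache:
echo "/usr/local/lib" > /etc/ld.so.conf.d/usr-local-lib.conf
ldconfig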
Check out the source code again:
svn checkout http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.4.0/
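Before starting the build (it takes a while), it may be worth confirming that the tools it needs are on the PATH, since a missing protoc is exactly what caused the failure shown earlier (a simple sanity check, not part of the original steps):
which mvn protoc cmake
protoc --version    # should print libprotoc 2.5.0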
Enable the native profile when building; Maven will then produce native libraries matching the architecture of the current OS:
mvn package -Pdist,native -DskipTests -Dtar
Verify the result:
root@VM_160_34_centos:~/tools/release-2.4.0> cd hadoop-dist/target/hadoop-2.4.0/lib/native
root@VM_160_34_centos:~/tools/release-2.4.0/hadoop-dist/target/hadoop-2.4.0/lib/native> file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
The target directory also now contains a freshly built hadoop-2.4.0.tar.gz, which should be usable directly from now on.
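If you would rather not redeploy the whole rebuilt tarball, one option (a sketch; /usr/local/hadoop-2.4.0 is simply the installation path used earlier in this post) is to replace only the native libraries of the existing installation:
cp -r hadoop-dist/target/hadoop-2.4.0/lib/native/* /usr/local/hadoop-2.4.0/lib/native/
file /usr/local/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0    # should now report a 64-bit ELF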
Thanks to: http://www.kankanews.com/ICkengine/archives/81648.shtml