1. Install the CentOS virtual machine environment
Installation ISO download address: http://vault.centos.org/6.4/isos/x86_64/
Download the two DVD ISO images (DVD1 and DVD2); only the first one is needed during installation.
The installation process itself is omitted here; you can refer to the following document:
http://pan.baidu.com/s/1dDowIjv
My environment is:
[root@hadoop1 target]# uname -a
Linux hadoop1 2.6.32-358.el6.x86_64 #1 SMP Fri Feb 22 00:31:26 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux
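If you want to double-check the exact CentOS release as well, one extra command is enough; on this setup it should report a 6.4 release:
cat /etc/redhat-release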
2. Install the JDK (jdk-7u67-linux-x64.tar.gz)
Since Hadoop is developed in Java, the JDK is required to compile it.
The version installed here is jdk-7u67-linux-x64.tar.gz.
The JDK installation file can be downloaded from the following network drive:
http://pan.baidu.com/s/1jGqUupw
Installation steps:
Step 1: extract the archive under /usr/local: tar -zxvf jdk-7u67-linux-x64.tar.gz
Step 2: rename the directory: mv jdk1.7.0_67 jdk1.7
Step 3: edit the profile: vi /etc/profile
Add the following lines:
export JAVA_HOME=/usr/local/jdk1.7
export PATH=.:$PATH:$JAVA_HOME/bin
After saving, run: source /etc/profile
Step 4: verify with java -version
[root@hadoop1 soft]# java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)
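To make sure the profile changes actually took effect in the current shell, a quick check (assuming the paths above; if another JDK is already earlier on the PATH, which java may point elsewhere):
echo $JAVA_HOME    # should print /usr/local/jdk1.7
which java         # should resolve to /usr/local/jdk1.7/bin/java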
3. Install Maven (apache-maven-3.0.5-bin.tar.gz)
Since Hadoop 2 uses Maven to manage the project, Maven is needed to compile the source code.
Maven download address: http://pan.baidu.com/s/1bnlcZeR
Of course, you can also download it from the official site.
Installation steps:
Extract and rename (under /usr/local):
tar -zxvf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 maven
vi /etc/profile
Add the following lines:
export MAVEN_HOME=/usr/local/maven
export PATH=.:$PATH:$MAVEN_HOME/bin
source /etc/profile
Verify: mvn -version
[root@hadoop1 soft]# mvn -version
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 21:51:28+0800)
Maven home: /usr/local/maven
Java version: 1.7.0_67, vendor: Oracle Corporation
Java home: /usr/local/jdk1.7/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"
4. Install FindBugs (findbugs-3.0.0.tar.gz)
FindBugs is used when generating the documentation; if you do not need to build the docs, this step can be skipped.
Installation file download address:
http://pan.baidu.com/s/1pJr0dR1
Installation steps (again under /usr/local):
tar -zxvf findbugs-3.0.0.tar.gz
mv findbugs-3.0.0 findbugs
Configure the environment variables:
vi /etc/profile
export FINDBUGS_HOME=/usr/local/findbugs
export PATH=.:$PATH:$FINDBUGS_HOME/bin
source /etc/profile
Verify: findbugs -version
[root@hadoop1 soft]# findbugs -version
3.0.0
5. Install protoc
Since Hadoop uses Protocol Buffers for communication, protoc must be installed.
Official site: https://code.google.com/p/protobuf/downloads/list
My download address: http://pan.baidu.com/s/1zshyA
To build protoc, a few tools need to be installed first.
Prerequisite: the CentOS virtual machine must be able to access the network.
Installation steps:
yum install gcc
yum install gcc-c++
yum install make
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protoc
make
make install
As long as the steps above complete without errors, you are done.
The compiled files are installed to /usr/local/protoc.
Configure the environment variables (vi /etc/profile):
export PROTOC_HOME=/usr/local/protoc
export PATH=.:$PATH:$PROTOC_HOME/bin
source /etc/profile
Verify:
[root@hadoop1 soft]# protoc --version
libprotoc 2.5.0
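Beyond checking the version, an optional smoke test confirms that protoc can actually generate Java code, which is what the Hadoop build relies on. This is only a throwaway sketch; test.proto and the demo package are made-up names:
cd /tmp
cat > test.proto <<'EOF'
package demo;
message Ping {
  optional string msg = 1;
}
EOF
protoc --java_out=. test.proto
ls demo/    # a generated Test.java should appear here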
6.安装其它依赖包
yum install cmake7.编译源码
yum install openssl-devel
yum install ncurses-devel
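If you prefer, the compiler tool-chain from step 5 and the libraries above can be installed in a single command; this is simply the combined equivalent of the individual yum calls (network access required):
yum install -y gcc gcc-c++ make cmake openssl-devel ncurses-devel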
7. Compile the source code
Download the Hadoop 2.2 source from the official site:
http://apache.fayea.com/apache-mirror/hadoop/common/hadoop-2.2.0/
Or download it from the following network drive:
http://pan.baidu.com/s/1dD6DC2l
Build steps:
tar -zxvf hadoop-2.2.0-src.tar.gz
cd hadoop-2.2.0-src
First, fix one known bug.
Edit the following file under the hadoop-2.2.0-src directory:
/usr/local/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml
Insert the following dependency at around line 55:
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
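To double-check that the dependency really landed in the right pom.xml, a quick grep is enough (path as above):
grep -n "jetty-util" /usr/local/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml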
Then run the build:
mvn package -DskipTests -Pdist,native,docs
If FindBugs (used for the documentation) was not installed, drop docs from the profile list.
Since Maven needs to download the required jar packages from the Internet, this command takes quite a long time to run.
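Optionally, if the downloads are very slow, Maven can be pointed at a nearby repository mirror. The sketch below is only an illustration: my-mirror and the URL are placeholders you would replace, and it overwrites any existing ~/.m2/settings.xml:
mkdir -p ~/.m2
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>my-mirror</id>
      <mirrorOf>central</mirrorOf>
      <url>http://your.mirror.example/maven2</url>
    </mirror>
  </mirrors>
</settings>
EOF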
If, at the end, the console outputs information like the following:
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [2.625s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [1.623s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [3.604s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.365s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.368s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [3.922s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [13.938s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.717s]
[INFO] Apache Hadoop Common .............................. SUCCESS [5:56.983s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [12.786s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.097s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [10:03.524s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [39.430s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [13.182s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [6.215s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.179s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.392s]
[INFO] hadoop-yarn-api ................................... SUCCESS [57.293s]
[INFO] hadoop-yarn-common ................................ SUCCESS [38.550s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.608s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [15.493s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [17.368s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [3.668s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [14.096s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.432s]
[INFO] hadoop-yarn-client ................................ SUCCESS [5.831s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.140s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [3.396s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.148s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [30.974s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [4.917s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.177s]
[INFO] hadoop-yarn-project ............................... SUCCESS [5.514s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [24.587s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.728s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [12.216s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [6.015s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [4.927s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.892s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [7.238s]
[INFO] hadoop-mapreduce .................................. SUCCESS [2.417s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [6.014s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [9.235s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.558s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [7.183s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [5.011s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [3.364s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [3.617s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [3.970s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.288s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.060s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [41.536s]
[INFO] Apache Hadoop Client .............................. SUCCESS [6.460s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.586s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:42.823s
[INFO] Finished at: Fri Aug 08 07:08:01 CST 2014
[INFO] Final Memory: 71M/239M
[INFO] ------------------------------------------------------------------------
The keyword BUILD SUCCESS indicates that the compilation completed successfully.
The compiled files are located in the target directories; the assembled Hadoop 2.2 distribution itself ends up under hadoop-dist/target.
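A quick way to confirm the result, and in particular that the native libraries were built as 64-bit, is to look into hadoop-dist/target (paths assume the source tree used above):
cd /usr/local/hadoop-2.2.0-src
ls hadoop-dist/target/
# the assembled hadoop-2.2.0 directory should be listed here
file hadoop-dist/target/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
# should report a 64-bit ELF shared object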