
1. Installation
This guide uses hadoop-0.20.2 as the example release.
Install Java first; see the reference below.
Then download the Hadoop release and extract it:
tar -xzf hadoop-0.20.2.tar.gz
2. Configuration
Edit your environment variables:
vim ~/.bashrc
export HADOOP_HOME=/home/rte/hadoop-0.20.2 # set this to the directory where Hadoop was actually extracted
export PATH=$PATH:$HADOOP_HOME/bin
source ~/.bashrc
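To confirm the variables took effect, a quick sanity check (the expected version string assumes the 0.20.2 release used here):
echo $HADOOP_HOME
hadoop version   # should report Hadoop 0.20.2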
Configure hadoop-env.sh:
vim conf/hadoop-env.sh
export JAVA_HOME=/home/rte/Software/java/jdk1.6.0_27 # set this to your actual JDK install path
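If you are not sure where your JDK lives, one way to locate it on Linux (assumes readlink from GNU coreutils; strip the trailing /bin/java from the output to get JAVA_HOME):
readlink -f $(which java)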
Configure the conf/core-site.xml, conf/hdfs-site.xml, and conf/mapred-site.xml files.
File: core-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
File: mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
File: hdfs-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
</configuration>
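Before starting anything, a quick way to catch XML typos in the three files, assuming xmllint (from libxml2) is available:
xmllint --noout conf/core-site.xml conf/hdfs-site.xml conf/mapred-site.xml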
3. Starting and stopping
Format the HDFS filesystem via the NameNode:
cd hadoop-0.20.2/conf
hadoop namenode -format
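start-all.sh launches the daemons over ssh to localhost, so ssh localhost should work without a password. If it still prompts for one, a minimal sketch to set up passwordless login (assuming OpenSSH; the key type and file names are the conventional defaults):
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys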
Start Hadoop:
cd hadoop-0.20.2/bin
sh start-all.sh
Use the jps command to check that the expected Hadoop processes are running.
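For a pseudo-distributed setup like this one, jps should list the five Hadoop daemons, roughly as below (the PIDs are illustrative):
jps
# 4721 NameNode
# 4853 DataNode
# 4977 SecondaryNameNode
# 5062 JobTracker
# 5189 TaskTracker
# 5301 Jps
By default the NameNode web UI is then reachable at http://localhost:50070 and the JobTracker UI at http://localhost:50030.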
Stop Hadoop:
cd hadoop-0.20.2/bin
sh stop-all.sh
4. References