When I try to start dfs using:
start-dfs.sh
I get an error saying:
14/07/03 11:03:21 WARN util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes
where applicable Starting namenodes on [OpenJDK 64-Bit Server VM
warning: You have loaded library
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have
disabled stack guard. The VM will try to fix the stack guard now. It's
highly recommended that you fix the library with 'execstack -c
<libfile>', or link it with '-z noexecstack'. localhost] sed: -e
expression #1, char 6: unknown option to `s' Server: ssh: Could not
resolve hostname Server: Name or service not known
-c: Unknown cipher type 'cd' stack: ssh: Could not resolve hostname stack: Name or service not known 64-Bit: ssh: Could not resolve
hostname 64-Bit: Name or service not known guard.: ssh: Could not
resolve hostname guard.: Name or service not known The: ssh: Could not
resolve hostname The: Name or service not known guard: ssh: Could not
resolve hostname guard: Name or service not known might: ssh: Could
not resolve hostname might: Name or service not known stack: ssh:
Could not resolve hostname stack: Name or service not known will: ssh:
Could not resolve hostname will: Name or service not known the: ssh:
Could not resolve hostname the: Name or service not known fix: ssh:
Could not resolve hostname fix: Name or service not known VM: ssh:
Could not resolve hostname VM: Name or service not known You: ssh:
Could not resolve hostname You: Name or service not known which: ssh:
Could not resolve hostname which: Name or service not known It's: ssh:
Could not resolve hostname It's: Name or service not known disabled:
ssh: Could not resolve hostname disabled: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
localhost: namenode running as process 4463. Stop it first. library:
ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service
not known VM: ssh: Could not resolve hostname VM: Name or service not
known now.: ssh: Could not resolve hostname now.: Name or service not
known loaded: ssh: Could not resolve hostname loaded: Name or service
not known library: ssh: Could not resolve hostname library: Name or
service not known <libfile>',: ssh: Could not resolve hostname
<libfile>',: Name or service not known to: ssh: connect to host to
port 22: Connection refused OpenJDK: ssh: Could not resolve hostname
OpenJDK: Name or service not known have: ssh: Could not resolve
hostname have: Name or service not known have: ssh: Could not resolve
hostname have: Name or service not known with: ssh: Could not resolve
hostname with: Name or service not known fix: ssh: Could not resolve
hostname fix: Name or service not known noexecstack'.: ssh: Could not
resolve hostname noexecstack'.: Name or service not known that: ssh:
Could not resolve hostname that: Name or service not known you: ssh:
Could not resolve hostname you: Name or service not known or: ssh:
Could not resolve hostname or: Name or service not known highly: ssh:
Could not resolve hostname highly: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or
service not known 'execstack: ssh: Could not resolve hostname
'execstack: Name or service not known link: ssh: Could not resolve
hostname link: Name or service not known it: ssh: Could not resolve
hostname it: Name or service not known '-z: ssh: Could not resolve
hostname '-z: Name or service not known localhost: datanode running as
process 4561. Stop it first. Starting secondary namenodes [OpenJDK
64-Bit Server VM warning: You have loaded library
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have
disabled stack guard. The VM will try to fix the stack guard now. It's
highly recommended that you fix the library with 'execstack -c
<libfile>', or link it with '-z noexecstack'.
0.0.0.0] sed: -e expression #1, char 6: unknown option to `s' OpenJDK: ssh: Could not resolve hostname OpenJDK: Name or service not known
-c: Unknown cipher type 'cd' VM: ssh: Could not resolve hostname VM: Name or service not known The authenticity of host '0.0.0.0 (0.0.0.0)'
can't be established. ECDSA key fingerprint is
dd:64:53:7e:c0:62:40:c0:63:2b:5c:6d:1e:b6:cd:23. Are you sure you want
to continue connecting (yes/no)? might: ssh: Could not resolve
hostname might: Name or service not known Server: ssh: Could not
resolve hostname Server: Name or service not known guard.: ssh: Could
not resolve hostname guard.: Name or service not known have: ssh:
Could not resolve hostname have: Name or service not known You: ssh:
Could not resolve hostname You: Name or service not known The: ssh:
Could not resolve hostname The: Name or service not known which: ssh:
Could not resolve hostname which: Name or service not known have: ssh:
Could not resolve hostname have: Name or service not known disabled:
ssh: Could not resolve hostname disabled: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service
not known will: ssh: Could not resolve hostname will: Name or service
not known the: ssh: Could not resolve hostname the: Name or service
not known library: ssh: Could not resolve hostname library: Name or
service not known that: ssh: Could not resolve hostname that: Name or
service not known highly: ssh: Could not resolve hostname highly: Name
or service not known 'execstack: ssh: Could not resolve hostname
'execstack: Name or service not known try: ssh: Could not resolve
hostname try: Name or service not known guard: ssh: Could not resolve
hostname guard: Name or service not known 64-Bit: ssh: Could not
resolve hostname 64-Bit: Name or service not known loaded: ssh: Could
not resolve hostname loaded: Name or service not known library: ssh:
Could not resolve hostname library: Name or service not known fix:
ssh: Could not resolve hostname fix: Name or service not known to:
ssh: connect to host to port 22: Connection refused link: ssh: Could
not resolve hostname link: Name or service not known stack: ssh: Could
not resolve hostname stack: Name or service not known '-z: ssh: Could
not resolve hostname '-z: Name or service not known you: ssh: Could
not resolve hostname you: Name or service not known with: ssh: Could
not resolve hostname with: Name or service not known with: ssh: Could
not resolve hostname with: Name or service not known recommended: ssh:
Could not resolve hostname recommended: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not
known now.: ssh: Could not resolve hostname now.: Name or service not
known <libfile>',: ssh: Could not resolve hostname <libfile>',: Name
or service not known or: ssh: Could not resolve hostname or: Name or
service not known noexecstack'.: ssh: Could not resolve hostname
noexecstack'.: Name or service not known it: ssh: Could not resolve
hostname it: Name or service not known ^C0.0.0.0: Host key
verification failed. ^C
My core-site.xml file contains this:
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
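(Side note: on Hadoop 2.x this key still works but is deprecated in favor of fs.defaultFS. An equivalent configuration under the current name, assuming the same host and port as above, would be:)
<configuration>
    <property>
        <!-- fs.defaultFS supersedes the deprecated fs.default.name -->
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>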
My .profile (replacement for .bashrc) contains these lines:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
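(A quick way to confirm these exports are actually live in the shell that runs start-dfs.sh — a hypothetical sanity check, assuming the paths above:)
source ~/.profile
echo "$HADOOP_INSTALL"        # should print /usr/local/hadoop
command -v start-dfs.sh       # should resolve to $HADOOP_INSTALL/sbin/start-dfs.sh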
And I can ssh to localhost without any problem:
ssh localhost
Welcome to Linux Mint 16 Petra (GNU/Linux 3.11.0-12-generic x86_64)

Welcome to Linux Mint
 * Documentation:  http://www.linuxmint.com

Last login: Wed Jul 2 16:51:15 2014 from localhost
3 Answers
#1
6
Stop the JVM from printing the stack guard warning to stdout/stderr, because that output is what breaks the HDFS startup script: the script reads the warning's words back as if they were hostnames, which is exactly the flood of "ssh: Could not resolve hostname" errors above.
Do this by replacing the following line in your etc/hadoop/hadoop-env.sh:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"
with:
export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"
(This solution was found on Sumit Chawla's blog.)
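If you'd rather script the edit than open the file, here is a sketch (assuming /usr/local/hadoop as the install path from the question and the stock HADOOP_OPTS line; run it only once, since a second run would insert the flag twice):
cd /usr/local/hadoop/etc/hadoop
# GNU sed: edit in place, keeping a .bak backup of the original file
sed -i.bak 's/-Djava.net.preferIPv4Stack=true/-XX:-PrintWarnings -Djava.net.preferIPv4Stack=true/' hadoop-env.sh
grep PrintWarnings hadoop-env.sh    # verify the flag is now present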
#2
1
Edit your .bashrc file and add the following lines:
export HADOOP_HOME=path_to_your_hadoop_folder
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
And although your ssh should already be working based on what you've said, set it up again just in case:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
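Afterward you can confirm that passwordless login really works without a prompt (a quick check, not part of the original answer):
chmod 600 ~/.ssh/authorized_keys              # sshd rejects keys with loose permissions
ssh -o BatchMode=yes localhost 'echo ssh OK'  # fails instead of prompting if the key is not accepted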
#3
1
It seems you haven't added the $HADOOP_INSTALL line to your .profile pointing at your main Hadoop folder. As Balduz suggests, HADOOP_HOME will work in place of the $HADOOP_INSTALL variable. I would go with his suggestion, but you can also fix it by adding...
export HADOOP_INSTALL=/path/to/hadoop/
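After adding the line, reload the file so your current shell picks it up (hypothetical session; substitute your real path):
source ~/.profile
echo "$HADOOP_INSTALL"    # should print your Hadoop folder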