The tutorial followed is the official one; link:
http://hadoop.apache.org/docs/r2.7.3/hadoop-project-dist/hadoop-common/SingleCluster.html
·Standalone mode: verified, works fine.
·Pseudo-distributed mode:
1. After running sbin/start-dfs.sh, the following messages appeared:
localhost: mkdir: cannot create directory '/usr/local/hadoop2/logs': Permission denied
localhost: chown: cannot access '/usr/local/hadoop2/logs': No such file or directory
localhost: starting namenode, logging to /usr/local/hadoop2/logs/hadoop-zjq-namenode-zjq-virtual-machine.out
localhost: /usr/local/hadoop2/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop2/logs/hadoop-zjq-namenode-zjq-virtual-machine.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop2/logs/hadoop-zjq-namenode-zjq-virtual-machine.out' for reading: No such file or directory
localhost: /usr/local/hadoop2/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop2/logs/hadoop-zjq-namenode-zjq-virtual-machine.out: No such file or directory
localhost: /usr/local/hadoop2/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop2/logs/hadoop-zjq-namenode-zjq-virtual-machine.out: No such file or directory
localhost: mkdir: cannot create directory '/usr/local/hadoop2/logs': Permission denied
localhost: chown: cannot access '/usr/local/hadoop2/logs': No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop2/logs/hadoop-zjq-datanode-zjq-virtual-machine.out
localhost: /usr/local/hadoop2/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop2/logs/hadoop-zjq-datanode-zjq-virtual-machine.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop2/logs/hadoop-zjq-datanode-zjq-virtual-machine.out' for reading: No such file or directory
localhost: /usr/local/hadoop2/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop2/logs/hadoop-zjq-datanode-zjq-virtual-machine.out: No such file or directory
localhost: /usr/local/hadoop2/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop2/logs/hadoop-zjq-datanode-zjq-virtual-machine.out: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory '/usr/local/hadoop2/logs': Permission denied
0.0.0.0: chown: cannot access '/usr/local/hadoop2/logs': No such file or directory
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop2/logs/hadoop-zjq-secondarynamenode-zjq-virtual-machine.out
0.0.0.0: /usr/local/hadoop2/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop2/logs/hadoop-zjq-secondarynamenode-zjq-virtual-machine.out: No such file or directory
0.0.0.0: head: cannot open '/usr/local/hadoop2/logs/hadoop-zjq-secondarynamenode-zjq-virtual-machine.out' for reading: No such file or directory
0.0.0.0: /usr/local/hadoop2/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop2/logs/hadoop-zjq-secondarynamenode-zjq-virtual-machine.out: No such file or directory
0.0.0.0: /usr/local/hadoop2/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop2/logs/hadoop-zjq-secondarynamenode-zjq-virtual-machine.out: No such file or directory
My fix was to loosen the permissions on the install directory:
sudo chmod -R a+w ${HADOOP_HOME}
(Note this grants write access to every user; changing the owner with chown to the user that runs Hadoop would be a narrower fix.)
Reference: https://www.oschina.net/question/815959_76093
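The errors above are ordinary Unix permission failures, nothing Hadoop-specific: the start script tries to mkdir a logs directory inside an install directory the user cannot write to, and every later step fails because that directory never got created. A minimal sketch of the failure and the fix, using a throwaway temp directory (a hypothetical stand-in, not the real /usr/local/hadoop2):

```shell
#!/bin/sh
# Throwaway demo directory standing in for /usr/local/hadoop2.
base=$(mktemp -d)

chmod a-w "$base"                 # simulate an install dir the user cannot write to
mkdir "$base/logs" 2>&1 || true   # prints "Permission denied" (unless run as root)

chmod u+w "$base"                 # the fix: grant write permission
mkdir -p "$base/logs"             # now succeeds
[ -d "$base/logs" ] && echo "logs directory created"

rm -rf "$base"                    # clean up the demo directory
```

The same mechanism explains why a single chmod (or chown) on ${HADOOP_HOME} makes all of the namenode/datanode/secondarynamenode messages disappear at once: they all stem from the one failed mkdir.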
2. After running bin/hdfs dfs -put etc/hadoop input, the following warnings appeared:
zjq@zjq-virtual-machine:/usr/local/hadoop2$ bin/hdfs dfs -put etc/hadoop input
17/05/02 20:44:05 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1249)
at java.lang.Thread.join(Thread.java:1323)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:370)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:546)
17/05/02 20:44:05 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1249)
at java.lang.Thread.join(Thread.java:1323)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:577)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:573)
(The same "WARN hdfs.DFSClient: Caught exception / java.lang.InterruptedException" block repeats many more times with identical stack traces; the repeats are omitted here.)
Fix: change the command bin/hdfs dfs -put etc/hadoop input
to bin/hdfs dfs -put etc/hadoop /user/
After pressing Enter it reports that the files to be copied already exist; this can be ignored (and the InterruptedException warnings themselves are generally harmless noise from the DFS client; the copy completes despite them). Continue with the next command.
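For reference, the upload step can also be sketched as the following sequence. This is a sketch assuming a running pseudo-distributed cluster, the tutorial's directory layout, and that $USER matches the user running Hadoop; -mkdir -p sidesteps the "already exists" message entirely:

```shell
# Assumes HDFS is up (start-dfs.sh) and the shell is in the Hadoop install dir.
bin/hdfs dfs -mkdir -p /user/$USER   # create the per-user HDFS home; -p tolerates existing dirs
bin/hdfs dfs -put etc/hadoop input   # relative path resolves to /user/$USER/input
bin/hdfs dfs -ls input               # verify the files arrived
```

Creating /user/$USER first matters because relative HDFS paths such as "input" resolve against that per-user home directory; without it, the -put has nowhere to land.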
Summary of commands:
1. ssh localhost
2. bin/hdfs namenode -format
3. sbin/start-dfs.sh
4. sbin/start-yarn.sh
***************
Before step 4, optionally run the following two commands (depending on whether the HDFS directories already exist):
$ bin/hdfs dfs -mkdir /user
$ bin/hdfs dfs -mkdir /user/<username>
***************
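Put together, and assuming $HADOOP_HOME points at the install directory and passwordless ssh to localhost already works (step 1), the summary might look like the script below. This is a sketch, not the tutorial's exact text; note that the format step should run only once, on first setup, since it wipes HDFS metadata:

```shell
#!/bin/sh
# Hypothetical wrapper for the summary steps; assumes $HADOOP_HOME is set
# and passwordless ssh to localhost is configured.
set -e
cd "$HADOOP_HOME"

bin/hdfs namenode -format            # step 2: first setup only; destroys existing HDFS metadata
sbin/start-dfs.sh                    # step 3: NameNode, DataNode, SecondaryNameNode
bin/hdfs dfs -mkdir -p /user/$USER   # optional: per-user HDFS home directory
sbin/start-yarn.sh                   # step 4: ResourceManager, NodeManager
jps                                  # check which Java daemons are running
```

Running jps at the end is a quick sanity check: the NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager processes should all be listed.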
Appendix: Hadoop 2.7.3 file system shell command reference: http://hadoop.apache.org/docs/r2.7.3/hadoop-project-dist/hadoop-common/FileSystemShell.html