Hadoop startup error: master: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).

Error message and screenshot

[ec2-user@master hadoop]$ sbin/start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as ec2-user in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [master]
master: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting datanodes
Starting secondary namenodes [master]
master: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting resourcemanager
Starting nodemanagers
[ec2-user@master hadoop]$ jps
3576 Jps
3227 ResourceManager


Solution:

The error message makes it clear that this is related to the SSH public-key setup. I puzzled over it for a long time and tried the fixes posted by other bloggers, none of which worked. Then, rereading my old SSH configuration notes, it suddenly clicked and I solved it on my own, which was quite satisfying.

1. First, rule out problems in the SSH daemon configuration

In the file /etc/ssh/sshd_config, make sure the following are set:

    PasswordAuthentication yes   # allow password login
    PubkeyAuthentication yes     # allow public-key login
    PermitRootLogin yes          # allow root login
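
To confirm these settings are actually in effect, you can dump the resolved configuration straight from sshd. This is a quick sketch; it assumes a typical Linux setup where `sshd -T` is available and sshd is managed by systemd:

```shell
# Print the settings sshd is actually running with (run on each node).
sudo sshd -T | grep -Ei 'passwordauthentication|pubkeyauthentication|permitrootlogin'

# After editing /etc/ssh/sshd_config, restart sshd so the changes take effect.
sudo systemctl restart sshd
```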

2. Next, rule out file permission problems

On every node, set the permissions as follows:

    chmod 700 ~/.ssh
    chmod 600 ~/.ssh/authorized_keys
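
As a sketch, the permission fix plus a quick verification looks like this (run as the Hadoop user on every node; `stat -c '%a'` prints the octal mode on Linux):

```shell
# sshd's StrictModes check silently rejects keys when these files are
# group/world readable, which also produces "Permission denied (publickey,...)".
chmod 700 "$HOME/.ssh"
chmod 600 "$HOME/.ssh/authorized_keys"

# Verify: should print 700 and 600.
stat -c '%a' "$HOME/.ssh" "$HOME/.ssh/authorized_keys"
```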

The first two steps came from other bloggers and did not fix it for me; the last one is what I worked out myself.

3. Make sure the public key has been copied to every node

    ssh-copy-id master
    ssh-copy-id slave1
    ssh-copy-id slave2

In my case, I had copied the key to the other two nodes but not to the master node itself. start-all.sh launches the NameNode and SecondaryNameNode over SSH even on the local machine, so master needs passwordless SSH to itself too.
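
Putting the whole step together as a sketch (the hostnames `master`, `slave1`, and `slave2` are assumed to resolve via /etc/hosts; adjust them to your cluster):

```shell
# Generate a key pair once if one does not exist yet.
[ -f "$HOME/.ssh/id_rsa" ] || ssh-keygen -t rsa -N '' -f "$HOME/.ssh/id_rsa"

# Copy the public key to EVERY node, including this master itself:
# start-all.sh launches daemons over SSH even on the local machine.
for host in master slave1 slave2; do
    ssh-copy-id "$host"
done

# Verify: each line should print the hostname with no password prompt.
for host in master slave1 slave2; do
    ssh -o BatchMode=yes "$host" hostname
done
```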

Run start-all.sh again and everything starts successfully.

