start-dfs.sh: "Attempting to operate on hdfs namenode as root" with no HDFS_NAMENODE_USER defined

Running `./sbin/start-dfs.sh` as root fails with:

```
Starting namenodes on [hadoop01]
ERROR: Attempting to operate on hdfs namenode as root
ERROR: but there is no HDFS_NAMENODE_USER defined. Aborting operation.
Starting datanodes
ERROR: Attempting to operate on hdfs datanode as root
ERROR: but there is no HDFS_DATANODE_USER defined. Aborting operation.
Starting secondary namenodes [hadoop03]
ERROR: Attempting to operate on hdfs secondarynamenode as root
ERROR: but there is no HDFS_SECONDARYNAMENODE_USER defined. Aborting operation.
```

1. Recommended approach: run Hadoop as a dedicated user

Best practice is to run the Hadoop daemons as a dedicated Hadoop user rather than as root:

```bash
# Switch to your Hadoop user (typically 'hadoop' or 'hdfs')
su - hadoop
# Then try starting HDFS again
./sbin/start-dfs.sh
```
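If no such account exists yet, one can be created first. A minimal sketch, assuming the install tree lives at `/usr/local/hadoop` (adjust the path to your setup); it must be run as root, and the guards make it safe to re-run:

```shell
# Create a dedicated 'hadoop' user and hand it the install tree.
# /usr/local/hadoop is an assumption -- adjust to your install path.
if [ "$(id -u)" -eq 0 ]; then
  # Only create the user if it does not exist yet
  id hadoop >/dev/null 2>&1 || useradd -m hadoop
  # Only chown if the install directory is actually there
  [ -d /usr/local/hadoop ] && chown -R hadoop:hadoop /usr/local/hadoop || true
fi
```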

2. Alternative: define the required environment variables

If you must run as root (not recommended), define the user variables that the start scripts check for:

```bash
export HDFS_NAMENODE_USER="root"
export HDFS_DATANODE_USER="root"
export HDFS_SECONDARYNAMENODE_USER="root"
./sbin/start-dfs.sh
```
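Before retrying, it can help to fail fast if any of the three variables is still unset. A minimal sketch using a hypothetical `check_hdfs_users` helper (not part of Hadoop):

```shell
# Print the name of every required *_USER variable that is unset;
# return non-zero if any is missing, so the caller can abort early.
check_hdfs_users() {
  missing=0
  for v in HDFS_NAMENODE_USER HDFS_DATANODE_USER HDFS_SECONDARYNAMENODE_USER; do
    # Indirect lookup of the variable named in $v (portable sh)
    eval "val=\${$v:-}"
    if [ -z "$val" ]; then
      echo "missing: $v"
      missing=1
    fi
  done
  return "$missing"
}
```

Usage: `check_hdfs_users && ./sbin/start-dfs.sh` only starts HDFS once all three variables are defined.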

These exports only last for the current shell session. To make them permanent, add them to `hadoop-env.sh`:

```bash
cd /usr/local/hadoop/etc/hadoop
vim hadoop-env.sh
```

Add the following lines:

```bash
export HDFS_NAMENODE_USER="root"
export HDFS_DATANODE_USER="root"
export HDFS_SECONDARYNAMENODE_USER="root"
```
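If you prefer not to edit the file interactively, the same lines can be appended with a heredoc. A sketch using a hypothetical `append_hdfs_users` helper; pass it the path to your `hadoop-env.sh`:

```shell
# Append the three user variables to the given hadoop-env.sh file.
# $1: path to hadoop-env.sh (e.g. /usr/local/hadoop/etc/hadoop/hadoop-env.sh)
append_hdfs_users() {
  cat >> "$1" <<'EOF'
export HDFS_NAMENODE_USER="root"
export HDFS_DATANODE_USER="root"
export HDFS_SECONDARYNAMENODE_USER="root"
EOF
}
```

Usage: `append_hdfs_users /usr/local/hadoop/etc/hadoop/hadoop-env.sh` (run it only once, or the lines will be duplicated).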

Later, `start-yarn.sh` will fail with the same kind of error; when it does, append two more lines:

```bash
export YARN_RESOURCEMANAGER_USER="root"
export YARN_NODEMANAGER_USER="root"
```
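Once both `start-dfs.sh` and `start-yarn.sh` succeed, `jps` should list all five daemons. As a sketch, the hypothetical helper below reads `jps` output from stdin and reports any expected daemon that is absent (trim the list to match the daemons that actually run on a given node):

```shell
# Read `jps`-style output ("<pid> <ClassName>" per line) on stdin and
# print the name of each expected daemon that does not appear;
# return non-zero if anything is missing.
check_daemons() {
  input=$(cat)
  missing=0
  for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
    if ! printf '%s\n' "$input" | grep -qw "$d"; then
      echo "not running: $d"
      missing=1
    fi
  done
  return "$missing"
}
```

Usage: `jps | check_daemons`.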
