Hadoop configuration files explained

Hadoop configuration

1、hadoop-env.sh

1、HADOOP_HEAPSIZE: heap size for the Hadoop daemons

2、HADOOP_LOG_DIR: log directory

3、HADOOP_PID_DIR: directory for PID files (see the sketch below)
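A minimal hadoop-env.sh sketch covering the three settings above; the concrete paths and heap size are illustrative assumptions, not values from these notes:

```sh
# hadoop-env.sh (sketch; values are assumptions)
export HADOOP_HEAPSIZE=1024                          # daemon heap size in MB
export HADOOP_LOG_DIR=/usr/local/hadoopdata/logs     # assumed log directory
export HADOOP_PID_DIR=/usr/local/hadoopdata/pids     # assumed PID-file directory
```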

2、core-site.xml

RPC: mainly used for communication between nodes

HTTP: moderate amounts of data (e.g., used by the SecondaryNameNode)

TCP: large files (bulk data transfer)

Key properties:

- fs.defaultFS: hdfs://hp01:9000
- hadoop.tmp.dir: /usr/local/hadoopdata/tmp
- dfs.blocksize: 134217728
- io.file.buffer.size: 4096
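The same settings written out as a core-site.xml sketch, using the standard Hadoop XML property layout; names and values are the ones listed above:

```xml
<configuration>
  <!-- default filesystem: the NameNode RPC address -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hp01:9000</value>
  </property>
  <!-- base directory for Hadoop temporary/working data -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/hadoopdata/tmp</value>
  </property>
  <!-- HDFS block size: 128 MB -->
  <property>
    <name>dfs.blocksize</name>
    <value>134217728</value>
  </property>
  <!-- read/write buffer size in bytes -->
  <property>
    <name>io.file.buffer.size</name>
    <value>4096</value>
  </property>
</configuration>
```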

3、vi hdfs-site.xml

Key properties:

- dfs.blocksize: 134217728
- dfs.replication: 3
- dfs.http.address: hp01:50070
- dfs.secondary.http.address: hp01:50090
- dfs.namenode.name.dir: file:///usr/local/hadoopdata/dfs/name (multiple directories can be configured, separated by commas)
- dfs.datanode.data.dir: file:///usr/local/hadoopdata/dfs/data
- fs.checkpoint.dir: file:///usr/local/hadoopdata/dfs/cname
- fs.checkpoint.edits.dir: file:///usr/local/hadoopdata/dfs/cname
- dfs.permissions: false
- dfs.webhdfs.enabled: true
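A corresponding hdfs-site.xml sketch; names and values are taken from the list above (dfs.datanode.data.dir is the standard property name for the DataNode storage directories):

```xml
<configuration>
  <!-- HDFS block size: 128 MB -->
  <property>
    <name>dfs.blocksize</name>
    <value>134217728</value>
  </property>
  <!-- number of replicas per block -->
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <!-- NameNode web UI address -->
  <property>
    <name>dfs.http.address</name>
    <value>hp01:50070</value>
  </property>
  <!-- SecondaryNameNode web UI address -->
  <property>
    <name>dfs.secondary.http.address</name>
    <value>hp01:50090</value>
  </property>
  <!-- NameNode metadata directories (comma-separated list allowed) -->
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///usr/local/hadoopdata/dfs/name</value>
  </property>
  <!-- DataNode block storage directories -->
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///usr/local/hadoopdata/dfs/data</value>
  </property>
  <!-- SecondaryNameNode checkpoint image and edits directories -->
  <property>
    <name>fs.checkpoint.dir</name>
    <value>file:///usr/local/hadoopdata/dfs/cname</value>
  </property>
  <property>
    <name>fs.checkpoint.edits.dir</name>
    <value>file:///usr/local/hadoopdata/dfs/cname</value>
  </property>
  <!-- disable HDFS permission checking -->
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
  <!-- enable the WebHDFS REST API -->
  <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
  </property>
</configuration>
```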

4、vi mapred-site.xml

Key properties:

- mapreduce.framework.name: yarn
- mapreduce.jobhistory.address: hp01:10020
- mapreduce.jobhistory.webapp.address: hp01:19888
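The same settings as a mapred-site.xml sketch, again in the standard Hadoop XML layout:

```xml
<configuration>
  <!-- run MapReduce jobs on YARN -->
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <!-- JobHistory server RPC address -->
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>hp01:10020</value>
  </property>
  <!-- JobHistory server web UI address -->
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>hp01:19888</value>
  </property>
</configuration>
```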

5、vi yarn-site.xml

Key properties:

- yarn.nodemanager.aux-services: mapreduce_shuffle
- yarn.resourcemanager.hostname: hp01
- yarn.resourcemanager.address: hp01:8032
- yarn.resourcemanager.scheduler.address: hp01:8030
- yarn.resourcemanager.resource-tracker.address: hp01:8031
- yarn.resourcemanager.admin.address: hp01:8033
- yarn.resourcemanager.webapp.address: hp01:8088
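A corresponding yarn-site.xml sketch with the properties listed above:

```xml
<configuration>
  <!-- auxiliary service needed for the MapReduce shuffle -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <!-- host running the ResourceManager -->
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>hp01</value>
  </property>
  <!-- client RPC address -->
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>hp01:8032</value>
  </property>
  <!-- scheduler address used by ApplicationMasters -->
  <property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>hp01:8030</value>
  </property>
  <!-- NodeManager heartbeat / resource-tracker address -->
  <property>
    <name>yarn.resourcemanager.resource-tracker.address</name>
    <value>hp01:8031</value>
  </property>
  <!-- admin command address -->
  <property>
    <name>yarn.resourcemanager.admin.address</name>
    <value>hp01:8033</value>
  </property>
  <!-- ResourceManager web UI -->
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>hp01:8088</value>
  </property>
</configuration>
```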

6、slaves

Lists the DataNode hosts, one per line; it only needs to be configured on the NameNode (see the sketch below).
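A minimal slaves file sketch; hp02 and hp03 are assumed worker hostnames, not taken from these notes:

```
hp02
hp03
```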

7、Safe mode

Safemode is off: when the fraction of available (minimally replicated) blocks over the total number of blocks is above the configured threshold (dfs.namenode.safemode.threshold-pct, 0.999 by default), the NameNode does not enter safe mode; otherwise it enters safe mode, in which the filesystem is read-only and writes are rejected.
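Safe mode can be checked or toggled manually with the standard hdfs dfsadmin commands (a usage sketch, not part of the original notes):

```sh
hdfs dfsadmin -safemode get    # report whether safe mode is ON or OFF
hdfs dfsadmin -safemode enter  # force the NameNode into safe mode
hdfs dfsadmin -safemode leave  # force the NameNode out of safe mode
```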


