Installing Hadoop 3.1.0 standalone on Windows. I recommend switching to Java 8; I started with Java 10 and hit one problem after another.
Download Hadoop from the official site.
On Windows you also need https://github.com/steveloughran/winutils; pick the folder matching your Hadoop version and replace the files in Hadoop's bin directory with it.
Unpack Hadoop, then point the HADOOP_HOME environment variable at the unpack directory and edit the configuration files.
Start the daemons from the sbin folder. You can also add the bin folder to Path so the commands are available everywhere.
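The environment-variable setup above can be sketched as follows. C:\hadoop-3.1.0 is an example path; adjust it to wherever you unpacked Hadoop. (Caution: setx truncates values longer than 1024 characters, so a long Path is safer to edit via System Properties.)

```bat
:: Point HADOOP_HOME at the unpack directory (example path)
setx HADOOP_HOME "C:\hadoop-3.1.0"
:: Append bin and sbin to Path so hadoop/hdfs/yarn commands work from any prompt
setx Path "%Path%;C:\hadoop-3.1.0\bin;C:\hadoop-3.1.0\sbin"
```

Open a new command prompt afterwards; setx does not affect the current session.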
Error 1:
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.hadoop.util.StringUtils.&lt;clinit&gt;(StringUtils.java:80)
at org.apache.hadoop.fs.FileSystem$Cache$Key.&lt;init&gt;(FileSystem.java:2807)
at org.apache.hadoop.fs.FileSystem$Cache$Key.&lt;init&gt;(FileSystem.java:2802)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2668)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at com.wxe.app.WordCount.main(WordCount.java:62)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3107)
at java.base/java.lang.String.substring(String.java:1873)
at org.apache.hadoop.util.Shell.&lt;clinit&gt;(Shell.java:51)
... 6 more
Fix: switch to Java 8.
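The exception above comes from parsing the java.version system property in a static initializer. A minimal sketch of why it breaks on Java 10 (the substring(0, 3) call is my reconstruction of the old Shell.java version check, not verified against that exact source):

```java
public class JavaVersionParse {
    public static void main(String[] args) {
        String java8 = "1.8.0_171"; // typical java.version on Java 8
        String java10 = "10";       // java.version on Java 10 is just "10"

        // Works: "1.8.0_171" has more than 3 characters
        System.out.println(java8.substring(0, 3)); // prints 1.8

        try {
            // Fails: the string is only 2 characters long
            System.out.println(java10.substring(0, 3));
        } catch (StringIndexOutOfBoundsException e) {
            // Same "begin 0, end 3, length 2" shape as in the trace above
            System.out.println("parse failed: " + e.getMessage());
        }
    }
}
```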
Error 2:
On startup:
Both the NodeManager and the ResourceManager fail with:
Caused by: com.google.inject.ProvisionException: Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.NoClassDefFoundError: javax/activation/DataSource
On Java 9 you can work around this by adding the parameters below to yarn-env (javax.activation lives in the java.activation module, which is no longer resolved by default since Java 9). I tested this on Java 10 myself and it did not work; the cleanest fix is to switch to Java 8.
export YARN_RESOURCEMANAGER_OPTS="--add-modules=ALL-SYSTEM"
export YARN_NODEMANAGER_OPTS="--add-modules=ALL-SYSTEM"
Error 3:
Exception in thread "main" java.io.IOException: Failed on local exception: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length; Host Details : local host is: "mzmuer/172.19.82.225"; destination host is: "localhost":9870;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:808)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1495)
at org.apache.hadoop.ipc.Client.call(Client.java:1437)
at org.apache.hadoop.ipc.Client.call(Client.java:1347)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy11.getBlockLocations(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:305)
Fix:
1. Use jps to check whether the NameNode and DataNode are running; if not, start them with start-dfs.
2. Check the fs.defaultFS property (legacy name: fs.default.name) in core-site.xml and make sure the client is connecting to exactly that URI. In the trace above the client hit localhost:9870, which is the NameNode web UI port in Hadoop 3.x rather than the RPC port; sending RPC to the wrong endpoint typically produces this "RPC response exceeds maximum data length" error.
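A minimal core-site.xml sketch for this check, assuming the NameNode RPC address is hdfs://localhost:9000 (adjust host and port to your setup):

```xml
<configuration>
  <!-- NameNode RPC endpoint; FileSystem clients must connect here,
       not to the web UI port (9870 in Hadoop 3.x) -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```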
Warning 4:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Reference: https://stackoverflow.com/questions/6608775/please-initialize-the-log4j-system-properly-while-running-web-service
Mine is a Maven project; adding a log4j.properties file under the resources directory fixed it:
# Root logger option
log4j.rootLogger=DEBUG, stdout, file
# Redirect log messages to console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
# Redirect log messages to a log file, support file rolling.
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=logs/log4j-application.log
log4j.appender.file.MaxFileSize=5MB
log4j.appender.file.MaxBackupIndex=10
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
Error 5:
Running a MapReduce job on Windows fails with:
Exception message: CreateSymbolicLink error (1314): A required privilege is not held by the client.
This is a permissions problem. Fix (see https://stackoverflow.com/questions/28958999/hdfs-write-resulting-in-createsymboliclink-error-1314-a-required-privilege):
1. Win+R, run gpedit.msc.
2. Computer Configuration -> Windows Settings -> Security Settings -> Local Policies -> User Rights Assignment -> Create symbolic links.
3. Add your user to the policy, then log off or reboot.