Big Data - Hive 3.1.0 Installation and Deployment - Initialization Error: Illegal character entity: expansion character (code 0x8)

Command

# Running the following schema-initialization command fails:
schematool -dbType derby  -initSchema --verbose

Error Output

[root@master apache-hive-3.1.0-bin]# schematool -dbType derby  -initSchema --verbose
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/rills/software/apache-hive-3.1.0-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/rills/software/hadoop-3.1.1/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8
 at [row,col,system-id]: [3213,96,"file:/opt/rills/software/apache-hive-3.1.0-bin/conf/hive-site.xml"]
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3003)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2931)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2806)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1460)
	at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:4990)
	at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:5063)
	at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5150)
	at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5098)
	at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
	at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
Caused by: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8
 at [row,col,system-id]: [3213,96,"file:/opt/rills/software/apache-hive-3.1.0-bin/conf/hive-site.xml"]
	at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
	at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
	at com.ctc.wstx.sr.StreamScanner.reportIllegalChar(StreamScanner.java:2456)
	at com.ctc.wstx.sr.StreamScanner.validateChar(StreamScanner.java:2403)
	at com.ctc.wstx.sr.StreamScanner.resolveCharEnt(StreamScanner.java:2369)
	at com.ctc.wstx.sr.StreamScanner.fullyResolveEntity(StreamScanner.java:1515)
	at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2828)
	at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1123)
	at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3257)
	at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3063)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2986)
	... 15 more
[root@master apache-hive-3.1.0-bin]# 

Solution

The parser is rejecting an illegal XML character reference in hive-site.xml; the error message gives the exact location: row 3213, column 96. Open hive-site.xml, jump to that line, and delete the offending character entity (typically a stray "&#8;" inside a <description> element copied over from hive-default.xml.template). Save the file and re-run the initialization command.
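
A minimal sketch of the steps, assuming the config path shown in the error output above; substitute the row/column reported by your own run:

# Locate the illegal character reference first, if you prefer:
grep -n '&#8;' /opt/rills/software/apache-hive-3.1.0-bin/conf/hive-site.xml

# Jump straight to the reported line (3213 in this case) and delete the
# entity near the reported column (96), then save and quit:
vim +3213 /opt/rills/software/apache-hive-3.1.0-bin/conf/hive-site.xml

# Re-run the schema initialization:
schematool -dbType derby -initSchema --verbose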
