After enabling HDFS High Availability, Hive reports: Operation category READ is not supported in state standby

After HDFS High Availability is enabled, components that store their data on HDFS start throwing errors.

Problem:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
	at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:88)
	at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:1951)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1427)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3100)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1154)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:966)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
)
INFO : Completed executing command(queryId=hive_20190121105710_16d0a568-83f6-421f-810d-da799244692e); Time taken: 0.043 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error ...)
The key part:
Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error

A note on the advice you usually find online: most of it only tells you how to switch which HDFS NameNode is active, which treats the symptom rather than the cause. It also cannot give you automatic failover, especially for Hive. The real fix is to replace every reference to the old NameNode address with the HA nameservice ID (the value of dfs.nameservices).
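
For reference, a client that resolves a nameservice ID typically carries hdfs-site.xml / core-site.xml entries along these lines. The property names are the standard Hadoop HA settings; the second NameNode host (NameNode02) is only an illustration for this example:

				dfs.nameservices=NameNode
				dfs.ha.namenodes.NameNode=nn1,nn2
				dfs.namenode.rpc-address.NameNode.nn1=NameNode01:8020
				dfs.namenode.rpc-address.NameNode.nn2=NameNode02:8020
				dfs.client.failover.proxy.provider.NameNode=org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
				fs.defaultFS=hdfs://NameNode        (in core-site.xml)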

Solution:
1. In my case Hive failed while reading its metadata. Update the LOCATION column in the metastore SDS table, replacing the old NameNode address with the value of dfs.nameservices (see the SQL sketch after step 3).
Example:

dfs.nameservices=NameNode
				Old location:  hdfs://NameNode01:8020/tmp
				Changed to:    hdfs://NameNode/tmp

2. The Hive catalog location is fixed the same way, in the metastore CTLGS table.
3. The Hive database locations are fixed the same way, in the metastore DBS table.
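
A minimal sketch of those three updates, assuming a MySQL-backed metastore database, a Hive 3 schema, and the example addresses above. The column names (LOCATION, LOCATION_URI, DB_LOCATION_URI) should be verified against your own schema, and the metastore database should be backed up first:

				-- run against the metastore database after taking a backup
				-- 1. table/partition storage descriptors
				UPDATE SDS   SET LOCATION        = REPLACE(LOCATION,        'hdfs://NameNode01:8020', 'hdfs://NameNode');
				-- 2. catalog locations (Hive 3)
				UPDATE CTLGS SET LOCATION_URI    = REPLACE(LOCATION_URI,    'hdfs://NameNode01:8020', 'hdfs://NameNode');
				-- 3. database locations
				UPDATE DBS   SET DB_LOCATION_URI = REPLACE(DB_LOCATION_URI, 'hdfs://NameNode01:8020', 'hdfs://NameNode');

Hive also ships a metatool (hive --service metatool -listFSRoot / -updateLocation <new-location> <old-location>) that can rewrite the stored filesystem roots for you. Whichever route you take, it is worth restarting the metastore and HiveServer2 afterwards so the new locations are picked up.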
