HBase error: java.io.IOException: Got error for OP_READ_BLOCK

2020-01-16 14:57:32,689 WARN [RpcServer.FifoWFPBQ.priority.handler=17,queue=1,port=6201] hdfs.BlockReaderFactory: I/O error constructing remote block reader.
java.io.IOException: Got error for OP_READ_BLOCK, status=ERROR

While bulk-loading data into HBase, the read from HDFS stalled and the error below was thrown; after a short wait the operation completed on its own. The stall was most likely caused by HBase running a split or compaction at the same time. (The .top suffix on the HFile in the log also suggests the loader had split an input file that straddled a region boundary.)
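For context, the server-side frames in the trace (RSRpcServices.bulkLoadHFile, HRegion.bulkLoadHFiles) are driven by a client-side bulk load. A minimal sketch of such a call, assuming the HBase 1.x API (consistent with the protobuf classes in the trace); the table name is taken from the log, and the source directory is inferred from the log paths, so both are placeholders:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

public class BulkLoadExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Table name taken from the log; the HFile directory is assumed to
        // follow the <dir>/<family>/<hfile> layout the loader expects.
        TableName tn = TableName.valueOf("migu:download_log20200116");
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Admin admin = conn.getAdmin();
             Table table = conn.getTable(tn);
             RegionLocator locator = conn.getRegionLocator(tn)) {
            LoadIncrementalHFiles loader = new LoadIncrementalHFiles(conf);
            // Each HFile is validated by the RegionServer
            // (HStore.assertBulkLoadHFileOk) before being moved into the
            // store -- that validation is the step failing in the log below.
            loader.doBulkLoad(new Path("hdfs://migumaster/pub_stat_migu/hbasetmp"),
                    admin, table, locator);
        }
    }
}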

2020-01-16 14:57:32,687 INFO  [RpcServer.FifoWFPBQ.priority.handler=17,queue=1,port=6201] regionserver.HStore: Validating hfile at hdfs://migumaster/pub_stat_migu/hbasetmp/m/.tmp/369d4d2a859c475491118a50d5e6ae02.top for inclusion in store m region migu:download_log20200116,66,1579103995918.8738786b95593adeafa7fa20bc92cc8e.
2020-01-16 14:57:32,689 WARN  [RpcServer.FifoWFPBQ.priority.handler=17,queue=1,port=6201] hdfs.BlockReaderFactory: I/O error constructing remote block reader.
java.io.IOException: Got error for OP_READ_BLOCK, status=ERROR, self=/10.186.59.94:45870, remote=/10.186.59.90:50010, for file /pub_stat_migu/hbasetmp/m/.tmp/369d4d2a859c475491118a50d5e6ae02.top, for pool BP-266398130-10.186.59.129-1574389974472 block 1087859234_14161728
    at org.apache.hadoop.hdfs.RemoteBlockReader2.checkSuccess(RemoteBlockReader2.java:467)
    at org.apache.hadoop.hdfs.RemoteBlockReader2.newBlockReader(RemoteBlockReader2.java:432)
    at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReader(BlockReaderFactory.java:881)
    at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:759)
    at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:376)
    at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:652)
    at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:879)
    at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:932)
    at java.io.DataInputStream.readFully(DataInputStream.java:195)
    at org.apache.hadoop.hbase.io.hfile.FixedFileTrailer.readFromStream(FixedFileTrailer.java:391)
    at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:482)
    at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:540)
    at org.apache.hadoop.hbase.regionserver.HStore.assertBulkLoadHFileOk(HStore.java:734)
    at org.apache.hadoop.hbase.regionserver.HRegion.bulkLoadHFiles(HRegion.java:5350)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.bulkLoadHFile(RSRpcServices.java:1950)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33650)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2171)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:109)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:185)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:165)
2020-01-16 14:57:32,689 WARN  [RpcServer.FifoWFPBQ.priority.handler=17,queue=1,port=6201] hdfs.DFSClient: Failed to connect to /10.186.59.90:50010 for block, add to deadNodes and continue. java.io.IOException: Got error for OP_READ_BLOCK, status=ERROR, self=/10.186.59.94:45870, remote=/10.186.59.90:50010, for file /pub_stat_migu/hbasetmp/m/.tmp/369d4d2a859c475491118a50d5e6ae02.top, for pool BP-266398130-10.186.59.129-1574389974472 block 1087859234_14161728
java.io.IOException: Got error for OP_READ_BLOCK, status=ERROR, self=/10.186.59.94:45870, remote=/10.186.59.90:50010, for file /pub_stat_migu/hbasetmp/m/.tmp/369d4d2a859c475491118a50d5e6ae02.top, for pool BP-266398130-10.186.59.129-1574389974472 block 1087859234_14161728
    at org.apache.hadoop.hdfs.RemoteBlockReader2.checkSuccess(RemoteBlockReader2.java:467)
    at org.apache.hadoop.hdfs.RemoteBlockReader2.newBlockReader(RemoteBlockReader2.java:432)
    at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReader(BlockReaderFactory.java:881)
    at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:759)
    at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:376)
    at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:652)
    at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:879)
    at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:932)
    at java.io.DataInputStream.readFully(DataInputStream.java:195)
    at org.apache.hadoop.hbase.io.hfile.FixedFileTrailer.readFromStream(FixedFileTrailer.java:391)
    at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:482)
    at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:540)
    at org.apache.hadoop.hbase.regionserver.HStore.assertBulkLoadHFileOk(HStore.java:734)
    at org.apache.hadoop.hbase.regionserver.HRegion.bulkLoadHFiles(HRegion.java:5350)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.bulkLoadHFile(RSRpcServices.java:1950)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33650)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2171)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:109)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:185)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:165)
2020-01-16 14:57:32,726 INFO  [RpcServer.FifoWFPBQ.priority.handler=17,queue=1,port=6201] regionserver.HStore: Loaded HFile hdfs://migumaster/pub_stat_migu/hbasetmp/m/.tmp/369d4d2a859c475491118a50d5e6ae02.top into store 'm' as hdfs://migumaster/hbase/data/migu/download_log20200116/8738786b95593adeafa7fa20bc92cc8e/m/6738bc2bfdde45498adcddde4c149d66_SeqId_549_ - updating store file list.
2020-01-16 14:57:32,734 INFO  [RpcServer.FifoWFPBQ.priority.handler=17,queue=1,port=6201] regionserver.HStore: Successfully loaded store file hdfs://migumaster/pub_stat_migu/hbasetmp/m/.tmp/369d4d2a859c475491118a50d5e6ae02.top into store m (new location: hdfs://migumaster/hbase/data/migu/download_log20200116/8738786b95593adeafa7fa20bc92cc8e/m/6738bc2bfdde45498adcddde4c149d66_SeqId_549_)
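Despite the WARN messages, the load ultimately succeeded: the DFSClient added 10.186.59.90:50010 to its deadNodes list and read the block from another replica, which is why the last two lines report the HFile being loaded into store 'm'. A transient error like this can be ignored; if the same block fails repeatedly, check the DataNode log on the remote node for that block ID. The validation step that hit the error simply opens the HFile and reads its fixed trailer (see FixedFileTrailer.readFromStream in the trace). A rough client-side equivalent of that check, again assuming the HBase 1.x API (the path argument is a placeholder):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.io.hfile.CacheConfig;
import org.apache.hadoop.hbase.io.hfile.HFile;

public class HFileTrailerCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Path path = new Path(args[0]); // e.g. the .tmp/...top file from the log
        FileSystem fs = path.getFileSystem(conf);
        // createReader() reads the fixed trailer through DFSInputStream --
        // the same read that raised OP_READ_BLOCK in the trace above.
        HFile.Reader reader = HFile.createReader(fs, path, new CacheConfig(conf), conf);
        try {
            reader.loadFileInfo(); // parse the file-info block
            System.out.println("entries=" + reader.getEntries());
        } finally {
            reader.close();
        }
    }
}

If this opens the file and prints an entry count, the HFile itself is intact and the earlier failure was purely a replica/DataNode issue.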
