org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block:

1. A scheduled job that normally runs fine failed with the error below. Full error details:

Error: java.io.IOException: java.io.IOException: org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1555360483-10.68.9.242-1591167227146:blk_1103672895_29932280 file=/user/hive/warehouse/offline.db/ods/ods_offline_order_detail_daily/000014_0
	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:232)
	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:142)
	at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:205)
	at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:191)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:52)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:465)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:349)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Caused by: java.io.IOException: org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1555360483-10.68.9.242-1591167227146:blk_1103672895_29932280 file=/user/hive/warehouse/offline.db/ods/ods_offline_order_detail_daily/000014_0
	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
	at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:365)
	at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:116)
	at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:43)
	at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.next(HiveContextAwareRecordReader.java:116)
	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:229)
	... 11 more
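
Before restarting anything, it helps to confirm from the HDFS side which files and blocks are actually affected. A minimal check with hdfs fsck (run against the path from the error message, as a user with read access) might look like this:

    # List the affected table directory, showing each block and the DataNodes that should hold it
    hdfs fsck /user/hive/warehouse/offline.db/ods/ods_offline_order_detail_daily -files -blocks -locations

    # Cluster-wide list of files with corrupt blocks
    hdfs fsck / -list-corruptfileblocks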

2. The CDH alerts showed that the DataNode on node03 would not start, and checking node03's disks showed about 350 GB of capacity missing.
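
Besides the CDH UI, the DataNode status and per-node capacity can also be checked from the command line; a quick sketch:

    # Cluster summary plus per-DataNode state (live/dead), configured capacity and DFS used
    hdfs dfsadmin -report

    # Show only dead DataNodes
    hdfs dfsadmin -report -dead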

Going into the /dfs directory on that machine, ls itself fails with: ls: cannot open directory: Input/output error.
The root cause is a damaged disk: the DataNode cannot read the blocks stored on it, so the service fails to start.
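
The disk failure can usually be confirmed at the OS level as well. A sketch, assuming smartmontools is available and /dev/sdX is a placeholder for the suspect device:

    # The kernel log normally records I/O errors for a failing device
    dmesg -T | grep -i "i/o error"

    # SMART health summary for the suspect disk (replace /dev/sdX with the real device)
    smartctl -H /dev/sdX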

3. Resolution

The issue was reported to the server operations team. After the affected machine was rebooted, the disk was repaired automatically. The cluster was then restarted and resumed normal operation.
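
After the cluster is back, it is worth verifying that no missing or corrupt blocks remain; one simple check:

    # The fsck summary should report 0 missing/corrupt blocks and end with "...is HEALTHY"
    hdfs fsck / | tail -n 20

    # The dfsadmin summary also reports missing blocks for the whole cluster
    hdfs dfsadmin -report | grep -i "missing blocks"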
