Flume series: Error while trying to hflushOrSync; HDFS output files found to be corrupted


  • 1. The complete Flume error message
  • 2. Tracing the cause of the error
  • 3. Summary of the root cause
  • 4. How to resolve the error

1. The complete Flume error message

The Flume log repeatedly reports the following error:
13 Mar 2022 10:45:30,319 ERROR [hdfs-sink1-call-runner-400] (org.apache.flume.sink.hdfs.AbstractHDFSWriter.hflushOrSync:269) - Error while trying to hflushOrSync!
25 Mar 2022 10:45:30,319 WARN [hdfs-sink1-roll-timer-3] (org.apache.flume.sink.hdfs.BucketWriter$CloseHandler.close:396) - Closing file: /optics-prod/raw/kafka/debezium-prod-optics_prod_1h/optics_prod/order/1h/20220123/13/debezium-flume-001.prod-flumedata.1648026226340.gz.tmp failed. Will retry again in 180 seconds.
java.io.IOException: Failing write. Tried pipeline recovery 5 times without success.
	at org.apache.hadoop.hd
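The "Will retry again in 180 seconds" message comes from the HDFS sink's close-retry logic, which is controlled by the sink's hdfs.closeTries and hdfs.retryInterval properties (retryInterval defaults to 180 seconds). Below is a minimal sketch of the relevant sink configuration, assuming an agent named a1, a sink named sink1, and an illustrative channel and path; these names are placeholders, not the production setup from the log.

# Hypothetical Flume agent configuration showing the HDFS sink
# parameters involved in the close/retry behavior seen in the log.
a1.sinks.sink1.type = hdfs
a1.sinks.sink1.channel = c1
a1.sinks.sink1.hdfs.path = /optics-prod/raw/kafka/%Y%m%d/%H   # illustrative path
a1.sinks.sink1.hdfs.fileType = CompressedStream
a1.sinks.sink1.hdfs.codeC = gzip
a1.sinks.sink1.hdfs.rollInterval = 3600      # roll-timer thread triggers the file close
a1.sinks.sink1.hdfs.closeTries = 0           # 0 = keep retrying the close indefinitely
a1.sinks.sink1.hdfs.retryInterval = 180      # seconds between close attempts; matches the log
a1.sinks.sink1.hdfs.callTimeout = 60000      # ms allowed for HDFS open/write/flush/close calls

If the close keeps failing (for example because HDFS pipeline recovery has already failed, as in the IOException above), the .tmp file is never renamed and may be left corrupted, which is consistent with the damaged files observed in HDFS.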
