kafka-console-producer.sh truncates data sent to Kafka (incomplete messages)

Problem Description

When sending data to Kafka by hand through the console producer, the lines get truncated, so the messages stored in Kafka are incomplete and the data Flume collects downstream is incomplete as well (at first it looked as if Flume itself were truncating the data):

 /usr/kafka_2.10-0.9.0.1/bin/kafka-console-producer.sh --topic dp_monitor_rh_out-20190717 --broker-list <broker-list config>

Root Cause

The real cause is that pasting an overly long line by hand into kafka-console-producer.sh gets truncated at the terminal, so long records need to be sent programmatically instead, as shown in the solution below.
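
To confirm that the messages stored in Kafka really were cut off, one quick check is to read the topic back and print each message's length for comparison with the source file. This is only a minimal sketch, assuming the kafka-clients 0.9.0.0 dependency from the solution below; the broker list is a placeholder and the class name LengthCheck is illustrative, not part of the original project.

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class LengthCheck {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "<kafka broker list>");  // placeholder, e.g. host1:9092
        props.put("group.id", "length-check");                  // throwaway consumer group
        props.put("auto.offset.reset", "earliest");             // read the topic from the beginning
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<String, String>(props);
        consumer.subscribe(Collections.singletonList("dp_monitor_rh_out-20190717"));
        try {
            // Poll a few batches and print each message's length so it can be
            // compared with the length of the corresponding line in the source file.
            for (int i = 0; i < 10; i++) {
                ConsumerRecords<String, String> records = consumer.poll(1000L);
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("offset=" + record.offset()
                            + "  length=" + record.value().length());
                }
            }
        } finally {
            consumer.close();
        }
    }
}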

Solution

  • Add the Maven dependencies

        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.11</artifactId>
            <version>0.9.0.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>0.9.0.0</version>
        </dependency>
  • Producer code
package com.myd.cn;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;
import org.apache.log4j.Logger;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.Properties;

public class KafkaProducerUtil {

    private static final Logger log = Logger.getLogger(KafkaProducerUtil.class);

    public static void sendMessageFromLogFile(File logFile) {
        Properties props = new Properties();
        // Kafka broker list, e.g. host1:9092,host2:9092
        props.put("metadata.broker.list", "<kafka broker-list config>");
        // Serializer class for the message value
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        // Serializer class for the message key
        props.put("key.serializer.class", "kafka.serializer.StringEncoder");

        ProducerConfig producerConfig = new ProducerConfig(props);
        Producer<String, String> producer = new Producer<String, String>(producerConfig);

        BufferedReader br = null;
        try {
            br = new BufferedReader(new InputStreamReader(new FileInputStream(logFile)));
            String line;
            // Read the file line by line and send each full line as one Kafka message
            while ((line = br.readLine()) != null) {
                log.info("============ sending ============ " + line);
                System.out.println("============ sending ============ " + line);
                // Replace "topic" with the target topic name
                producer.send(new KeyedMessage<String, String>("topic", line));
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (br != null) {
                try {
                    br.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            producer.close();
        }
    }

    public static void main(String[] args) {
        sendMessageFromLogFile(new File(args[0]));
    }
}

  • Package with Maven
mvn package (or run the package goal from the IDEA interface). For the java -jar command below to work, the jar is assumed to declare its main class and bundle the Kafka dependencies, e.g. via the maven-shade or assembly plugin.
  • Run the command
java  -jar IdeaProject/kafa/target/kafka-1.0-SNAPSHOT.jar    /home/jerry/kafka.log
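
The class above uses the old Scala producer API (kafka.javaapi.producer.Producer), which works against a 0.9 broker but has been superseded by the Java client shipped in kafka-clients. Below is a minimal alternative sketch against that newer API, assuming the kafka-clients 0.9.0.0 dependency declared earlier; the broker list is a placeholder and the class and method names (FileToKafka, sendFile) are illustrative, not part of the original project.

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class FileToKafka {

    public static void sendFile(File logFile, String topic) throws IOException {
        Properties props = new Properties();
        props.put("bootstrap.servers", "<kafka broker list>");  // placeholder, e.g. host1:9092
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<String, String>(props);
        BufferedReader br = new BufferedReader(new FileReader(logFile));
        try {
            String line;
            // Every full line of the file becomes one Kafka message,
            // so nothing is cut off the way a long pasted line is in the console producer.
            while ((line = br.readLine()) != null) {
                producer.send(new ProducerRecord<String, String>(topic, line));
            }
        } finally {
            br.close();
            producer.close();  // blocks until buffered records have been sent
        }
    }

    public static void main(String[] args) throws IOException {
        sendFile(new File(args[0]), args[1]);
    }
}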
