This assumes Kafka 2.3.0 is installed and the service is already running. Installation tutorial: https://mp.csdn.net/postedit/102837603
Add the dependencies to the pom; the full pom file is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.freshbin</groupId>
    <artifactId>spring-boot-01</artifactId>
    <version>1.0-SNAPSHOT</version>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.0.RELEASE</version>
    </parent>
    <dependencies>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
            <version>2.3.0.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.2.49</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>commons-beanutils</groupId>
            <artifactId>commons-beanutils</artifactId>
            <version>1.8.0</version>
        </dependency>
        <dependency>
            <groupId>commons-collections</groupId>
            <artifactId>commons-collections</artifactId>
            <version>3.2.1</version>
        </dependency>
        <dependency>
            <groupId>commons-lang</groupId>
            <artifactId>commons-lang</artifactId>
            <version>2.4</version>
        </dependency>
        <dependency>
            <groupId>commons-logging</groupId>
            <artifactId>commons-logging</artifactId>
            <version>1.1.1</version>
        </dependency>
        <dependency>
            <groupId>net.sf.ezmorph</groupId>
            <artifactId>ezmorph</artifactId>
            <version>1.0.6</version>
        </dependency>
        <dependency>
            <groupId>net.sf.json-lib</groupId>
            <artifactId>json-lib</artifactId>
            <version>2.2.3</version>
            <classifier>jdk15</classifier>
        </dependency>
        <dependency>
            <groupId>org.apache.poi</groupId>
            <artifactId>poi</artifactId>
            <version>4.0.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.poi</groupId>
            <artifactId>poi-ooxml</artifactId>
            <version>4.0.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.poi</groupId>
            <artifactId>poi-scratchpad</artifactId>
            <version>4.0.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.poi</groupId>
            <artifactId>poi-excelant</artifactId>
            <version>4.0.1</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
    <repositories>
        <repository>
            <id>spring-snapshots</id>
            <url>https://repo.spring.io/snapshot</url>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
        </repository>
        <repository>
            <id>spring-milestones</id>
            <url>https://repo.spring.io/milestone</url>
        </repository>
    </repositories>
    <pluginRepositories>
        <pluginRepository>
            <id>spring-snapshots</id>
            <url>https://repo.spring.io/snapshot</url>
        </pluginRepository>
        <pluginRepository>
            <id>spring-milestones</id>
            <url>https://repo.spring.io/milestone</url>
        </pluginRepository>
    </pluginRepositories>
</project>
For some reason the pom file shows a red ×, but it does not affect running the project.
The data class UserLog used as the message payload:
package com.freshbin.example01.kafka.bean;

import lombok.Data;
import lombok.experimental.Accessors;

/**
 * @Author freshbin
 * @Description User log data sent as the message payload
 * @Date 2019-10-31 15:42:14
 */
@Data
@Accessors(chain = true)
public class UserLog {
    private String username;
    private String userid;
    private String state;
}
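For readers unfamiliar with Lombok, the following is a rough hand-written sketch of what @Data plus @Accessors(chain = true) generate for UserLog: chained setters that return `this` (so calls can be strung together), getters, and a toString. This is an illustration of the pattern, not Lombok's exact output.

```java
// Hand-written sketch of the code Lombok generates for UserLog with
// @Data and @Accessors(chain = true); field names match the original class.
class UserLog {
    private String username;
    private String userid;
    private String state;

    // Chained setters: each returns `this` so calls can be strung together
    UserLog setUsername(String username) { this.username = username; return this; }
    UserLog setUserid(String userid) { this.userid = userid; return this; }
    UserLog setState(String state) { this.state = state; return this; }

    String getUsername() { return username; }
    String getUserid() { return userid; }
    String getState() { return state; }

    @Override
    public String toString() {
        return "UserLog(username=" + username + ", userid=" + userid + ", state=" + state + ")";
    }
}
```

This chaining is why the producer below can write `userLog.setUsername(...).setUserid(...).setState(...)` in one expression.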
The producer class UserLogProducer:
package com.freshbin.example01.kafka.producer;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

import com.alibaba.fastjson.JSON;
import com.freshbin.example01.kafka.bean.UserLog;

/**
 * @Author freshbin
 * @Description
 * @Date 2019-10-31 15:46:16
 */
@Component
public class UserLogProducer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    /**
     * Send a message
     * @param userid
     */
    public void sendLog(String userid) {
        UserLog userLog = new UserLog();
        userLog.setUsername("freshbin").setUserid(userid).setState("0");
        System.out.println("Sending user log data: " + userLog);
        kafkaTemplate.send("user-log", JSON.toJSONString(userLog));
    }
}
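Because the producer is configured with StringSerializer, what actually travels over the wire is the JSON string produced by JSON.toJSONString(userLog). The stdlib-only sketch below (a hypothetical helper, not part of the project) shows roughly what that payload looks like; fastjson emits fields in alphabetical order by default, but treat the exact ordering as an assumption.

```java
// Hypothetical helper illustrating the shape of the JSON payload that
// JSON.toJSONString(userLog) produces for a UserLog (stdlib only, no fastjson).
class UserLogPayload {
    static String toJson(String username, String userid, String state) {
        // Fields shown in alphabetical order, matching fastjson's default
        return "{\"state\":\"" + state + "\",\"userid\":\"" + userid
                + "\",\"username\":\"" + username + "\"}";
    }
}
```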
The consumer class UserLogConsumer:
package com.freshbin.example01.kafka.comsumer;

import java.util.Optional;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

import lombok.extern.slf4j.Slf4j;

/**
 * @Author freshbin
 * @Description
 * @Date 2019-10-31 15:49:23
 */
@Component
@Slf4j
public class UserLogConsumer {

    @KafkaListener(topics = { "user-log" })
    public void consumer(ConsumerRecord<?, ?> consumerRecord) {
        // Guard against a null record value
        Optional<?> kafkaMessage = Optional.ofNullable(consumerRecord.value());
        log.info(">>>>>>>>>> record = " + kafkaMessage);
        if (kafkaMessage.isPresent()) {
            // Get the value held by the Optional
            Object message = kafkaMessage.get();
            System.out.println("Consumed message: " + message);
        }
    }
}
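The Optional.ofNullable guard in the listener can be exercised on its own: a Kafka record's value may be null (for example, a tombstone record), and the Optional lets the consumer skip those cleanly. A standalone sketch of the same pattern, with hypothetical helper names:

```java
import java.util.Optional;

// Standalone sketch of the listener's null-value guard: wrap a possibly-null
// record value in an Optional and only act when a value is present.
class RecordGuard {
    static String describe(Object value) {
        Optional<?> maybe = Optional.ofNullable(value);
        return maybe.isPresent() ? "consumed: " + maybe.get() : "empty record";
    }
}
```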
The main class KafkaApplication, which sends messages directly from its init method:
package com.freshbin.example01.kafka;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

import com.freshbin.example01.kafka.producer.UserLogProducer;

import javax.annotation.PostConstruct;

/**
 * @Author freshbin
 * @Description
 * @Date 2019-10-31 15:46:07
 */
@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class})
public class KafkaApplication {

    @Autowired
    private UserLogProducer kafkaSender;

    @PostConstruct
    public void init() {
        for (int i = 0; i < 10; i++) {
            // Call the producer's send method
            kafkaSender.sendLog(String.valueOf(i));
            System.out.println("Sent a message: " + i);
        }
    }

    public static void main(String[] args) {
        SpringApplication.run(KafkaApplication.class, args);
    }
}
The application.properties configuration is as follows:
spring.application.name=kafka-user
server.port=8080

#============== kafka ===================
# Kafka broker address(es); multiple addresses may be listed, comma-separated
spring.kafka.bootstrap-servers=localhost:9092

#=============== producer =======================
spring.kafka.producer.retries=0
# Maximum size of each send batch, in bytes
spring.kafka.producer.batch-size=16384
spring.kafka.producer.buffer-memory=33554432
# Serializers for message keys and values
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer

#=============== consumer =======================
# Default consumer group id
spring.kafka.consumer.group-id=user-log-group
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.enable-auto-commit=true
spring.kafka.consumer.auto-commit-interval=100
# Deserializers for message keys and values
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
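Note that with enable-auto-commit=true, offsets are committed on a timer (every 100 ms here) rather than after each record is processed, so a crash between the commit and the processing can skip or re-deliver records. A hedged alternative, assuming the Spring Boot 2.2 / spring-kafka 2.3 property names, is to switch to manual acknowledgment:

```properties
# Sketch: manual offset commits instead of timer-based auto-commit
spring.kafka.consumer.enable-auto-commit=false
# manual_immediate commits each offset when the listener calls Acknowledgment.acknowledge()
spring.kafka.listener.ack-mode=manual_immediate
```

With this configuration the listener method takes an extra Acknowledgment parameter and calls acknowledge() once it has finished handling the record.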
After startup, the console prints the corresponding messages.
The ZooKeeper and Kafka console windows also print some output at the same time, though I did not look closely at what it was.
Even though Spring Boot integrates Kafka and the demo feels quite simple, I still do not know how to apply this to a real project.