Apache Kafka is a distributed publish-subscribe messaging system. It was originally developed at LinkedIn and later became an Apache project. Kafka is a fast, scalable commit-log service that is distributed by design, partitioned, and replicable.
Key components of the Kafka architecture:
- Broker: a Kafka server; a cluster consists of one or more brokers.
- Topic: a named category to which messages are published.
- Partition: an ordered, append-only log; each topic is split into one or more partitions.
- Producer: a client that publishes messages to topics.
- Consumer: a client that reads messages; consumers in the same group share a topic's partitions.
- ZooKeeper: coordinates the cluster (broker membership and, in this Kafka version, topic metadata).
Version downloaded from the official site: kafka_2.11-0.10.2.0.tgz
a: Extract the archive
tar -xzf kafka_2.11-0.10.2.0.tgz
b: Start the services
Start ZooKeeper first (the Kafka distribution bundles a ZooKeeper):
bin/zookeeper-server-start.sh config/zookeeper.properties
Start Kafka:
bin/kafka-server-start.sh config/server.properties
c: Create a topic
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
d: List topics
bin/kafka-topics.sh --list --zookeeper localhost:2181
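The topic above was created with --partitions 1. When a topic has more partitions, a keyed message is routed by hashing its key, so the same key always lands on the same partition. A minimal sketch of that idea (illustrative only: Kafka's real default partitioner hashes the key with murmur2, not String.hashCode()):

```java
public class PartitionSketch {
    // Toy stand-in for Kafka's default partitioner: hash the key, mask the
    // sign bit so the result is non-negative, then take it modulo the
    // partition count.
    static int partitionFor(String key, int numPartitions) {
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int p1 = partitionFor("zhang", 3);
        int p2 = partitionFor("zhang", 3);
        System.out.println(p1 == p2);          // same key, same partition
        System.out.println(p1 >= 0 && p1 < 3); // always a valid partition index
    }
}
```

This key-to-partition stickiness is what gives Kafka per-key ordering: all messages with one key go to one partition, and each partition is consumed in order.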
e: Send messages
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
zhang
f: Receive messages
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
zhang
Note: so far we have used a single broker. To run a multi-broker cluster, configure the additional brokers yourself.
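A sketch of the usual multi-broker setup (following the standard quickstart approach; the file name and values here are illustrative): copy config/server.properties once per extra broker and give each copy a unique broker id, listener port, and log directory:

```properties
# config/server-1.properties (a copy of server.properties with these keys edited)
broker.id=1
listeners=PLAINTEXT://:9093
log.dirs=/tmp/kafka-logs-1
```

Start each broker with bin/kafka-server-start.sh and its own properties file, and raise --replication-factor when creating topics so each partition is replicated across brokers.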
This article uses Spring Boot 1.4.3 and wires Kafka through bean-based Java configuration.
a: Add the Kafka dependencies to the pom file
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-core</artifactId>
    <version>4.3.6.RELEASE</version>
    <classifier>sources</classifier>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>1.1.1.RELEASE</version>
</dependency>
b: yml configuration
# Kafka integration
zhang:
  kafka:
    binder:
      brokers: 192.168.190.110:9092
      zk-nodes: 192.168.190.110:2181
    group: zhang-group
c: Create KafkaProducerConfig
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
@EnableKafka
public class KafkaProducerConfig {

    @Value("${zhang.kafka.binder.brokers}")
    private String brokers;

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<String, String>(producerFactory());
    }

    public ProducerFactory<String, String> producerFactory() {
        // Producer properties: broker list, batching knobs, and serializers
        Map<String, Object> properties = new HashMap<String, Object>();
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers);
        properties.put(ProducerConfig.BATCH_SIZE_CONFIG, 65536);
        properties.put(ProducerConfig.LINGER_MS_CONFIG, 1);
        properties.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 524288);
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<String, String>(properties);
    }
}
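About the three tuning values above: batch.size (65536 bytes) and linger.ms (1 ms) jointly decide when a batch of records is flushed to the broker, and buffer.memory (524288 bytes) caps how much unsent data the producer buffers. A toy sketch of the flush condition (my simplification, not Kafka's actual code):

```java
public class BatchTriggerSketch {
    // A batch is sent when it reaches batch.size bytes, or when linger.ms has
    // elapsed since the first record was added, whichever comes first.
    static boolean shouldSend(int batchBytes, long msSinceFirstRecord,
                              int batchSize, long lingerMs) {
        return batchBytes >= batchSize || msSinceFirstRecord >= lingerMs;
    }

    public static void main(String[] args) {
        // With batch.size=65536 and linger.ms=1:
        System.out.println(shouldSend(70000, 0, 65536, 1)); // full batch -> send
        System.out.println(shouldSend(1024, 5, 65536, 1));  // linger elapsed -> send
        System.out.println(shouldSend(1024, 0, 65536, 1));  // keep buffering
    }
}
```

A linger of 1 ms trades a tiny bit of latency for much better batching under load; setting it to 0 would send each record as soon as possible.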
d: Create KafkaConsumerConfig
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.KafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;

@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    @Value("${zhang.kafka.binder.brokers}")
    private String brokers;

    @Value("${zhang.kafka.group}")
    private String group;

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<String, String>();
        factory.setConsumerFactory(consumerFactory());
        factory.setConcurrency(4);
        factory.getContainerProperties().setPollTimeout(4000);
        return factory;
    }

    @Bean
    public KafkaListeners kafkaListeners() {
        return new KafkaListeners();
    }

    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> properties = new HashMap<String, Object>();
        properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers);
        properties.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        // Only takes effect when auto commit is enabled
        properties.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "100");
        properties.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "15000");
        properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        properties.put(ConsumerConfig.GROUP_ID_CONFIG, group);
        properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
        return new DefaultKafkaConsumerFactory<String, String>(properties);
    }
}
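setConcurrency(4) starts four consumer threads in the same group, and the topic's partitions are divided among them so each partition is read by exactly one group member. A rough sketch of that idea (Kafka's actual assignors are "range" and "roundrobin"; this simply round-robins partition indices over consumer slots):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class GroupAssignmentSketch {
    // Maps each consumer index to the list of partition indices it owns.
    static Map<Integer, List<Integer>> assign(int numConsumers, int numPartitions) {
        Map<Integer, List<Integer>> assignment = new HashMap<Integer, List<Integer>>();
        for (int c = 0; c < numConsumers; c++) {
            assignment.put(c, new ArrayList<Integer>());
        }
        for (int p = 0; p < numPartitions; p++) {
            // Each partition goes to exactly one consumer in the group
            assignment.get(p % numConsumers).add(p);
        }
        return assignment;
    }

    public static void main(String[] args) {
        // 4 concurrent consumers but only 1 partition: one consumer gets the
        // partition, the other three sit idle.
        System.out.println(assign(4, 1));
    }
}
```

Note that the test topic created earlier has a single partition, so with a concurrency of 4 only one of the four consumer threads will actually receive messages; extra concurrency only pays off when the topic has at least that many partitions.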
e: Create a test controller
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/test")
public class TestController {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    private Gson gson = new GsonBuilder().create();

    @RequestMapping("/Kafka")
    public void testKafka() throws Exception {
        BiddingUserBean bean = new BiddingUserBean();
        bean.setId(110L);
        bean.setCproductname("腾讯");
        bean.setIphone("18888888888");
        // Serialize the bean to JSON and publish it to the "boot" topic
        kafkaTemplate.send("boot", gson.toJson(bean));
    }
}
f: Create a listener to process the messages (bean objects)
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import org.springframework.kafka.annotation.KafkaListener;

public class KafkaListeners {

    private Gson gson = new GsonBuilder().create();

    @KafkaListener(topics = {"boot"})
    public void processMessage(String content) {
        // Deserialize the JSON payload back into the bean and print it
        BiddingUserBean m = gson.fromJson(content, BiddingUserBean.class);
        System.out.println(m.toString());
    }
}
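The payload traveling over the "boot" topic is simply the Gson JSON of the bean. As a stand-in that makes the message format visible without the library (field names are inferred from the controller's setters; Gson itself is not used here), the body can be built by hand:

```java
public class PayloadSketch {
    // Hand-built equivalent of the Gson step: a JSON object keyed by the
    // bean's field names. Illustrative only; the real code uses gson.toJson().
    static String toJson(long id, String cproductname, String iphone) {
        return String.format("{\"id\":%d,\"cproductname\":\"%s\",\"iphone\":\"%s\"}",
                id, cproductname, iphone);
    }

    public static void main(String[] args) {
        System.out.println(toJson(110L, "腾讯", "18888888888"));
        // {"id":110,"cproductname":"腾讯","iphone":"18888888888"}
    }
}
```

Because both sides exchange plain strings through StringSerializer/StringDeserializer, any consumer that can parse this JSON shape can process the message.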
Understanding the code flow: the controller builds a BiddingUserBean, serializes it to JSON with Gson, and publishes it to the "boot" topic through KafkaTemplate; the listener container consumes that topic, and KafkaListeners deserializes each payload back into a bean and prints it.
Test URL: http://localhost:8099/test/Kafka
Console output:
BiddingUserBean{id=110, iphone='18888888888',cproductname='腾讯'}
While learning, I drew on some excellent blog posts; they are linked below:
http://blog.csdn.net/Eacter/article/details/73610212
http://www.jianshu.com/p/ed9055bc68a6