Flink Getting Started, Lesson 6: Writing Data to Kafka / Redis / ES / JDBC with the Flink DataStream API

1. Kafka sink

First, add the Kafka connector dependency:



    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
        <version>1.10.1</version>
    </dependency>
package com.atguigu.Adatastream_api.sink;

import com.atguigu.Fbeans.SensorReading;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011;

/**
 * Requires the kafka-connector dependency.
 * To run: start zkServer first, then the kafka-server, and finally a kafka-consumer;
 * the consumer reads the data this class sinks to Kafka.
 * Combining this class with the CKafkaSource class essentially gives you a real-time ETL job —
 * worth putting together yourself if you are interested.
 * zkServer.sh start
 * kafka-server-start.sh config/server.properties &
 * kafka-console-consumer.sh  --bootstrap-server Linux001:9092 \
 *      --consumer.config <config file> --from-beginning  --topic t001    (this command blocks)
 */
public class AKafkaSink {
    public static void main(String[] args) throws Exception {
        // create the execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);
        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);

        // read the data
        DataStreamSource<String> inputStream = env.readTextFile("G:\\SoftwareInstall\\idea\\project\\UserBehaviorAnalysis\\BasicKnowledge\\src\\main\\resources\\sensor.txt");
        DataStream<String> result = inputStream.map(line -> {
            String[] splits = line.split(",");
            // toString() here so the record can be shipped with SimpleStringSchema
            return new SensorReading(splits[0], Long.parseLong(splits[1]), Double.parseDouble(splits[2])).toString();
        });
        // Kafka sink parameters: broker list, topic, serialization schema
        result.addSink(new FlinkKafkaProducer011<String>("localhost:9092", "sinkTest", new SimpleStringSchema()));
        env.execute("KafkaSink test");
    }
}

2. Redis sink

You need to add the Redis connector dependency.
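A commonly used option is the Apache Bahir Redis connector; the coordinates below are a typical choice rather than ones confirmed by the original post:

    <dependency>
        <groupId>org.apache.bahir</groupId>
        <artifactId>flink-connector-redis_2.11</artifactId>
        <version>1.0</version>
    </dependency>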

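With that connector on the classpath, a Redis sink is built from a FlinkJedisPoolConfig (the connection settings) and a RedisMapper (which Redis command to issue and how to derive the key and value from each record). The following is a minimal sketch, not the original post's code: it assumes a local Redis on the default port 6379, reuses the SensorReading bean from the Kafka example and assumes it exposes getId() and getTemperature() getters, and the class name, file path, and hash name "sensor_temp" are illustrative only.

package com.atguigu.Adatastream_api.sink;

import com.atguigu.Fbeans.SensorReading;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.redis.RedisSink;
import org.apache.flink.streaming.connectors.redis.common.config.FlinkJedisPoolConfig;
import org.apache.flink.streaming.connectors.redis.common.mapper.RedisCommand;
import org.apache.flink.streaming.connectors.redis.common.mapper.RedisCommandDescription;
import org.apache.flink.streaming.connectors.redis.common.mapper.RedisMapper;

public class BRedisSink {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);

        // parse the same sensor.txt used in the Kafka example into SensorReading objects
        DataStream<SensorReading> dataStream = env
                .readTextFile("sensor.txt")   // adjust to your local path
                .map(line -> {
                    String[] splits = line.split(",");
                    return new SensorReading(splits[0], Long.parseLong(splits[1]), Double.parseDouble(splits[2]));
                });

        // connection settings; assumes a local Redis instance on the default port
        FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder()
                .setHost("localhost")
                .setPort(6379)
                .build();

        // write every reading into the Redis hash "sensor_temp": HSET sensor_temp <id> <temperature>
        dataStream.addSink(new RedisSink<>(conf, new RedisMapper<SensorReading>() {
            @Override
            public RedisCommandDescription getCommandDescription() {
                return new RedisCommandDescription(RedisCommand.HSET, "sensor_temp");
            }

            @Override
            public String getKeyFromData(SensorReading data) {
                return data.getId();                          // assumes a getId() getter
            }

            @Override
            public String getValueFromData(SensorReading data) {
                return String.valueOf(data.getTemperature()); // assumes a getTemperature() getter
            }
        }));

        env.execute("RedisSink test");
    }
}

After the job runs, the written values can be checked in redis-cli with HGETALL sensor_temp.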
