DataStream API: Sinks (8)

writeAsText (rarely used in stream processing, since streaming jobs seldom write plain files)

      Writes the elements line by line as strings; each string is obtained by calling the element's toString() method.
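The contract is simple: one toString() call per element, one line per element. A plain-Java sketch of that behavior (WriteAsTextSketch and toLines are illustrative names, not Flink API):

```java
import java.util.List;
import java.util.stream.Collectors;

public class WriteAsTextSketch {
    // One toString() per element, one line per element - the contract writeAsText follows.
    public static String toLines(List<?> elements) {
        return elements.stream()
                .map(Object::toString)
                .collect(Collectors.joining("\n"));
    }

    public static void main(String[] args) {
        // Prints "a", "b" and "3" each on its own line
        System.out.println(toLines(List.of("a", "b", 3)));
    }
}
```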

 

print() / printToErr(): prints the value of each element's toString() method to standard output or standard error, respectively.

 

Custom sinks via addSink (e.g. Kafka, Redis). Maven dependency for the Bahir Redis connector:

        
        
        <dependency>
            <groupId>org.apache.bahir</groupId>
            <artifactId>flink-connector-redis_2.11</artifactId>
            <version>1.0</version>
        </dependency>

Sample code:

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.redis.RedisSink;
import org.apache.flink.streaming.connectors.redis.common.config.FlinkJedisPoolConfig;
import org.apache.flink.streaming.connectors.redis.common.mapper.RedisCommand;
import org.apache.flink.streaming.connectors.redis.common.mapper.RedisCommandDescription;
import org.apache.flink.streaming.connectors.redis.common.mapper.RedisMapper;

/**
 * Receives socket data and saves it to a redis list.
 *
 * Redis list syntax: LPUSH key value
 */
public class StreamingDemoToRedis {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStreamSource<String> text = env.socketTextStream("hadoop", 9000, "\n");

        // Assemble each word into a (key, value) pair: lpush l_words word
        DataStream<Tuple2<String, String>> l_wordsData = text.map(new MapFunction<String, Tuple2<String, String>>() {
            @Override
            public Tuple2<String, String> map(String value) throws Exception {
                return new Tuple2<>("l_words", value);
            }
        });

        // Create the redis connection config
        FlinkJedisPoolConfig conf = new FlinkJedisPoolConfig.Builder().setHost("hadoop").setPort(6379).build();

        // Create the RedisSink
        RedisSink<Tuple2<String, String>> redisSink = new RedisSink<>(conf, new MyRedisMapper());

        l_wordsData.addSink(redisSink);

        env.execute("StreamingDemoToRedis");
    }

    public static class MyRedisMapper implements RedisMapper<Tuple2<String, String>> {

        // The redis command to run for every element: LPUSH
        @Override
        public RedisCommandDescription getCommandDescription() {
            return new RedisCommandDescription(RedisCommand.LPUSH);
        }

        // Extracts the redis key to operate on from the incoming element
        @Override
        public String getKeyFromData(Tuple2<String, String> data) {
            return data.f0;
        }

        // Extracts the redis value to operate on from the incoming element
        @Override
        public String getValueFromData(Tuple2<String, String> data) {
            return data.f1;
        }
    }
}
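Stripped of the Flink runtime, the mapper above is just a pair projection: field 0 supplies the redis key and field 1 the value for LPUSH. A minimal plain-Java sketch of that contract (Pair here is a hypothetical stand-in for Flink's Tuple2, not part of the connector API):

```java
public class RedisMapperSketch {
    // Hypothetical stand-in for Flink's Tuple2<String, String>, for illustration only.
    public record Pair(String f0, String f1) {}

    // Mirrors getKeyFromData: field 0 carries the redis key (the list name).
    public static String keyOf(Pair data) { return data.f0(); }

    // Mirrors getValueFromData: field 1 carries the value to LPUSH.
    public static String valueOf(Pair data) { return data.f1(); }

    public static void main(String[] args) {
        Pair p = new Pair("l_words", "hello");
        // The sink would effectively execute: LPUSH l_words hello
        System.out.println("LPUSH " + keyOf(p) + " " + valueOf(p));
    }
}
```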

 
