Querying Flink state (Queryable State)

1. Copy flink-queryable-state-runtime_2.11-1.9.0.jar from Flink's opt directory into the lib directory

2. Edit the conf/flink-conf.yaml configuration file

Add the following setting:

# queryable-state
queryable-state.enable: true

3. Submit the following example job
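The job below uses RocksDBStateBackend, so the RocksDB state-backend module is assumed to be on the job's classpath as well; a sketch of the Maven coordinates for Flink 1.9.0 (Scala 2.11 build, matching the runtime jar from step 1):

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-statebackend-rocksdb_2.11</artifactId>
  <version>1.9.0</version>
</dependency>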

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.contrib.streaming.state.RocksDBStateBackend;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class WordCount {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.enableCheckpointing(5000);
        env.setStateBackend(new RocksDBStateBackend("hdfs://nameservice1/user/hdfs/flink/checkpoints"));

        env
                .socketTextStream("192.168.0.1", 9999)
                .flatMap(new FlatMapFunction<String, Tuple2<Long, Long>>() {
                    @Override
                    public void flatMap(String s, Collector<Tuple2<Long, Long>> collector) throws Exception {
                        collector.collect(new Tuple2<>(Long.parseLong(s), 1L));
                    }
                })
                .keyBy(0)
                .flatMap(new CountWindowAverage())
                .keyBy(0)
                .print();

        env.execute(WordCount.class.getCanonicalName());
    }

    static class CountWindowAverage extends RichFlatMapFunction<Tuple2<Long, Long>, Tuple2<Long, Long>> {

        // a tuple containing the count (f0) and the running sum (f1)
        private transient ValueState<Tuple2<Long, Long>> sum;

        @Override
        public void open(Configuration config) {
            ValueStateDescriptor<Tuple2<Long, Long>> descriptor =
                    new ValueStateDescriptor<>(
                            "average", // the state name
                            TypeInformation.of(new TypeHint<Tuple2<Long, Long>>() {})); // type information
            descriptor.setQueryable("query123"); // expose this state for external queries under this name

            sum = getRuntimeContext().getState(descriptor);
        }

        @Override
        public void flatMap(Tuple2<Long, Long> input, Collector<Tuple2<Long, Long>> out) throws Exception {
            Tuple2<Long, Long> currentSum = sum.value();
            if (currentSum == null) {
                currentSum = new Tuple2<>(0L, 0L);
            }

            currentSum.f0 += 1;
            currentSum.f1 += input.f1;
            sum.update(currentSum);

            if (currentSum.f0 >= 2) {
                out.collect(new Tuple2<>(input.f0, currentSum.f1));
            }
        }
    }
}
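The job reads its input from a socket (socketTextStream("192.168.0.1", 9999)), so something has to be listening on that host and port; running `nc -lk 9999` there and typing numbers is the simplest option. Alternatively, a small feeder along the lines of the sketch below can be used (the class name, key range and 1-second interval are illustrative, not part of the original example):

import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal stand-in for `nc -lk 9999`: listens on port 9999 and emits one number per second.
// Run it on the host referenced by socketTextStream() above.
public class NumberSource {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(9999);
             Socket socket = server.accept();
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            long i = 0;
            while (true) {
                out.println(i % 5 + 1); // cycle through keys 1..5
                i++;
                Thread.sleep(1000);
            }
        }
    }
}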

3.1 Submit the job and find the queryable-state addresses in the TaskManager log

2019-09-04 11:50:30,679 INFO  org.apache.flink.queryablestate.server.KvStateServerImpl      - Started Queryable State Server @ /192.168.0.3:9067.
2019-09-04 11:50:30,683 INFO  org.apache.flink.queryablestate.client.proxy.KvStateClientProxyImpl  - Started Queryable State Proxy Server @ /192.168.0.3:9069.

Started Queryable State Proxy Server @ /192.168.0.3:9069. is the address the client should query.
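The two ports in the log are the defaults (9067 for the state server, 9069 for the client proxy). If they conflict with something else on the machine, they can be pinned in conf/flink-conf.yaml; the port values below are only examples:

# queryable-state
queryable-state.enable: true
queryable-state.server.ports: 9067
queryable-state.proxy.ports: 9069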

4. Querying from a client

4.1 Add the dependencies


<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-core</artifactId>
  <version>1.9.0</version>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-queryable-state-client-java</artifactId>
  <version>1.9.0</version>
</dependency>

4.2 Query client code

import java.net.UnknownHostException;
import java.util.concurrent.CompletableFuture;

import org.apache.flink.api.common.JobID;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.queryablestate.client.QueryableStateClient;

public class QueryState {
    public static void main(String[] args) throws UnknownHostException, InterruptedException {
        QueryableStateClient client = new QueryableStateClient(
                "192.168.0.3", // address of the TaskManager running the Queryable State Proxy
                9069);         // proxy port, 9069 by default; configurable in flink-conf.yaml

        // the state descriptor of the state to be fetched.
        ValueStateDescriptor<Tuple2<Long, Long>> descriptor =
                new ValueStateDescriptor<>(
                        "average",
                        TypeInformation.of(new TypeHint<Tuple2<Long, Long>>() {}));

        while (true) {

            CompletableFuture<ValueState<Tuple2<Long, Long>>> resultFuture =
                    client.getKvState(
                            JobID.fromHexString("ccf23e29590475f9d18422752d23c2fa"), // job ID, taken from the web UI
                            "query123", // the queryable name registered in WordCount via descriptor.setQueryable("query123")
                            1L, // the key to look up
                            BasicTypeInfo.LONG_TYPE_INFO, // type of the key
                            descriptor);

            // now handle the returned value
            resultFuture.thenAccept(response -> {
                try {
                    Tuple2<Long, Long> res = response.value();
                    System.out.println("res: " + res);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });

            Thread.sleep(1000);
        }
    }
}
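thenAccept registers an asynchronous callback; if you would rather block on each query, one option is to wait on the future directly. A sketch of that variant is shown below; it is meant as a drop-in replacement for the thenAccept(...) call inside the while loop and additionally assumes an import of java.util.concurrent.TimeUnit:

// Inside the while loop, in place of the thenAccept(...) callback:
try {
    ValueState<Tuple2<Long, Long>> state = resultFuture.get(5, TimeUnit.SECONDS); // wait up to 5s for the response
    System.out.println("res: " + state.value());
} catch (Exception e) {
    e.printStackTrace(); // e.g. the job is not running yet or the queried key has no state
}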

 
