Flink Kafka connector (Table API) error: org.apache.flink.table.factories.DeserializationSchemaFactory

For more details, see:
https://ci.apache.org/projects/flink/flink-docs-release-1.7/dev/table/connect.html#kafka-connector


package org.apache.flink.streaming.scala.examples.kafka

import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.descriptors.{Json, Kafka, Schema}
import org.apache.flink.types.Row

object KafkaJsonConnector {

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // create a TableEnvironment for streaming queries
    val tableEnv = TableEnvironment.getTableEnvironment(env)

    // declare the Kafka source through the connector descriptor API
    tableEnv
      .connect(
        new Kafka()
          .version("0.10")
          .topic("SM_USER_PROFILE")
          .startFromEarliest()
          .property("zookeeper.connect", "localhost:2181")
          .property("bootstrap.servers", "localhost:9092"))
      .withFormat(
        new Json()
          .deriveSchema() // derive the JSON fields from the declared schema
      )
      .withSchema(
        new Schema()
          .field("COD_USERNO", "string")
          .field("COD_USER_ID", "string")
      )
      .inAppendMode()
      .registerTableSource("sm_user")

    // scan the registered table and print it as an append stream
    val table = tableEnv.scan("sm_user")
    tableEnv.toAppendStream[Row](table).print().setParallelism(1)
    env.execute("example")
  }

}

The problem: running the job on a cluster fails with:

Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.DeserializationSchemaFactory' in
the classpath.

Reason: No factory implements 'org.apache.flink.table.factories.DeserializationSchemaFactory'.
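The root cause is that the JSON format factory, which ships in the flink-json jar, is not on the cluster's classpath, so the SPI lookup finds no implementation. A quick, hypothetical diagnostic (the class name is taken from Flink 1.7's flink-json module; this snippet is not from the original post) to confirm whether the jar is visible:

```scala
// Hypothetical diagnostic: the Table planner discovers format factories via
// Java SPI, so if flink-json is absent, its factory class cannot be loaded.
object CheckJsonFormatFactory extends App {
  try {
    Class.forName("org.apache.flink.formats.json.JsonRowFormatFactory")
    println("flink-json is on the classpath")
  } catch {
    case _: ClassNotFoundException =>
      println("flink-json is missing - add it to flink-1.7.2/lib/ or the job jar")
  }
}
```

Running this with the same classpath as the failing job tells you immediately whether the fix below is needed.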

Solution: download the three jars listed in the dependency table of the documentation page linked above.

Copy the downloaded jars into the cluster's /flink-1.7.2/lib/ directory, then also move flink-table_2.11-1.7.2.jar from the opt/ directory into the same lib/ directory. This resolves the error.
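Alternatively, the connector and format classes can be bundled into the job's fat jar instead of being copied into lib/. A minimal build.sbt sketch (my assumption, not from the original post; artifact names follow the Flink 1.7 documentation, versions assumed to match the cluster: 1.7.2, Scala 2.11):

```scala
// build.sbt -- hedged sketch; adjust versions to match your cluster
val flinkVersion = "1.7.2"

libraryDependencies ++= Seq(
  // provided by the cluster once flink-table is moved into lib/
  "org.apache.flink" %% "flink-table"            % flinkVersion % "provided",
  "org.apache.flink" %% "flink-scala"            % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala"  % flinkVersion % "provided",
  // bundled into the fat jar: the Kafka connector and the JSON format,
  // which contains the missing DeserializationSchemaFactory implementation
  "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion,
  "org.apache.flink" %% "flink-json"                 % flinkVersion
)
```

With this setup the job jar carries flink-connector-kafka-0.10 and flink-json itself, so nothing extra needs to be installed on the cluster besides flink-table.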

java.lang.NoClassDefFoundError: org/apache/flink/table/descriptors/ConnectorDescriptor
        at example.tech.streaming.StreamingKafkaData.main(StreamingKafkaData.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
        at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:813)
        at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:287)
        at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
        at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
        at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1126)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
        at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
        at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.table.descriptors.ConnectorDescriptor
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

 

If you see this error, add the missing jars to the lib/ directory in the same way as above.