Errors when submitting a packaged jar with bundled dependencies ("xxxx-jar-with-dependencies") through the Flink Web UI


Approach 1: upload the missing jars to the Flink cluster's lib directory

Error 1:

Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in the classpath.
Reason: Required context properties mismatch.
The following properties are requested:
connector.driver=com.mysql.jdbc.Driver

Fix: upload mysql-connector-java-xxx.jar (the version matching your MySQL server) to the lib directory of the Flink cluster, then restart the cluster. (If one job uses multiple connectors, a different Maven packaging plugin is needed; see the packaging note at the end of this post.)
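For context, the factory lookup that fails here is triggered by a JDBC source declared with the legacy "connector.type" property format. A minimal sketch of such a definition (the table name, host and credentials are hypothetical, and tableEnv is a StreamTableEnvironment as in the submission code later in this post):

tableEnv.executeSql(
        // hypothetical table and connection values, shown only to illustrate
        // the legacy property format that requests connector.driver
        "CREATE TABLE mysql_source (" +
        "  id BIGINT," +
        "  name STRING" +
        ") WITH (" +
        "  'connector.type' = 'jdbc'," +
        "  'connector.url' = 'jdbc:mysql://192.168.88.108:3306/test'," +
        "  'connector.table' = 'user'," +
        "  'connector.driver' = 'com.mysql.jdbc.Driver'," +
        "  'connector.username' = 'root'," +
        "  'connector.password' = '123456'" +
        ")");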


Then build the jar without bundled dependencies and submit that one (the jar-with-dependencies may cause jar conflicts).

Error 2:

Caused by: java.lang.NoClassDefFoundError: org/apache/flink/table/catalog/hive/HiveCatalog
	at com.MysqlToHiveSql.main(MysqlToHiveSql.java:90)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at

Fix: upload flink-connector-hive-xxx.jar and hive-exec-xxx.jar (the versions matching your Flink and Hive) to the Flink cluster's lib directory, then restart the cluster.
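For reference, the class that cannot be found is used when registering the Hive catalog; a minimal sketch of that call (catalog name and paths are hypothetical):

// fails with NoClassDefFoundError if flink-connector-hive / hive-exec
// are missing from the cluster classpath (hypothetical name and paths)
HiveCatalog hive = new HiveCatalog("myhive", "test", "/opt/hive/conf");
tableEnv.registerCatalog("myhive", hive);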


Then, as above, submit the jar built without bundled dependencies (the jar-with-dependencies may cause jar conflicts).

Error 3:

Caused by: java.lang.ClassNotFoundException: org.apache.thrift.TException
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

Fix: upload hive-exec-xxx.jar (the matching version) to the Flink cluster's lib directory, then restart the cluster.
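(hive-exec is an uber jar that bundles the org.apache.thrift classes, which is why it also resolves this ClassNotFoundException.)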


Then, as above, submit the jar built without bundled dependencies (the jar-with-dependencies may cause jar conflicts).

Approach 2: submit the job remotely

Error 1:

Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in the classpath.
Reason: Required context properties mismatch.
The following properties are requested:
connector.driver=com.mysql.jdbc.Driver

Error 2:

Caused by: java.lang.NoClassDefFoundError: org/apache/flink/table/catalog/hive/HiveCatalog
	at com.MysqlToHiveSql.main(MysqlToHiveSql.java:90)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at

Error 3:

Caused by: java.lang.ClassNotFoundException: org.apache.thrift.TException
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

All three errors above can be avoided by submitting the program through Flink's remote-submission API.

The pom file needs the following dependencies:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-jdbc_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.44</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>2.3.6</version>
</dependency>
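The version placeholders above must resolve to versions matching your cluster; judging from the dependency paths used in the submission code below, they correspond to something like the following sketch:

<properties>
    <!-- inferred from the dependency paths in the submission code below -->
    <scala.binary.version>2.11</scala.binary.version>
    <flink.version>1.12.1</flink.version>
</properties>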

Code for submitting the job. (It differs slightly from the code inside the job jar because the execution environment is created differently; everything else must be identical, or the job will not run.)

public static void main(String[] args) throws Exception {
    // Environment used inside the job jar (what the job jar itself runs with):
    /*StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();*/

    String mavenPath = "D:/apache-maven/maven-repository/";
    // The first jar is the job jar built without bundled dependencies;
    // the jars after it are the dependencies that job needs.
    // Environment used for submitting the job remotely:
    StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment("192.168.88.108",
            8081,
            1,
            "Flink-Test/target/Flink-Test-1.0-SNAPSHOT.jar",
            mavenPath + "org/apache/flink/flink-connector-jdbc_2.11/1.12.1/flink-connector-jdbc_2.11-1.12.1.jar",
            mavenPath + "org/apache/flink/flink-connector-hive_2.11/1.12.1/flink-connector-hive_2.11-1.12.1.jar",
            mavenPath + "org/apache/hive/hive-exec/2.3.6/hive-exec-2.3.6.jar",
            mavenPath + "mysql/mysql-connector-java/5.1.44/mysql-connector-java-5.1.44.jar");

    EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
    StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env, settings);

    // Large block of code omitted here (identical to the code in the job jar)
    ..........................................

    // name, defaultDatabase and hiveConfDir are defined in the omitted code above
    HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
    tableEnv.registerCatalog(name, hive);
    tableEnv.useCatalog(name);
    tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
    tableEnv.useDatabase("test");

    // Large block of code omitted here (identical to the code in the job jar)
    .....................................

    env.execute();
}
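The jar files passed to createRemoteEnvironment are shipped to the cluster together with the job at submission time, which is why this approach does not require adding anything to the cluster's lib directory.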
              


Other errors

Error 4:

Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
	at com.MysqlToHiveSql.main(MysqlToHiveSql.java:110)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

Fix: configure the environment variable:

# place this after HADOOP_HOME is set and after export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_CLASSPATH=$(hadoop classpath)
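For instance, the relevant lines in a profile file might look like the following sketch (the HADOOP_HOME path is an assumption; use your own installation path):

export HADOOP_HOME=/opt/hadoop            # assumed installation path
export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_CLASSPATH=$(hadoop classpath)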


Restart the Flink cluster, and the error is resolved.

Note: the Maven packaging plugin used to build the jar-with-dependencies:

<build>
         <plugins>
             <plugin>
                 <groupId>org.apache.maven.plugins</groupId>
                 <artifactId>maven-assembly-plugin</artifactId>
                 <version>3.3.0</version>
                 <configuration>
                     <descriptorRefs>
                         <descriptorRef>jar-with-dependencies</descriptorRef>
                     </descriptorRefs>
                     <archive>
                         <manifest>
                             <mainClass>com.MysqlToHiveSqlProp</mainClass>
                         </manifest>
                     </archive>
                 </configuration>
                 <executions>
                     <execution>
                         <id>make-assembly</id>
                         <phase>package</phase>
                         <goals>
                             <goal>single</goal>
                         </goals>
                     </execution>
                 </executions>
             </plugin>
         </plugins>
</build>
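If a single fat jar has to bundle several connectors (the situation mentioned under Error 1), maven-assembly-plugin is a poor fit: each connector ships a META-INF/services/org.apache.flink.table.factories.Factory (or TableFactory, for the legacy stack) file, and the assembly plugin keeps only one of them, so the other connectors' factories become undiscoverable. The commonly used alternative is maven-shade-plugin with the ServicesResourceTransformer, which merges those service files; a sketch (version and mainClass mirror the assembly configuration above):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- merge META-INF/services files so all connectors'
                         factories remain discoverable -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.MysqlToHiveSqlProp</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>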
