Flink Troubleshooting Collection

Error 1:

Could not get job jar and dependencies from JAR file: JAR file does not exist: -yn

Cause: the -yn option has been deprecated since Flink 1.8; the ResourceManager now automatically starts as many containers as are needed to satisfy the job's requested parallelism. Fix: simply remove the -yn argument from the submit command.
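As a sketch (the job jar name and parallelism below are made up for illustration), the fix is just dropping -yn from the submission command:

```shell
# Before (fails on Flink 1.8+): -yn tried to pin the number of YARN containers
# flink run -m yarn-cluster -yn 2 -p 4 ./my-job.jar

# After: no -yn; containers are allocated automatically to match parallelism (-p)
flink run -m yarn-cluster -p 4 ./my-job.jar
```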

Error 2:

java.lang.IllegalStateException: No Executor found. Please make sure to export the HADOOP_CLASSPATH environment variable or have hadoop in your classpath.

Method 1:

Set the environment variable:

export HADOOP_CLASSPATH=`hadoop classpath`

Method 2:

Download the flink-shaded-hadoop-2-uber jar matching your Hadoop version and place it in Flink's lib directory.
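Both methods as a shell sketch (the uber-jar version below is illustrative; pick the build that matches your Hadoop release):

```shell
# Method 1: let Flink discover the Hadoop jars at startup
export HADOOP_CLASSPATH=`hadoop classpath`   # add to ~/.bashrc to persist

# Method 2: or ship a bundled Hadoop on Flink's own classpath
cp flink-shaded-hadoop-2-uber-2.8.3-10.0.jar $FLINK_HOME/lib/
```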

Error 3:

Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default) on project book-stream: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)


This error can have many causes, so the most important thing is to read the actual error output. In my case, Scala code was calling Java methods, but the build only declared the Scala source tree for compilation, so the Java classes could not be found and compilation failed. The offending build line is shown below; comment it out or delete it. With no sourceDirectory specified, all source directories are included in the build, which resolves the error.

<sourceDirectory>src/main/scala</sourceDirectory>
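For context, a sketch of what the build section looks like with the override removed (plugin version taken from the error message above; everything else is illustrative):

```xml
<build>
    <!-- No <sourceDirectory> override: Maven now picks up both
         src/main/java and src/main/scala -->
    <plugins>
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```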
Error 4:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Could not find a suitable table factory for 'org.apache.flink.table.planner.delegation.ParserFactory' in the classpath.

This error, too, occurs because a required dependency was not bundled into the job jar; alternatively, the dependency can be placed in Flink's lib directory.
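A quick workaround (the jar name and version are illustrative; use the table planner jar matching your Flink and Scala versions) is to drop the missing dependency into Flink's lib directory and restart the cluster so it is picked up:

```shell
cp flink-table-planner-blink_2.11-1.13.2.jar $FLINK_HOME/lib/
$FLINK_HOME/bin/stop-cluster.sh && $FLINK_HOME/bin/start-cluster.sh
```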

The Maven build was switched to the following plugins:

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <version>2.15.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>

        <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.6.0</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.19</version>
            <configuration>
                <skipTests>true</skipTests>
            </configuration>
        </plugin>
    </plugins>
</build>
```
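If the job has to carry the dependency itself instead, a fat-jar sketch with the maven-shade-plugin (version and settings are illustrative): the ServicesResourceTransformer matters here, because Flink discovers factories such as ParserFactory through META-INF/services files, which plain jar merging would otherwise overwrite.

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- Merge META-INF/services entries so Flink's SPI lookup
                         still finds the table planner factories -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
```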
Warning 5:

Multiple versions of scala libraries detected!
Expected all dependencies to require Scala version: 2.11.12
org.apache.flink:flink-runtime_2.11:1.13.2 requires scala version: 2.11.12
org.apache.flink:flink-scala_2.11:1.13.2 requires scala version: 2.11.12
org.apache.flink:flink-scala_2.11:1.13.2 requires scala version: 2.11.12
org.scala-lang:scala-reflect:2.11.12 requires scala version: 2.11.12
org.apache.flink:flink-streaming-scala_2.11:1.13.2 requires scala version: 2.11.12
org.apache.flink:flink-streaming-scala_2.11:1.13.2 requires scala version: 2.11.12
org.scala-lang:scala-compiler:2.11.12 requires scala version: 2.11.12
org.scala-lang.modules:scala-xml_2.11:1.0.5 requires scala version: 2.11.7

This is caused by using an old version of the scala-maven-plugin packaging plugin.

Starting from Scala 2.10 all changes in bugfix/patch version should be backward compatible, so these warnings don’t really have the point in this case. But they are still very important in case when, let’s say, you somehow end up with scala 2.9 and 2.11 libraries. It happens that since version 3.1.6 you can fix this using scalaCompatVersion configuration

Method 1: pin scalaCompatVersion to the same version:

```xml
<scalaCompatVersion>${scala.binary.version}</scalaCompatVersion>
<scalaVersion>${scala.version}</scalaVersion>
```
 

The complete plugin configuration:

```xml
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.1.6</version>
    <configuration>
        <scalaCompatVersion>${scala.binary.version}</scalaCompatVersion>
        <scalaVersion>${scala.version}</scalaVersion>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```

Method 2: upgrade the packaging plugin to a 4.x version:

```xml
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>4.2.0</version>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```
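Both variants assume the Scala version properties are defined in the POM's properties section; with the versions reported in the warning above, that would look like:

```xml
<properties>
    <scala.version>2.11.12</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>
```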
