【Flink】Could not get job jar and dependencies from JAR file: JAR file does not exist:

Problem background: a Flink job is submitted to YARN with the Flink client, passing -yjm and other -y* options to tune the job's resources; the submission fails with the error below.

/data/flink/flink-1.13.2/bin/flink run -yjm 4096m -ytm 4096m  -ynm test13  -yd -m yarn-cluster   -yqu test  -C https://files.huizecdn.com/file2/M00/00/01/rBMBfmGSRQqAAIQSAAAKEcKAWyg678.jar -c com.huize.beidou.realtime.streaming.core.JobApplication /home/it/flink/core/flink-streaming-core.jar -sql /home/it/flink/sql/job_sql_4.sql  -type 0
Setting HADOOP_CONF_DIR=/etc/hadoop/conf because no HADOOP_CONF_DIR or HADOOP_CLASSPATH was set.
Setting HBASE_CONF_DIR=/etc/hbase/conf because no HBASE_CONF_DIR was set.
Could not get job jar and dependencies from JAR file: JAR file does not exist: -yjm

This happens because the Flink on YARN client needs the HADOOP_CLASSPATH environment variable in order to load the Hadoop configuration and dependency JARs. Without it, the YARN-specific options such as -yjm are not registered, so the generic CLI takes the first unrecognized argument, -yjm, to be the path of the job JAR, which produces the error above.
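As a quick check (a minimal sketch, assuming a standard Hadoop client installation that provides the hadoop command on the submitting host), confirm whether the variable is visible to the shell that runs flink run:

# Empty output means HADOOP_CLASSPATH is not set for this shell
echo "$HADOOP_CLASSPATH"

# Print the classpath the Hadoop client would use; this is the usual value to export for the Flink client
hadoop classpath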

Solution: export the environment variable in the flink launch script under the Flink client's bin directory (or in the shell that performs the submission), as in the sketch below.

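A minimal sketch of the fix, assuming the Flink client path from the command above and a hadoop command available on the same host; adjust the paths to your installation:

# Added near the top of /data/flink/flink-1.13.2/bin/flink, or exported in the
# submitting shell; it lets the client load the Hadoop configuration and jars
export HADOOP_CLASSPATH=`hadoop classpath`

After this, -yjm/-ytm and the other YARN options are parsed by the YARN client and the submission above should be accepted.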
