Spark on YARN from IDEA error: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher
When submitting a Spark on YARN job remotely from IDEA, the submission failed with the error above. Checking the YARN web UI logs (port 8088) pointed to the cause, and the problem was resolved by setting the "spark.yarn.jars" property on the SparkConf. The value takes two entries: the first is your application jar, the second is the HDFS directory containing Spark's dependency jars. .set("spark.yarn.jars","C:\\Users\\han\\Desktop\\test\\dns_project\\target\\dns_project.jar,hdfs://dsy:
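A minimal sketch of the SparkConf setup described above. The HDFS URL (`hdfs://<namenode>:8020/spark-jars/*`) is a placeholder, since the original path is truncated; it assumes you have already uploaded the jars from `$SPARK_HOME/jars` to that HDFS directory. The app name and local jar path are taken from the original snippet.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Sketch only: replace <namenode>:8020 with your actual HDFS NameNode
// address, and make sure $SPARK_HOME/jars has been uploaded to that path.
val conf = new SparkConf()
  .setAppName("dns_project")
  .setMaster("yarn")
  // First entry: the application jar built locally; second entry:
  // the HDFS directory holding Spark's dependency jars. Without this,
  // YARN containers cannot find ExecutorLauncher and fail with
  // "Could not find or load main class".
  .set("spark.yarn.jars",
    "C:\\Users\\han\\Desktop\\test\\dns_project\\target\\dns_project.jar," +
    "hdfs://<namenode>:8020/spark-jars/*")

val spark = SparkSession.builder().config(conf).getOrCreate()
```

Pointing `spark.yarn.jars` at a pre-staged HDFS directory also avoids re-uploading the Spark runtime on every remote submission.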