Spark on YARN cluster mode: "No suitable driver" exception

Many posts online tell you to configure --driver-class-path, or to drop the MySQL driver jar into Spark's default classpath.

In fact, it is enough to ship the connector with --jars and set the JDBC driver option in the code:
[Screenshot 1]
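The key piece is the driver option on the JDBC data source, which tells Spark exactly which JDBC driver class to load instead of relying on DriverManager to discover it from the default classpath. A minimal Scala sketch of this (the URL, table name, and credentials are hypothetical placeholders):

import org.apache.spark.sql.SparkSession

object PreWarningJdbcExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("prewarning-jdbc-example")
      .getOrCreate()

    // Setting "driver" explicitly avoids "No suitable driver": Spark loads
    // this class itself instead of relying on DriverManager auto-discovery.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://mysql-host:3306/prewarning") // hypothetical host/db
      .option("dbtable", "warning_rules")                       // hypothetical table
      .option("user", "app_user")                                // hypothetical credentials
      .option("password", "app_password")
      .option("driver", "com.mysql.jdbc.Driver")                 // class name for mysql-connector-java 5.1.x
      .load()

    df.show(10)
    spark.stop()
  }
}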
Then submit the job. The MySQL connector is shipped via --jars together with the other dependencies, and spark.driver.userClassPathFirst=true tells the driver to prefer these user-supplied jars over Spark's own jars when loading classes:

spark2-submit \
--master yarn \
--deploy-mode cluster \
--class com.bigdata.PreWarningScalaAppV2 \
--jars /var/lib/hadoop-hdfs/converter-moshi-2.1.0.jar,/var/lib/hadoop-hdfs/fastjson-1.2.58.jar,/var/lib/hadoop-hdfs/guava-20.0.jar,/var/lib/hadoop-hdfs/influxdb-java-2.5.jar,file:/var/lib/hadoop-hdfs/kafka-clients-2.0.0.jar,file:/var/lib/hadoop-hdfs/logging-interceptor-3.5.0.jar,file:/var/lib/hadoop-hdfs/moshi-1.2.0.jar,file:/var/lib/hadoop-hdfs/okhttp-3.5.0.jar,file:/var/lib/hadoop-hdfs/okio-1.11.0.jar,file:/var/lib/hadoop-hdfs/retrofit-2.1.0.jar,file:/var/lib/hadoop-hdfs/spark-streaming-kafka-0-10_2.11-2.4.4.jar,file:/var/lib/hadoop-hdfs/mysql-connector-java-5.1.48.jar \
--conf "spark.driver.userClassPathFirst=true" \
/var/lib/hadoop-hdfs/prewarning-1.0.jar
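If the application opens connections through java.sql.DriverManager directly rather than Spark's JDBC data source, the same exception usually means the driver class was never registered. A minimal sketch, assuming hypothetical connection details, that forces registration before connecting:

import java.sql.DriverManager

object MysqlConnectionCheck {
  def main(args: Array[String]): Unit = {
    // Loading the class makes the driver register itself with DriverManager,
    // so getConnection can resolve the jdbc:mysql:// URL.
    Class.forName("com.mysql.jdbc.Driver")

    val conn = DriverManager.getConnection(
      "jdbc:mysql://mysql-host:3306/prewarning", // hypothetical host/db
      "app_user",                                // hypothetical credentials
      "app_password")
    try {
      val rs = conn.createStatement().executeQuery("SELECT 1")
      while (rs.next()) println(rs.getInt(1))
    } finally {
      conn.close()
    }
  }
}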

The job then runs without the exception:

[Screenshot 2]
The application keeps running normally:
[Screenshot 3]
