[Spark on K8s] Deployment Issue

Deploying a Spark job on Kubernetes fails with an error.

Deploying the official example fails:

```
21/07/14 02:19:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/07/14 02:19:15 WARN DependencyUtils: Local jar /opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar does not exist, skipping.

Error: Failed to load class org.apache.spark.examples.SparkPi.

21/07/14 02:19:15 INFO ShutdownHookManager: Shutdown hook called
21/07/14 02:19:15 INFO ShutdownHookManager: Deleting directory /tmp/spark-531eef47-e7c2-4632-9232-e45f68ac0396
```


Copy the files out of the container (or list them in place) to check the actual jar path inside the image.
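A quick way to inspect the image contents is with `kubectl`; the pod name below (`spark-pi-driver`) is illustrative, substitute the name of your own driver pod. These commands require access to a running cluster:

```shell
# List the example jars actually shipped inside the Spark image
kubectl exec spark-pi-driver -- ls /opt/spark/examples/jars/

# Or copy the whole directory out of the container for inspection
kubectl cp spark-pi-driver:/opt/spark/examples/jars ./jars
```

Either way, the output reveals which version of `spark-examples_2.12-*.jar` the image really contains, which can then be compared against the path in the YAML.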


Comparing against the path in the official YAML shows that the jar name is wrong (the version differs: the YAML references 3.1.1, while the image ships 3.0.0) and the path is wrong.

Changing `mainApplicationFile:` to `/opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar` fixes the problem.
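As a sketch, the relevant portion of the SparkApplication manifest would look like the fragment below. This assumes the spark-operator `SparkApplication` CRD used by the official examples; the metadata name is illustrative, and only `mainApplicationFile` is the actual fix from this post:

```yaml
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi          # illustrative name
spec:
  mainClass: org.apache.spark.examples.SparkPi
  # Must match the jar actually present in the image (3.0.0, not 3.1.1):
  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar"
```

The `local://` scheme tells Spark the jar is already inside the container image rather than fetched from a remote location, so the path after it must exist in the image exactly as written.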
