Spark error: org.apache.spark.memory.TaskMemoryManager - Failed to allocate a page (67108864 bytes), try again.

Error message

17510 [Executor task launch worker for task 2.0 in stage 1.0 (TID 3)] WARN  org.apache.spark.memory.TaskMemoryManager  - Failed to allocate a page (67108864 bytes), try again.
17195 [Executor task launch worker for task 4.0 in stage 1.0 (TID 5)] WARN  org.apache.spark.memory.TaskMemoryManager  - Failed to allocate a page (67108864 bytes), try again.
17195 [Executor task launch worker for task 11.0 in stage 1.0 (TID 12)] WARN  org.apache.spark.memory.TaskMemoryManager  - Failed to allocate a page (67108864 bytes), try again.
16976 [Executor task launch worker for task 7.0 in stage 1.0 (TID 8)] WARN  org.apache.spark.memory.TaskMemoryManager  - Failed to allocate a page (67108864 bytes), try again.
 


Cause:

The log makes the problem clear: Spark's execution memory is insufficient, so the TaskMemoryManager cannot allocate a 64 MB (67,108,864-byte) page for the task.

Solution:

   val conf: SparkConf = new SparkConf().setMaster("local[*]").setAppName("eight")
     // The key uses dots, not underscores; a misspelled key is silently ignored.
     // 2147480000 bytes is roughly 2 GB.
     .set("spark.testing.memory", "2147480000")

Add the memory setting when creating the SparkConf. Note that the configuration key is `spark.testing.memory` (with dots); the underscore form `spark_testing_memory` is not a valid Spark property and has no effect.
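`spark.testing.memory` is an internal property that only fakes the amount of memory Spark believes it has, which is fine for local development. On a real cluster, the same warning usually means the driver or executors genuinely need more memory, which is set at submit time. A minimal sketch, assuming a YARN deployment (the class name and jar path are placeholders):

```shell
# Hypothetical submit command; adjust the memory sizes to your workload.
spark-submit \
  --class com.example.Eight \
  --master yarn \
  --driver-memory 4g \
  --executor-memory 4g \
  target/eight.jar
```

Setting `--driver-memory` on the command line matters because the driver JVM is already running by the time a `SparkConf` created in code is read, so `spark.driver.memory` set programmatically is ignored.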
