Pitfalls when setting up a Hadoop/Spark environment

1. IP mapping problem

The IP-to-hostname mappings in /etc/hosts must be correct on every node; a stale or wrong entry makes the daemons bind to, or resolve, the wrong address.
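As a minimal sketch (the hostname hadoop000 appears in the Spark error later in this post; the IP address below is an assumed placeholder, not from the original setup):

```shell
# /etc/hosts — map each cluster hostname to its node's real IP.
# 192.168.1.100 is a PLACEHOLDER; substitute the actual address
# reported by `ip addr` on that machine.
192.168.1.100   hadoop000

# Verify the name now resolves to the intended address:
getent hosts hadoop000
```

If `getent` returns 127.0.0.1 (or nothing), HDFS daemons on other nodes will not be able to reach this one.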

2. DataNode fails to start

The NameNode's clusterID must match the clusterID stored in each DataNode's data directory; after re-running `hdfs namenode -format` they diverge, and the DataNode refuses to start.
Walkthrough: http://dblab.xmu.edu.cn/blog/818-2/
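A common but destructive fix, assuming a single-node test setup where losing the HDFS contents is acceptable; the data-directory path below is an assumption, so check `dfs.datanode.data.dir` in hdfs-site.xml (or `hadoop.tmp.dir`) for your actual location:

```shell
# Stop HDFS before touching the storage directories
stop-dfs.sh

# Remove the stale DataNode storage that holds the old clusterID.
# ASSUMED path — verify it against hdfs-site.xml first.
# WARNING: this deletes all data stored in HDFS on this node.
rm -rf /usr/local/hadoop/tmp/dfs/data

# Reformat the NameNode so both sides agree on a fresh clusterID
hdfs namenode -format

# Restart and confirm a DataNode process is listed
start-dfs.sh
jps
```

A gentler alternative is to copy the NameNode's clusterID into the DataNode's `current/VERSION` file instead of deleting the directory, which preserves the stored blocks.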

3. Spark fails to start

Error message:
18/10/30 20:26:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/10/30 20:27:11 ERROR spark.SparkContext: Error initializing SparkContext.
java.io.FileNotFoundException: File does not exist: hdfs://hadoop000:8020/directory

Cause: the HDFS directory hdfs://hadoop000:8020/directory does not exist. Spark is configured to write logs there (typically via `spark.eventLog.dir` or the history server's log directory), so the directory has to be created before SparkContext can initialize.

Fix: hadoop fs -mkdir /directory (note the plain ASCII hyphen; the en dash "–" that web pages often substitute makes the command fail).
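Equivalently, with parent creation and a quick check (hadoop000:8020 is the fs.defaultFS taken from the error above, so plain paths already resolve against it):

```shell
# -p creates any missing parents and succeeds even if the
# directory already exists, so the command is safe to re-run
hadoop fs -mkdir -p /directory

# Confirm the directory is now visible in HDFS
hadoop fs -ls /
```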
