Problems Encountered Installing Hadoop with Ambari (Part 2)

5. Spark fails to start: the Spark Thrift Server exits with an exception

Symptom: the JVM fails to start, complaining that the -Xloggc option is misconfigured.

Fix: in the Ambari web UI, open the affected component's Configs -> Advanced env section and remove the -Xloggc setting.
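As a rough sketch of what to look for (the variable name and log path here are hypothetical and vary by stack version), the offending fragment typically sits inside a JVM-options line of the env template, for example:

export SPARK_DAEMON_JAVA_OPTS="-XX:+UseG1GC -Xloggc:/var/log/spark/gc.log"

Deleting just the -Xloggc:... token, saving the config, and restarting the component through Ambari is enough; the remaining options can stay as they are.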

6. Failed to connect to Hive

ExecutionFailed: Execution of '! beeline -u 'jdbc:hive2://c5-ibmpower-sv011:10015/default'
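This error appears to come from Ambari's service check. To narrow it down, try the same connection by hand (host and port copied from the message above; 10015 is the Spark Thrift Server's HiveServer2-compatible port in this setup):

beeline -u 'jdbc:hive2://c5-ibmpower-sv011:10015/default' -e 'show databases;'

netstat -tlnp | grep 10015

If nothing is listening on the port, the Thrift Server itself is down; see item 5 above.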

7. Clock synchronization

Add a cron job that syncs the clock against the NTP server every 10 minutes:

crontab -e

*/10 * * * * /usr/sbin/ntpdate 10.19.244.52
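Before relying on the cron job, it is worth checking that the NTP server is reachable; ntpdate can query the offset without touching the clock (10.19.244.52 is the server from the entry above):

/usr/sbin/ntpdate -q 10.19.244.52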

8. HDFS NameNode stuck in safe mode

The reported blocks 0 needs additional 42 blocks to reach the threshold 0.9900 of total blocks 42.

The number of live datanodes 3 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.

Step 1: leave safe mode manually.

First switch to the hdfs user: su - hdfs

Then run: hdfs dfsadmin -safemode leave

Step 2: run a filesystem health check and delete the corrupt blocks: hdfs fsck / -delete
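To confirm the state before and after, the standard status query helps (run as the hdfs user, like the commands above):

hdfs dfsadmin -safemode get

It prints "Safe mode is ON" or "Safe mode is OFF"; run it again after the fsck to make sure the cluster stays out of safe mode.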

9. Permission denied when checking HDFS as root

Permission denied: user=root, access=READ_EXECUTE, inode="/app-logs/hive":hive:hadoop:drwxrwx---

Fix: run fsck as the hdfs superuser instead of root: sudo -u hdfs hdfs fsck /
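The drwxrwx--- mode in the message means only the hive user and the hadoop group may enter /app-logs/hive, which is why root is rejected; hdfs succeeds because it is the HDFS superuser. To inspect the ownership directly (a plain HDFS listing, nothing cluster-specific):

sudo -u hdfs hdfs dfs -ls -d /app-logs/hive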

10. HBase file layout version mismatch

org.apache.hadoop.hbase.util.FileSystemVersionException: HBase file layout needs to be upgraded. You have version null and I want version 8. Consult http://hbase.apache.org/book.html for further information about upgrading HBase. Is your hbase.rootdir valid? If so, you may need to run 'hbase hbck -fixVersionFile'.

Fix: delete the data under the apps/hbase/data directory (the hbase.rootdir in HDFS) and restart HBase. Note that this wipes all existing HBase tables, so it is only acceptable on a fresh cluster.
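Before wiping anything, the exception itself points to a gentler first attempt: regenerating the hbase.version file in place. A sketch of both options, assuming the rootdir is /apps/hbase/data (the Ambari default, matching the path above):

sudo -u hbase hbase hbck -fixVersionFile
(regenerates the version file, as the exception message suggests)

sudo -u hdfs hdfs dfs -rm -r /apps/hbase/data/*
(the destructive fallback: empties the rootdir so HBase rebuilds it on restart)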
