DBus installation notes

1. In dbus-heartbeat-0.4.0/conf/stat_config.properties, the default value of influxdb.url is localhost; change it to the real InfluxDB address.
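A quick way to apply the change from the shell (the address http://influxdb-host:8086 below is only a placeholder; substitute your own InfluxDB URL):

# Point the heartbeat module at the real InfluxDB instance
sed -i 's#^influxdb.url=.*#influxdb.url=http://influxdb-host:8086#' dbus-heartbeat-0.4.0/conf/stat_config.properties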

2. The Logstash verification command given at https://bridata.github.io/DBus/install-logstash-source.html, bin/kafka-console-consumer.sh --zookeeper dbus-n1:2181,dbus-n2:2181,dbus-n3:2181/kafka --topic heartbeat_log_logstash, is wrong.

The correct command is bin/kafka-console-consumer.sh --zookeeper localhost:2181/kafka --topic heartbeat_log
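If the Kafka version in your environment no longer accepts the --zookeeper flag for the console consumer, an equivalent check is the following (broker address is an assumption; adjust to your setup):

# Consume the heartbeat topic directly from the broker instead of via ZooKeeper
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic heartbeat_log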

3. If dbus-heartbeat-0.4.0/logs/heartbeat/heartbeat.log shows org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /DBus/HeartBeat/Monitor/testlog/testlog_schema/t_dbus_heartbeat/bd-dev-framework-211, it means the bd-dev-framework-211 node has not been created in ZooKeeper. You can create it through zk manage in the web admin console, or from the command line as shown below.
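A command-line alternative using the ZooKeeper CLI, assuming ZooKeeper listens on localhost:2181 and the parent path from the error message already exists:

# Create the missing node with an empty value
bin/zkCli.sh -server localhost:2181 create /DBus/HeartBeat/Monitor/testlog/testlog_schema/t_dbus_heartbeat/bd-dev-framework-211 ""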

4. Because of a bug in the web configuration, the root user you enter is turned into the app user, so you need to create an app account on the server and set up passwordless SSH between the root and app accounts.


Log in as root and run ssh-keygen

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

scp ~/.ssh/id_rsa.pub /home/app/.ssh/id_rsa.pub.root

Log in as app and run cat ~/.ssh/id_rsa.pub.root >> ~/.ssh/authorized_keys  # if authorized_keys does not exist, create it with touch, then chmod 700 ~/.ssh and chmod 600 ~/.ssh/authorized_keys

Log in again from the first account to the second (root to app here): the first time a password may be required, after that you can log in directly without one.
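A quick way to confirm the trust is in place (assuming both accounts live on the same host):

# Run from root: should print "app" without prompting for a password
ssh app@localhost 'whoami'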


5. Submitting the Storm topology fails with the following error:

2018-02-28 18:27:19:999 - error: /app/dbus-allinone/distribution-0.4.0-bin/manager/lib/service/start-topology-service.js[37] - startTopo err: Error: Command failed: ssh -p 22 root@localhost 'cd /app/dbus-allinone/apache-storm-1.0.1//dbus_jars; ./dbus_startTopology.sh /app/dbus-allinone/apache-storm-1.0.1/ log-processor localhost:2181 heartbeat_log 0.4.x/log_processor/20180123_201400/dbus-log-processor-0.4.0-jar-with-dependencies.jar'
Traceback (most recent call last):
  File "/app/dbus-allinone/apache-storm-1.0.1//bin/storm.py", line 766, in
    main()
  File "/app/dbus-allinone/apache-storm-1.0.1//bin/storm.py", line 763, in main
    (COMMANDS.get(COMMAND, unknown_command))(*ARGS)
  File "/app/dbus-allinone/apache-storm-1.0.1//bin/storm.py", line 234, in jar
    transform_class = confvalue("client.jartransformer.class", [CLUSTER_CONF_DIR])
  File "/app/dbus-allinone/apache-storm-1.0.1//bin/storm.py", line 144, in confvalue
    p = sub.Popen(command, stdout=sub.PIPE)
  File "/usr/lib64/python2.6/subprocess.py", line 642, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.6/subprocess.py", line 1238, in _execute_child
    raise child_exception

OSError: [Errno 2] No such file or directory


The root cause is that the non-interactive ssh -p 22 root@localhost session does not carry over the JAVA_HOME environment variable, so the storm script cannot locate the java executable when it spawns the confvalue subprocess.
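A quick check that confirms this (the command only inspects the environment the remote command actually sees):

# Prints an empty line if JAVA_HOME is not set for non-interactive ssh sessions
ssh -p 22 root@localhost 'echo $JAVA_HOME'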

The fix is to add the Java environment variables at the very top of the /apache-storm-1.0.1/bin/storm script:
export JAVA_HOME=/opt/programs/jdk1.8.0_152
export JRE_HOME=/opt/programs/jdk1.8.0_152/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin

6. kafka-manager is only distributed online as source packages and needs to be built with sbt.
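A rough build sketch (the repository URL and output path below are assumptions and may differ for your version):

# Clone and package kafka-manager with its bundled sbt launcher
git clone https://github.com/yahoo/kafka-manager.git
cd kafka-manager
./sbt clean dist
# the deployable zip is produced under target/universal/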

