Installing Storm 1.2.2 on CentOS 7 (with Example)

Prerequisites:

  1. Install JDK 1.8
  2. Install ZooKeeper: https://www.jianshu.com/p/f9a2defdd04f
  3. Download Storm 1.2.2: https://mirrors.tuna.tsinghua.edu.cn/apache/storm/apache-storm-1.2.2/apache-storm-1.2.2.tar.gz
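
If you prefer to download directly on the server, wget the tarball from the mirror listed above (any Apache mirror works; the path below is simply the one from the prerequisites):

[hadoop@hadoop131 software]$ wget https://mirrors.tuna.tsinghua.edu.cn/apache/storm/apache-storm-1.2.2/apache-storm-1.2.2.tar.gz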

1. Extract the installation package

[hadoop@hadoop131 software]$ tar zxvf apache-storm-1.2.2.tar.gz -C ../bigdata/hadoop/
[hadoop@hadoop131 software]$ cd /opt/bigdata/hadoop/
[hadoop@hadoop131 hadoop]$ ls
apache-storm-1.2.2  hive-1.2.1    oozie-4.0.0-cdh5.3.6  sqoop-1.4.6
hadoop-2.7.3        hive_bak      scala-2.10.6          zookeeper-3.4.10
hbase-1.3.1         jdk1.8.0_191  spark-2.4.0           zookeeper.out
[hadoop@hadoop131 hadoop]$ mv apache-storm-1.2.2/ storm-1.2.2

2. Configure the environment variables

[hadoop@hadoop131 hadoop]$ sudo vim /etc/profile
[sudo] password for hadoop:

Append the following to /etc/profile:

#STORM_HOME
export STORM_HOME=/opt/bigdata/hadoop/storm-1.2.2
export PATH=$STORM_HOME/bin:$PATH

Then reload it:

[hadoop@hadoop131 hadoop]$ source /etc/profile
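
To confirm the new PATH entry is picked up, the storm command should now resolve from any directory, e.g.:

[hadoop@hadoop131 hadoop]$ storm version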

3. Edit the configuration file

[hadoop@hadoop131 hadoop]$ cd storm-1.2.2/
[hadoop@hadoop131 storm-1.2.2]$ cd conf
[hadoop@hadoop131 conf]$ ls
storm_env.ini  storm-env.ps1  storm-env.sh  storm.yaml
[hadoop@hadoop131 conf]$ vim storm.yaml 

Do not use tabs in place of spaces, and do not leave out the space after each leading hyphen (a quick check for this follows after this step).

storm.zookeeper.servers:
    - "hadoop131"
    - "hadoop132"
    - "hadoop133"

nimbus.seeds: ["hadoop131"]
storm.local.dir: "/opt/bigdata/hadoop/storm-1.2.2/local"
supervisor.slots.ports:
    - 6700
    - 6701
    - 6702
    - 6703
[hadoop@hadoop131 conf]$ cd ..
[hadoop@hadoop131 storm-1.2.2]$ mkdir local
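
A quick way to catch the whitespace problems warned about above is to grep the file for tab characters; no output means the file is clean:

[hadoop@hadoop131 storm-1.2.2]$ grep -nP '\t' conf/storm.yaml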

4. Distribute the configuration to the other nodes (if xsync is not available, use scp instead; see the sketch below)

[hadoop@hadoop131 storm-1.2.2]$ cd ..
[hadoop@hadoop131 hadoop]$ xsync storm-1.2.2/
[hadoop@hadoop131 hadoop]$ su
[root@hadoop131 hadoop]# xsync /etc/profile
[root@hadoop131 hadoop]# exit

After /etc/profile has been updated on each node, run: source /etc/profile
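
If xsync is not installed, a plain scp loop covers the same step. A minimal sketch, assuming password-less ssh for the hadoop user and the same directory layout on hadoop132 and hadoop133:

for host in hadoop132 hadoop133; do
    scp -r /opt/bigdata/hadoop/storm-1.2.2 hadoop@$host:/opt/bigdata/hadoop/
done
# /etc/profile is owned by root, so copy it as root or append the two export lines by hand on each node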

5. Start the cluster

a. Start ZooKeeper on every node

[hadoop@hadoop131 software]$ /opt/bigdata/hadoop/zookeeper-3.4.10/bin/zkServer.sh start
[hadoop@hadoop132 software]$ /opt/bigdata/hadoop/zookeeper-3.4.10/bin/zkServer.sh start
[hadoop@hadoop133 software]$ /opt/bigdata/hadoop/zookeeper-3.4.10/bin/zkServer.sh start
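
Before starting Storm it is worth confirming the ensemble is healthy; run this on each node and expect one leader and two followers:

[hadoop@hadoop131 software]$ /opt/bigdata/hadoop/zookeeper-3.4.10/bin/zkServer.sh status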

b. Start Nimbus and the UI on hadoop131 (the master)

[hadoop@hadoop131 software]$ storm nimbus &
[hadoop@hadoop131 software]$ storm ui &

After a short wait, open hadoop131:8080 in a browser to see the web UI.
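
You can also sanity-check from the shell with jps. Assumption: on Storm 1.x the Nimbus and UI JVMs show up under the names nimbus and core, which is also what the stop-all script further down matches with ps/grep:

[hadoop@hadoop131 software]$ jps | grep -Ei 'nimbus|core'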


c. Start the supervisor on every node

[hadoop@hadoop131 software]$ storm supervisor > /dev/null 2>&1 &
[hadoop@hadoop132 software]$ storm supervisor > /dev/null 2>&1 &
[hadoop@hadoop133 software]$ storm supervisor > /dev/null 2>&1 &

Refresh the web UI; once all three supervisors are listed, the cluster is up.


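As a quick end-to-end test, you can submit the WordCountTopology that ships with the binary distribution. The examples/storm-starter jar path below is an assumption based on the standard 1.2.2 tarball layout; adjust it to whatever ls shows on your install:

[hadoop@hadoop131 storm-1.2.2]$ storm jar examples/storm-starter/storm-starter-topologies-1.2.2.jar org.apache.storm.starter.WordCountTopology wordcount
[hadoop@hadoop131 storm-1.2.2]$ storm list        # the topology should be listed as ACTIVE
[hadoop@hadoop131 storm-1.2.2]$ storm kill wordcount   # remove it when you are done

The topology also appears on the web UI while it is running.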

=======================================================
Starting every node by hand is tedious, so here are shell scripts for starting and stopping the whole cluster at once.
start_supervisor.sh

#!/bin/bash
. /etc/profile
# start the supervisor in the background
nohup storm supervisor >/opt/bigdata/hadoop/storm-1.2.2/supervisor_log 2>&1 &

stop_supervisor.sh

#!/bin/bash
# exclude the grep process itself so a stale PID is not passed to kill
kill -9 `ps -ef | grep daemon.supervisor | grep -v grep | awk '{print $2}'`
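
An equivalent, slightly cleaner one-liner is pkill with a full command-line match (pkill ships with procps on CentOS 7):

pkill -9 -f daemon.supervisor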

storm_start_all.sh

#!/bin/bash

# chkconfig: 2345 10 90
# description: storm cluster start

# load PATH (STORM_HOME) so the storm command resolves, same as in the stop script
. /etc/profile

stormhome=/opt/bigdata/hadoop/storm-1.2.2
stormbin=$stormhome/bin
shell_home=/usr/local/bin

nohup storm nimbus >$stormhome/nimbus.log 2>&1 &
nohup storm ui >$stormhome/ui.log 2>&1 &
sleep 5
jps

echo "start storm......"
for i in {1..3};do
   ssh hadoop13$i "$shell_home/start_supervisor.sh;hostname;sleep 5;jps;exit"
   echo "-----------------"
done
echo "**************************"
echo "start storm finished !"

storm_stop_all.sh

#!/bin/bash
. /etc/profile
# Storm home and helper-script locations

stormhome=/opt/bigdata/hadoop/storm-1.2.2
shell_home=/usr/local/bin

echo "stop storm cluster ........"
kill -9 `ps -ef | grep daemon.nimbus | grep -v grep | awk '{print $2}'`
kill -9 `ps -ef | grep ui.core | grep -v grep | awk '{print $2}'`
sleep 3
jps
echo "----------------------"

for i in {1..3};do
   ssh hadoop13$i "$shell_home/stop_supervisor.sh;jps;exit"
   echo "--------------------------"
done

echo "stop storm cluster finished !"
echo "******************************"

Distribute these scripts to every node, then run the start-all/stop-all scripts from the master node.
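
A minimal way to put them in place, assuming the scripts were saved in the current directory and /usr/local/bin is the shell_home used above:

# on every node, as root (or via sudo):
cp start_supervisor.sh stop_supervisor.sh /usr/local/bin/
chmod +x /usr/local/bin/start_supervisor.sh /usr/local/bin/stop_supervisor.sh
# on the master (hadoop131) only:
cp storm_start_all.sh storm_stop_all.sh /usr/local/bin/
chmod +x /usr/local/bin/storm_start_all.sh /usr/local/bin/storm_stop_all.sh
storm_start_all.sh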


=======================================================
Fixing the "/bin/bash^M: bad interpreter: No such file or directory" error when running a build script on Linux
Cause: Linux (LF) and Windows (CRLF) use different line endings.

sed -i 's/\r$//' build.sh

This strips the trailing \r (carriage return) from every line of build.sh.
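
To see whether a file really has Windows line endings before and after the fix:

file build.sh            # reports "... with CRLF line terminators" if \r is present
cat -A build.sh | head   # CRLF lines end in ^M$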

=======================================================
bash: jps: command not found
Cause: jps is not a shell built-in; it ships with the JDK. Create a symlink to it under /usr/local/bin:
ln -s /opt/bigdata/hadoop/jdk1.8.0_191/bin/jps /usr/local/bin/jps
