Choosing a Storm version
Here I use apache-storm-0.9.6.tar.gz.
Storm local-mode installation
Local mode simulates all the functionality of a Storm cluster inside a single process, which is very convenient for development and testing. Running a topology in local mode is similar to running it on a real cluster.
To create an in-process "cluster", just use a LocalCluster object:
import backtype.storm.LocalCluster;
LocalCluster cluster = new LocalCluster();
You can then submit topologies with the LocalCluster object's submitTopology method, which behaves the same as the corresponding StormSubmitter method. submitTopology takes three arguments: the name of the topology, the topology configuration, and the topology object itself. You can terminate a topology with killTopology, which takes the topology name as its argument.
To shut down a local cluster, simply call:
cluster.shutdown();
and you are done.
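To make the API described above concrete, here is a minimal local-mode sketch. It assumes storm-core 0.9.6 on the classpath; TestWordSpout is a sample spout that ships in storm-core's backtype.storm.testing package, PrinterBolt is a throwaway bolt written only for this example, and the topology name "local-demo" is arbitrary.

import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.testing.TestWordSpout;
import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.tuple.Tuple;

public class LocalModeExample {

    // A throwaway bolt that just prints every tuple it receives.
    public static class PrinterBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            System.out.println(tuple);
        }
        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            // emits nothing downstream
        }
    }

    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("words", new TestWordSpout(), 1);
        builder.setBolt("printer", new PrinterBolt(), 2).shuffleGrouping("words");

        Config conf = new Config();
        conf.setDebug(true);

        // Simulate an entire Storm cluster inside this JVM.
        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("local-demo", conf, builder.createTopology());

        // Let it run for a while, then kill the topology and shut the "cluster" down.
        Thread.sleep(10000);
        cluster.killTopology("local-demo");
        cluster.shutdown();
    }
}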
Storm distributed-mode installation (this post)
Official installation documentation:
http://storm.apache.org/releases/current/Setting-up-a-Storm-cluster.html
Machine setup: download the Storm package into /home/hadoop/app on each of the master, slave1, and slave2 machines.
1. Download apache-storm-0.9.6.tar.gz
http://archive.apache.org/dist/storm/apache-storm-0.9.6/
Or download it online, directly in the installation directory:
wget http://apache.fayea.com/storm/apache-storm-0.9.6/apache-storm-0.9.6.tar.gz
Here I chose to download the package first and then upload it to the servers.
2. Upload the archive
[hadoop@master ~]$ cd app/
[hadoop@master app]$ ll
total 60
drwxrwxr-x 5 hadoop hadoop 4096 May 1 15:21 azkaban
drwxrwxr-x 7 hadoop hadoop 4096 Apr 21 15:43 elasticsearch-2.4.0
drwxrwxr-x 6 hadoop hadoop 4096 Apr 21 12:12 elasticsearch-2.4.3
lrwxrwxrwx 1 hadoop hadoop 20 Apr 21 15:00 es -> elasticsearch-2.4.0/
lrwxrwxrwx 1 hadoop hadoop 11 Apr 20 12:19 flume -> flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:17 flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:00 flume-1.7.0
lrwxrwxrwx. 1 hadoop hadoop 12 Apr 12 11:27 hadoop -> hadoop-2.6.0
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 16:33 hadoop-2.6.0
lrwxrwxrwx. 1 hadoop hadoop 13 Apr 12 11:28 hbase -> hbase-0.98.19
drwxrwxr-x. 8 hadoop hadoop 4096 Apr 12 17:27 hbase-0.98.19
lrwxrwxrwx. 1 hadoop hadoop 10 Apr 12 11:28 hive -> hive-1.0.0
drwxrwxr-x. 8 hadoop hadoop 4096 May 14 14:08 hive-1.0.0
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 10:18 jdk -> jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Apr 11 2015 jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Aug 5 2015 jdk1.8.0_60
lrwxrwxrwx 1 hadoop hadoop 18 May 3 21:41 kafka -> kafka_2.11-0.8.2.2
drwxr-xr-x 6 hadoop hadoop 4096 May 3 22:01 kafka_2.11-0.8.2.2
lrwxrwxrwx 1 hadoop hadoop 26 Apr 21 22:18 kibana -> kibana-4.6.3-linux-x86_64/
drwxrwxr-x 11 hadoop hadoop 4096 Nov 4 2016 kibana-4.6.3-linux-x86_64
lrwxrwxrwx 1 hadoop hadoop 12 May 1 19:35 snappy -> snappy-1.1.3
drwxr-xr-x 6 hadoop hadoop 4096 May 1 19:40 snappy-1.1.3
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 11:28 sqoop -> sqoop-1.4.6
drwxr-xr-x. 9 hadoop hadoop 4096 May 19 10:31 sqoop-1.4.6
lrwxrwxrwx. 1 hadoop hadoop 15 Apr 12 11:28 zookeeper -> zookeeper-3.4.6
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 17:13 zookeeper-3.4.6
[hadoop@master app]$ rz
[hadoop@master app]$ ll
total 20580
-rw-r--r-- 1 hadoop hadoop 21010966 May 18 12:16 apache-storm-0.9.6.tar.gz
drwxrwxr-x 5 hadoop hadoop 4096 May 1 15:21 azkaban
drwxrwxr-x 7 hadoop hadoop 4096 Apr 21 15:43 elasticsearch-2.4.0
drwxrwxr-x 6 hadoop hadoop 4096 Apr 21 12:12 elasticsearch-2.4.3
lrwxrwxrwx 1 hadoop hadoop 20 Apr 21 15:00 es -> elasticsearch-2.4.0/
lrwxrwxrwx 1 hadoop hadoop 11 Apr 20 12:19 flume -> flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:17 flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:00 flume-1.7.0
lrwxrwxrwx. 1 hadoop hadoop 12 Apr 12 11:27 hadoop -> hadoop-2.6.0
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 16:33 hadoop-2.6.0
lrwxrwxrwx. 1 hadoop hadoop 13 Apr 12 11:28 hbase -> hbase-0.98.19
drwxrwxr-x. 8 hadoop hadoop 4096 Apr 12 17:27 hbase-0.98.19
lrwxrwxrwx. 1 hadoop hadoop 10 Apr 12 11:28 hive -> hive-1.0.0
drwxrwxr-x. 8 hadoop hadoop 4096 May 14 14:08 hive-1.0.0
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 10:18 jdk -> jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Apr 11 2015 jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Aug 5 2015 jdk1.8.0_60
lrwxrwxrwx 1 hadoop hadoop 18 May 3 21:41 kafka -> kafka_2.11-0.8.2.2
drwxr-xr-x 6 hadoop hadoop 4096 May 3 22:01 kafka_2.11-0.8.2.2
lrwxrwxrwx 1 hadoop hadoop 26 Apr 21 22:18 kibana -> kibana-4.6.3-linux-x86_64/
drwxrwxr-x 11 hadoop hadoop 4096 Nov 4 2016 kibana-4.6.3-linux-x86_64
lrwxrwxrwx 1 hadoop hadoop 12 May 1 19:35 snappy -> snappy-1.1.3
drwxr-xr-x 6 hadoop hadoop 4096 May 1 19:40 snappy-1.1.3
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 11:28 sqoop -> sqoop-1.4.6
drwxr-xr-x. 9 hadoop hadoop 4096 May 19 10:31 sqoop-1.4.6
lrwxrwxrwx. 1 hadoop hadoop 15 Apr 12 11:28 zookeeper -> zookeeper-3.4.6
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 17:13 zookeeper-3.4.6
[hadoop@master app]$
Do the same on slave1 and slave2; I won't repeat it here.
3. Extract the archive and assign the proper user and group ownership
[hadoop@master app]$ ll
total 20580
-rw-r--r-- 1 hadoop hadoop 21010966 May 18 12:16 apache-storm-0.9.6.tar.gz
drwxrwxr-x 5 hadoop hadoop 4096 May 1 15:21 azkaban
drwxrwxr-x 7 hadoop hadoop 4096 Apr 21 15:43 elasticsearch-2.4.0
drwxrwxr-x 6 hadoop hadoop 4096 Apr 21 12:12 elasticsearch-2.4.3
lrwxrwxrwx 1 hadoop hadoop 20 Apr 21 15:00 es -> elasticsearch-2.4.0/
lrwxrwxrwx 1 hadoop hadoop 11 Apr 20 12:19 flume -> flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:17 flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:00 flume-1.7.0
lrwxrwxrwx. 1 hadoop hadoop 12 Apr 12 11:27 hadoop -> hadoop-2.6.0
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 16:33 hadoop-2.6.0
lrwxrwxrwx. 1 hadoop hadoop 13 Apr 12 11:28 hbase -> hbase-0.98.19
drwxrwxr-x. 8 hadoop hadoop 4096 Apr 12 17:27 hbase-0.98.19
lrwxrwxrwx. 1 hadoop hadoop 10 Apr 12 11:28 hive -> hive-1.0.0
drwxrwxr-x. 8 hadoop hadoop 4096 May 14 14:08 hive-1.0.0
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 10:18 jdk -> jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Apr 11 2015 jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Aug 5 2015 jdk1.8.0_60
lrwxrwxrwx 1 hadoop hadoop 18 May 3 21:41 kafka -> kafka_2.11-0.8.2.2
drwxr-xr-x 6 hadoop hadoop 4096 May 3 22:01 kafka_2.11-0.8.2.2
lrwxrwxrwx 1 hadoop hadoop 26 Apr 21 22:18 kibana -> kibana-4.6.3-linux-x86_64/
drwxrwxr-x 11 hadoop hadoop 4096 Nov 4 2016 kibana-4.6.3-linux-x86_64
lrwxrwxrwx 1 hadoop hadoop 12 May 1 19:35 snappy -> snappy-1.1.3
drwxr-xr-x 6 hadoop hadoop 4096 May 1 19:40 snappy-1.1.3
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 11:28 sqoop -> sqoop-1.4.6
drwxr-xr-x. 9 hadoop hadoop 4096 May 19 10:31 sqoop-1.4.6
lrwxrwxrwx. 1 hadoop hadoop 15 Apr 12 11:28 zookeeper -> zookeeper-3.4.6
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 17:13 zookeeper-3.4.6
[hadoop@master app]$ tar -zxvf apache-storm-0.9.6.tar.gz
Do the same on slave1 and slave2; I won't repeat it here.
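If you prefer not to repeat the upload and extraction by hand on each slave, a small loop run from master does the same thing. This is only a sketch: it assumes passwordless SSH between the nodes as the hadoop user and that /home/hadoop/app already exists on the slaves.

# Run on master from /home/hadoop/app; assumes passwordless SSH to the slaves.
for host in slave1 slave2; do
  scp apache-storm-0.9.6.tar.gz ${host}:/home/hadoop/app/
  ssh ${host} "cd /home/hadoop/app && tar -zxvf apache-storm-0.9.6.tar.gz"
done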
4. Delete the archive and create a symlink, to make it easier to keep multiple versions side by side (see the version-switch sketch after the listing below)
Related post: Creating and deleting soft links when setting up the various big-data sub-projects (recommended by the author)
[hadoop@master app]$ ll
total 20584
drwxrwxr-x 9 hadoop hadoop 4096 May 21 13:15 apache-storm-0.9.6
-rw-r--r-- 1 hadoop hadoop 21010966 May 18 12:16 apache-storm-0.9.6.tar.gz
drwxrwxr-x 5 hadoop hadoop 4096 May 1 15:21 azkaban
drwxrwxr-x 7 hadoop hadoop 4096 Apr 21 15:43 elasticsearch-2.4.0
drwxrwxr-x 6 hadoop hadoop 4096 Apr 21 12:12 elasticsearch-2.4.3
lrwxrwxrwx 1 hadoop hadoop 20 Apr 21 15:00 es -> elasticsearch-2.4.0/
lrwxrwxrwx 1 hadoop hadoop 11 Apr 20 12:19 flume -> flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:17 flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:00 flume-1.7.0
lrwxrwxrwx. 1 hadoop hadoop 12 Apr 12 11:27 hadoop -> hadoop-2.6.0
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 16:33 hadoop-2.6.0
lrwxrwxrwx. 1 hadoop hadoop 13 Apr 12 11:28 hbase -> hbase-0.98.19
drwxrwxr-x. 8 hadoop hadoop 4096 Apr 12 17:27 hbase-0.98.19
lrwxrwxrwx. 1 hadoop hadoop 10 Apr 12 11:28 hive -> hive-1.0.0
drwxrwxr-x. 8 hadoop hadoop 4096 May 14 14:08 hive-1.0.0
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 10:18 jdk -> jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Apr 11 2015 jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Aug 5 2015 jdk1.8.0_60
lrwxrwxrwx 1 hadoop hadoop 18 May 3 21:41 kafka -> kafka_2.11-0.8.2.2
drwxr-xr-x 6 hadoop hadoop 4096 May 3 22:01 kafka_2.11-0.8.2.2
lrwxrwxrwx 1 hadoop hadoop 26 Apr 21 22:18 kibana -> kibana-4.6.3-linux-x86_64/
drwxrwxr-x 11 hadoop hadoop 4096 Nov 4 2016 kibana-4.6.3-linux-x86_64
lrwxrwxrwx 1 hadoop hadoop 12 May 1 19:35 snappy -> snappy-1.1.3
drwxr-xr-x 6 hadoop hadoop 4096 May 1 19:40 snappy-1.1.3
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 11:28 sqoop -> sqoop-1.4.6
drwxr-xr-x. 9 hadoop hadoop 4096 May 19 10:31 sqoop-1.4.6
lrwxrwxrwx. 1 hadoop hadoop 15 Apr 12 11:28 zookeeper -> zookeeper-3.4.6
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 17:13 zookeeper-3.4.6
[hadoop@master app]$ rm apache-storm-0.9.6.tar.gz
[hadoop@master app]$ ln -s apache-storm-0.9.6/ storm
[hadoop@master app]$ ll
total 64
drwxrwxr-x 9 hadoop hadoop 4096 May 21 13:15 apache-storm-0.9.6
drwxrwxr-x 5 hadoop hadoop 4096 May 1 15:21 azkaban
drwxrwxr-x 7 hadoop hadoop 4096 Apr 21 15:43 elasticsearch-2.4.0
drwxrwxr-x 6 hadoop hadoop 4096 Apr 21 12:12 elasticsearch-2.4.3
lrwxrwxrwx 1 hadoop hadoop 20 Apr 21 15:00 es -> elasticsearch-2.4.0/
lrwxrwxrwx 1 hadoop hadoop 11 Apr 20 12:19 flume -> flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:17 flume-1.6.0
drwxrwxr-x 7 hadoop hadoop 4096 Apr 20 12:00 flume-1.7.0
lrwxrwxrwx. 1 hadoop hadoop 12 Apr 12 11:27 hadoop -> hadoop-2.6.0
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 16:33 hadoop-2.6.0
lrwxrwxrwx. 1 hadoop hadoop 13 Apr 12 11:28 hbase -> hbase-0.98.19
drwxrwxr-x. 8 hadoop hadoop 4096 Apr 12 17:27 hbase-0.98.19
lrwxrwxrwx. 1 hadoop hadoop 10 Apr 12 11:28 hive -> hive-1.0.0
drwxrwxr-x. 8 hadoop hadoop 4096 May 14 14:08 hive-1.0.0
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 10:18 jdk -> jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Apr 11 2015 jdk1.7.0_79
drwxr-xr-x. 8 hadoop hadoop 4096 Aug 5 2015 jdk1.8.0_60
lrwxrwxrwx 1 hadoop hadoop 18 May 3 21:41 kafka -> kafka_2.11-0.8.2.2
drwxr-xr-x 6 hadoop hadoop 4096 May 3 22:01 kafka_2.11-0.8.2.2
lrwxrwxrwx 1 hadoop hadoop 26 Apr 21 22:18 kibana -> kibana-4.6.3-linux-x86_64/
drwxrwxr-x 11 hadoop hadoop 4096 Nov 4 2016 kibana-4.6.3-linux-x86_64
lrwxrwxrwx 1 hadoop hadoop 12 May 1 19:35 snappy -> snappy-1.1.3
drwxr-xr-x 6 hadoop hadoop 4096 May 1 19:40 snappy-1.1.3
lrwxrwxrwx. 1 hadoop hadoop 11 Apr 12 11:28 sqoop -> sqoop-1.4.6
drwxr-xr-x. 9 hadoop hadoop 4096 May 19 10:31 sqoop-1.4.6
lrwxrwxrwx 1 hadoop hadoop 19 May 21 13:17 storm -> apache-storm-0.9.6/
lrwxrwxrwx. 1 hadoop hadoop 15 Apr 12 11:28 zookeeper -> zookeeper-3.4.6
drwxr-xr-x. 10 hadoop hadoop 4096 Apr 12 17:13 zookeeper-3.4.6
[hadoop@master app]$
Do the same on slave1 and slave2; I won't repeat it here.
5. Configure the environment variables
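The point of the storm symlink is that a later upgrade only needs the link re-pointed; /etc/profile and any scripts keep referring to /home/hadoop/app/storm. A sketch of what such a switch would look like (apache-storm-0.9.7 is purely a hypothetical newer release):

# Hypothetical upgrade: unpack the new release next to the old one, then re-point the link.
cd /home/hadoop/app
tar -zxvf apache-storm-0.9.7.tar.gz   # hypothetical newer version
ln -sfn apache-storm-0.9.7/ storm     # -f replaces the old link, -n treats 'storm' as a link rather than a directory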
[hadoop@master app]$ su root
Password:
[root@master app]# vim /etc/profile
Do the same on slave1 and slave2; I won't repeat it here.
#storm
export STORM_HOME=/home/hadoop/app/storm
export PATH=$PATH:$STORM_HOME/bin
Do the same on slave1 and slave2; I won't repeat it here.
[hadoop@master app]$ su root
Password:
[root@master app]# vim /etc/profile
[root@master app]# source /etc/profile
[root@master app]#
Do the same on slave1 and slave2; I won't repeat it here.
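After sourcing /etc/profile, a quick sanity check on each node confirms the variables took effect before moving on; this step is optional and only uses standard shell commands.

# Verify the Storm environment variables on each node.
echo $STORM_HOME   # should print /home/hadoop/app/storm
which storm        # should resolve to /home/hadoop/app/storm/bin/storm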
6. Prepare the other software the Storm cluster needs
Storm's scripts need Python; since my machines run CentOS 6.5, it is already bundled:
[hadoop@master ~]$ python
Python 2.6.6 (r266:84292, Nov 22 2013, 12:16:22)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>>
7. Edit Storm's configuration file
[hadoop@master storm]$ pwd
/home/hadoop/app/storm
[hadoop@master storm]$ ll
total 124
drwxrwxr-x 2 hadoop hadoop 4096 May 21 13:15 bin
-rw-r--r-- 1 hadoop hadoop 42516 Oct 29 2015 CHANGELOG.md
drwxrwxr-x 2 hadoop hadoop 4096 May 21 13:15 conf
-rw-r--r-- 1 hadoop hadoop 538 Oct 29 2015 DISCLAIMER
drwxr-xr-x 3 hadoop hadoop 4096 Oct 29 2015 examples
drwxrwxr-x 5 hadoop hadoop 4096 May 21 13:15 external
drwxrwxr-x 2 hadoop hadoop 4096 May 21 13:15 lib
-rw-r--r-- 1 hadoop hadoop 23004 Oct 29 2015 LICENSE
drwxrwxr-x 2 hadoop hadoop 4096 May 21 13:15 logback
-rw-r--r-- 1 hadoop hadoop 981 Oct 29 2015 NOTICE
drwxrwxr-x 6 hadoop hadoop 4096 May 21 13:15 public
-rw-r--r-- 1 hadoop hadoop 10987 Oct 29 2015 README.markdown
-rw-r--r-- 1 hadoop hadoop 6 Oct 29 2015 RELEASE
-rw-r--r-- 1 hadoop hadoop 3581 Oct 29 2015 SECURITY.md
[hadoop@master storm]$
Do the same on slave1 and slave2; I won't repeat it here.
Go into the Storm conf directory and edit the configuration file storm.yaml.
[hadoop@master conf]$ pwd
/home/hadoop/app/storm/conf
[hadoop@master conf]$ ll
total 8
-rw-r--r-- 1 hadoop hadoop 1128 Oct 29 2015 storm_env.ini
-rw-r--r-- 1 hadoop hadoop 1613 Oct 29 2015 storm.yaml
[hadoop@master conf]$ vim storm.yaml
Do the same on slave1 and slave2; I won't repeat it here.
Here is a very useful tip:
Related post: Configuration-file tips for setting up the various big-data sub-projects (works on CentOS and Ubuntu) (recommended by the author)
Note that each line needs a leading space in the first column; storm.yaml is YAML, so the spacing matters.
 storm.zookeeper.servers:
     - "master"
     - "slave1"
     - "slave2"
 nimbus.host: "master"
 ui.port: 9999
 storm.local.dir: "/home/hadoop/data/storm"
 supervisor.slots.ports:
     - 6700
     - 6701
Note: I set ui.port to 9999 here; this is a custom value, chosen to avoid the conflict between Storm and Spark, whose web UIs both default to port 8080.
For supervisor.slots.ports I configured two slots here, because I only have slave1 and slave2.
Do the same on slave1 and slave2; I won't repeat it here.
8. Create the directory where Storm stores its local data
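Because storm.yaml is whitespace-sensitive YAML, an indentation slip is the most common reason the daemons later refuse to start. The storm CLI's localconfvalue subcommand prints a setting as the local configuration is actually parsed, so it is a cheap way to check the file before starting anything:

# Ask the storm CLI what it really reads from conf/storm.yaml.
storm localconfvalue nimbus.host              # expect something like: nimbus.host: master
storm localconfvalue storm.zookeeper.servers
storm localconfvalue supervisor.slots.ports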
[hadoop@master conf]$ mkdir -p /home/hadoop/data/storm
Do the same on slave1 and slave2; I won't repeat it here.
9. Start the Storm cluster
1) First, start on master:
storm nimbus &
jps should now show nimbus
2) Then, also on master, start:
storm ui &
jps should now show core (the Storm UI process)
3) Finally, start the supervisor on slave1 and slave2:
storm supervisor &
jps should now show supervisor
Or run the daemons directly in the background (recommended); a helper script that does this for the whole cluster is sketched after these commands:
- Start nimbus in the background: bin/storm nimbus >/dev/null 2>&1 &
- Start supervisor in the background: bin/storm supervisor >/dev/null 2>&1 &
- Start the UI in the background: bin/storm ui >/dev/null 2>&1 &
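For convenience, the three background commands can be wrapped into one helper script run from master. This is only a sketch, not something Storm ships with: it assumes passwordless SSH from master to the slaves and identical installation paths on every node.

#!/bin/bash
# start-storm-cluster.sh -- hypothetical helper, run on master as the hadoop user.
STORM_BIN=/home/hadoop/app/storm/bin/storm

# nimbus and the UI run on master
nohup $STORM_BIN nimbus >/dev/null 2>&1 &
nohup $STORM_BIN ui     >/dev/null 2>&1 &

# one supervisor on each slave
for host in slave1 slave2; do
  ssh $host "nohup /home/hadoop/app/storm/bin/storm supervisor >/dev/null 2>&1 &"
done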
a) On the nimbus machine (master in my case), start the storm nimbus process
[hadoop@master storm]$ jps
2374 QuorumPeerMain
6244 Jps
3343 AzkabanWebServer
2813 ResourceManager
3401 AzkabanExecutorServer
2515 NameNode
2671 SecondaryNameNode
[hadoop@master storm]$ storm nimbus &
[1] 6254
[hadoop@master storm]$ jps
2374 QuorumPeerMain
3343 AzkabanWebServer
2813 ResourceManager
3401 AzkabanExecutorServer
6255 config_value
2515 NameNode
2671 SecondaryNameNode
6265 Jps
[hadoop@master storm]$ jps
2374 QuorumPeerMain
3343 AzkabanWebServer
6286 Jps
2813 ResourceManager
3401 AzkabanExecutorServer
6276 config_value
2515 NameNode
2671 SecondaryNameNode
[hadoop@master storm]$ Running: /home/hadoop/app/jdk/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/app/apache-storm-0.9.6 -Dstorm.log.dir=/home/hadoop/app/apache-storm-0.9.6/logs -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp /home/hadoop/app/apache-storm-0.9.6/lib/tools.macro-0.1.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/objenesis-1.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-core-1.1.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-fileupload-1.2.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-codec-1.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jetty-6.1.26.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clojure-1.5.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-io-2.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/hiccup-0.3.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/logback-core-1.0.13.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.logging-0.2.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clj-stacktrace-0.2.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-exec-1.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/kryo-2.21.jar:/home/hadoop/app/apache-storm-0.9.6/lib/logback-classic-1.0.13.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.cli-0.2.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-logging-1.1.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/disruptor-2.10.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jetty-util-6.1.26.jar:/home/hadoop/app/apache-storm-0.9.6/lib/compojure-1.1.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/minlog-1.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-devel-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/reflectasm-1.07-shaded.jar:/home/hadoop/app/apache-storm-0.9.6/lib/carbonite-1.4.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/chill-java-0.3.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/asm-4.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-jetty-adapter-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-servlet-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/storm-core-0.9.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clj-time-0.4.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/snakeyaml-1.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/servlet-api-2.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/json-simple-1.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/core.incubator-0.1.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/log4j-over-slf4j-1.6.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clout-1.0.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-lang-2.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jline-2.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jgrapht-core-0.9.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/math.numeric-tower-0.0.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/slf4j-api-1.7.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/joda-time-2.0.jar:/home/hadoop/app/apache-storm-0.9.6/conf -Xmx1024m -Dlogfile.name=nimbus.log -Dlogback.configurationFile=/home/hadoop/app/apache-storm-0.9.6/logback/cluster.xml backtype.storm.daemon.nimbus
b) On the nimbus machine (master in my case), start the storm ui process
[hadoop@master storm]$ storm ui &
[1] 6356
[hadoop@master storm]$ jps
2374 QuorumPeerMain
6367 Jps
3343 AzkabanWebServer
2813 ResourceManager
3401 AzkabanExecutorServer
6357 config_value
2515 NameNode
2671 SecondaryNameNode
[hadoop@master storm]$ Running: /home/hadoop/app/jdk/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/app/apache-storm-0.9.6 -Dstorm.log.dir=/home/hadoop/app/apache-storm-0.9.6/logs -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp /home/hadoop/app/apache-storm-0.9.6/lib/tools.macro-0.1.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/objenesis-1.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-core-1.1.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-fileupload-1.2.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-codec-1.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jetty-6.1.26.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clojure-1.5.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-io-2.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/hiccup-0.3.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/logback-core-1.0.13.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.logging-0.2.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clj-stacktrace-0.2.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-exec-1.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/kryo-2.21.jar:/home/hadoop/app/apache-storm-0.9.6/lib/logback-classic-1.0.13.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.cli-0.2.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-logging-1.1.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/disruptor-2.10.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jetty-util-6.1.26.jar:/home/hadoop/app/apache-storm-0.9.6/lib/compojure-1.1.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/minlog-1.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-devel-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/reflectasm-1.07-shaded.jar:/home/hadoop/app/apache-storm-0.9.6/lib/carbonite-1.4.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/chill-java-0.3.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/asm-4.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-jetty-adapter-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-servlet-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/storm-core-0.9.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clj-time-0.4.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/snakeyaml-1.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/servlet-api-2.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/json-simple-1.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/core.incubator-0.1.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/log4j-over-slf4j-1.6.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clout-1.0.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-lang-2.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jline-2.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jgrapht-core-0.9.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/math.numeric-tower-0.0.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/slf4j-api-1.7.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/joda-time-2.0.jar:/home/hadoop/app/apache-storm-0.9.6:/home/hadoop/app/apache-storm-0.9.6/conf -Xmx768m -Dlogfile.name=ui.log -Dlogback.configurationFile=/home/hadoop/app/apache-storm-0.9.6/logback/cluster.xml backtype.storm.ui.core
c) On slave1 and slave2, start the storm supervisor process
storm supervisor &
[hadoop@slave1 storm]$ jps
2421 NodeManager
2342 DataNode
2274 QuorumPeerMain
4126 Jps
[hadoop@slave1 storm]$ storm supervisor &
[1] 4136
[hadoop@slave1 storm]$ Running: /home/hadoop/app/jdk/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/app/apache-storm-0.9.6 -Dstorm.log.dir=/home/hadoop/app/apache-storm-0.9.6/logs -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp /home/hadoop/app/apache-storm-0.9.6/lib/snakeyaml-1.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-core-1.1.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/kryo-2.21.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-devel-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/carbonite-1.4.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/slf4j-api-1.7.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/math.numeric-tower-0.0.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/log4j-over-slf4j-1.6.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-codec-1.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jetty-6.1.26.jar:/home/hadoop/app/apache-storm-0.9.6/lib/reflectasm-1.07-shaded.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-logging-1.1.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.logging-0.2.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jline-2.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-lang-2.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/disruptor-2.10.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/minlog-1.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clout-1.0.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/json-simple-1.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/chill-java-0.3.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clojure-1.5.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/joda-time-2.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clj-stacktrace-0.2.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/asm-4.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-exec-1.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jgrapht-core-0.9.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-jetty-adapter-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/logback-classic-1.0.13.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jetty-util-6.1.26.jar:/home/hadoop/app/apache-storm-0.9.6/lib/objenesis-1.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/servlet-api-2.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/logback-core-1.0.13.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clj-time-0.4.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/hiccup-0.3.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/core.incubator-0.1.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-servlet-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-io-2.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.macro-0.1.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-fileupload-1.2.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/storm-core-0.9.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/compojure-1.1.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.cli-0.2.4.jar:/home/hadoop/app/apache-storm-0.9.6/conf -Xmx256m -Dlogfile.name=supervisor.log -Dlogback.configurationFile=/home/hadoop/app/apache-storm-0.9.6/logback/cluster.xml backtype.storm.daemon.supervisor
[hadoop@slave2 storm]$ jps
2365 NodeManager
2291 DataNode
4078 Jps
2229 QuorumPeerMain
[hadoop@slave2 storm]$ storm supervisor &
[1] 4089
[hadoop@slave2 storm]$ Running: /home/hadoop/app/jdk/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/app/apache-storm-0.9.6 -Dstorm.log.dir=/home/hadoop/app/apache-storm-0.9.6/logs -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp /home/hadoop/app/apache-storm-0.9.6/lib/ring-core-1.1.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/snakeyaml-1.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/math.numeric-tower-0.0.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-jetty-adapter-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/objenesis-1.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/asm-4.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-devel-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/servlet-api-2.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jline-2.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/core.incubator-0.1.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/reflectasm-1.07-shaded.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-codec-1.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clout-1.0.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/joda-time-2.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/disruptor-2.10.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jetty-util-6.1.26.jar:/home/hadoop/app/apache-storm-0.9.6/lib/hiccup-0.3.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.macro-0.1.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jgrapht-core-0.9.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/storm-core-0.9.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.cli-0.2.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clj-stacktrace-0.2.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/jetty-6.1.26.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-lang-2.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-exec-1.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/log4j-over-slf4j-1.6.6.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clojure-1.5.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/json-simple-1.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/tools.logging-0.2.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/chill-java-0.3.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/slf4j-api-1.7.5.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-logging-1.1.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/logback-core-1.0.13.jar:/home/hadoop/app/apache-storm-0.9.6/lib/minlog-1.2.jar:/home/hadoop/app/apache-storm-0.9.6/lib/clj-time-0.4.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/ring-servlet-0.3.11.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-fileupload-1.2.1.jar:/home/hadoop/app/apache-storm-0.9.6/lib/logback-classic-1.0.13.jar:/home/hadoop/app/apache-storm-0.9.6/lib/carbonite-1.4.0.jar:/home/hadoop/app/apache-storm-0.9.6/lib/commons-io-2.4.jar:/home/hadoop/app/apache-storm-0.9.6/lib/compojure-1.1.3.jar:/home/hadoop/app/apache-storm-0.9.6/lib/kryo-2.21.jar:/home/hadoop/app/apache-storm-0.9.6/conf -Xmx256m -Dlogfile.name=supervisor.log -Dlogback.configurationFile=/home/hadoop/app/apache-storm-0.9.6/logback/cluster.xml backtype.storm.daemon.supervisor
10. Check the Storm UI
http://192.168.80.145:9999/index.html
Success! You can explore the rest of the UI on your own; I won't go into more detail here.
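Besides opening the page in a browser, you can check from the command line that the UI (and the cluster behind it) is answering. Storm's UI has exposed a REST API since 0.9.2, so on 0.9.6 the cluster-summary endpoint should respond; adjust the IP to your own nimbus machine.

# Query the Storm UI REST API on the custom port 9999.
curl http://192.168.80.145:9999/api/v1/cluster/summary
# A healthy cluster returns JSON with fields such as stormVersion, supervisors, slotsTotal and slotsUsed.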
This article is reposted from the cnblogs blog 大数据躺过的坑; original link: http://www.cnblogs.com/zlslch/p/6884613.html. Please contact the original author before republishing.