Introduction
In previous posts we covered how to set up ELK (ElasticSearch + Logstash + Kibana) on Linux and how to use each component. If you are interested, you can refer to them:
Installation tutorials:
- "Distributed Series Tutorial (27) - Installing Elasticsearch on Linux"
- "Distributed Series Tutorial (37) - Setting Up an ElasticSearch Cluster on Linux"
- "Distributed Series Tutorial (28) - Installing Kibana on Linux"
- "Distributed Series Tutorial (40) - Installing Logstash on Linux"
Usage tutorials:
- "Distributed Series Tutorial (29) - CRUD Operations with Kibana"
- "Distributed Series Tutorial (32) - ElasticSearch Conditional Queries"
- "Distributed Series Tutorial (33) - ElasticSearch DSL Queries and Filtering"
- "Distributed Series Tutorial (41) - Basic Usage of Logstash"
The previous posts covered each ELK component on its own; this post explains how to use the three together.
ELK Configuration and Usage Workflow
First, make sure ELK has been installed successfully on the Linux server.
1. Configure Logstash to read Elasticsearch's own log file and POST it to the Elasticsearch server for storage. Create a new file myeslog.conf in the Logstash config directory:
```
cd /usr/local/logstash-6.4.3/config/
vi myeslog.conf
```
2. The configuration content is as follows:
```
input {
  # Read log entries from a file and ship them onward
  file {
    path => "/usr/local/elasticsearch-6.4.3/logs/myes.log"
    codec => "json"              ## read the log as JSON
    type => "elasticsearch"
    start_position => "beginning"
  }
}

# filter {
#
# }

output {
  # Plain standard output
  # stdout {}

  # Pretty-print events to the console using the Ruby debug codec
  stdout { codec => rubydebug }

  elasticsearch {
    hosts => ["192.168.162.131:9200"]
    index => "es-%{+YYYY.MM.dd}"
  }
}
```
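Before starting Logstash, you can have it parse this config and exit, which catches syntax errors early. A minimal sketch, assuming the same paths as above (this check is not part of the original walkthrough):

```
# Parse myeslog.conf and exit; reports syntax errors without starting the pipeline
cd /usr/local/logstash-6.4.3/bin/
./logstash -f ../config/myeslog.conf --config.test_and_exit
```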
3. Switch to a regular user account first (Elasticsearch will not run as root), then start Elasticsearch:
```
su ylw
/usr/local/elasticsearch-6.4.3/bin/elasticsearch
```
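Before wiring Logstash to it, you can confirm Elasticsearch is reachable. A quick check, assuming the same host and port used in the config above:

```
# Should return basic cluster info (name, version, ...) as JSON
curl http://192.168.162.131:9200
```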
4. Start Logstash. Here it runs in the foreground so the console output is easy to watch (-f points Logstash at the config file; startup takes a while):
```
cd /usr/local/logstash-6.4.3/bin/
./logstash -f ../config/myeslog.conf
```
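Foreground mode is handy for debugging; once the pipeline works, you may prefer to keep it running in the background. A sketch using plain nohup (the output path is arbitrary and not from the original post):

```
# Run Logstash in the background and capture its console output in a file
cd /usr/local/logstash-6.4.3/bin/
nohup ./logstash -f ../config/myeslog.conf > /tmp/logstash.out 2>&1 &
```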
5. Start Kibana:
```
cd /usr/local/kibana-6.4.3-linux-x86_64/bin/
./kibana
```
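You can also verify that Kibana is listening before switching to the browser. A quick check against the same host; any HTTP response means the service is up:

```
# Kibana listens on port 5601 by default
curl -I http://192.168.162.131:5601/
```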
6. Open http://192.168.162.131:5601/ in a browser. Typing es in the console brings up a suggestion showing the index that holds the logs collected today:
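To confirm that the daily index was actually created, you can also list the indices matching the es-* pattern from the Kibana console; this uses the standard _cat API rather than anything specific to this setup:

```
GET _cat/indices/es-*?v
```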
7. Query all the logs. You can see that every log entry has been indexed, 55 in total:
```
GET /es-2019.12.19/_search
```
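Note that _search returns only the first 10 hits by default, even though 55 documents match. If you want to see all of them in one response, you can pass an explicit size; a sketch against the same index:

```
GET /es-2019.12.19/_search
{
  "query": { "match_all": {} },
  "size": 55
}
```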
8. You can also do a full-text match, searching for log entries whose message contains the string Ov3Qy5c:
```
GET /es-2019.12.19/_search
{
  "from": 0,
  "size": 3,
  "query": {
    "match": {
      "message": "Ov3Qy5c"
    }
  }
}
```
The response:
{ "took": 17, "timed_out": false, "_shards": { "total": 5, "successful": 5, "skipped": 0, "failed": 0 }, "hits": { "total": 50, "max_score": 0.34676385, "hits": [ { "_index": "es-2019.12.19", "_type": "doc", "_id": "fYDDHW8BMgxCikmkzC2N", "_score": 0.34676385, "_source": { "@timestamp": "2019-12-19T10:46:04.605Z", "path": "/usr/local/elasticsearch-6.4.3/logs/myes.log", "type": "elasticsearch", "host": "localhost.localdomain", "message": "[2019-12-19T18:42:21,780][INFO ][o.e.p.PluginsService ] [Ov3Qy5c] loaded module [mapper-extras]", "@version": "1", "tags": [ "_jsonparsefailure" ] } }, { "_index": "es-2019.12.19", "_type": "doc", "_id": "foDDHW8BMgxCikmkzC2N", "_score": 0.34676385, "_source": { "@timestamp": "2019-12-19T10:46:04.607Z", "path": "/usr/local/elasticsearch-6.4.3/logs/myes.log", "type": "elasticsearch", "host": "localhost.localdomain", "message": "[2019-12-19T18:42:21,794][INFO ][o.e.p.PluginsService ] [Ov3Qy5c] loaded module [parent-join]", "@version": "1", "tags": [ "_jsonparsefailure" ] } }, { "_index": "es-2019.12.19", "_type": "doc", "_id": "kIDDHW8BMgxCikmkzC2N", "_score": 0.34676385, "_source": { "@timestamp": "2019-12-19T10:46:04.812Z", "path": "/usr/local/elasticsearch-6.4.3/logs/myes.log", "type": "elasticsearch", "host": "localhost.localdomain", "message": "[2019-12-19T18:42:21,796][INFO ][o.e.p.PluginsService ] [Ov3Qy5c] loaded plugin [analysis-ik]", "@version": "1", "tags": [ "_jsonparsefailure" ] } } ] } }
You can see that three matching documents were returned (out of 50 total hits)!
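If you only need the number of matching documents rather than the documents themselves, the standard _count API accepts the same query body (not used in the original post):

```
GET /es-2019.12.19/_count
{
  "query": {
    "match": {
      "message": "Ov3Qy5c"
    }
  }
}
```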
9. You can also query for the log entry whose timestamp is 2019-12-19T10:46:04.812Z:
```
GET /es-2019.12.19/_search
{
  "from": 0,
  "size": 3,
  "query": {
    "match": {
      "@timestamp": "2019-12-19T10:46:04.812Z"
    }
  }
}
```
One log entry is returned:
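For log data it is often more useful to search a time window than an exact timestamp. A sketch using a standard range query over the same @timestamp field; the one-minute window here is purely illustrative:

```
GET /es-2019.12.19/_search
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "2019-12-19T10:46:00.000Z",
        "lte": "2019-12-19T10:47:00.000Z"
      }
    }
  }
}
```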
10. You can also explore and query the logs visually in Kibana.
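For example, after creating an index pattern such as es-* in Kibana, the Discover view can filter the same logs with a Lucene query in the search bar; the pattern name and this particular way of searching are assumptions, not steps from the original post:

```
message:Ov3Qy5c
```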