I. Prerequisites
1. An Elasticsearch 7.6.2 cluster
2. Kibana 7.6.2 and Logstash 7.6.2 installed
II. Integration Process
Note: this walkthrough uses Elasticsearch searches to drive ELK log analysis.
1. Configure logstash.conf
```
input {
  tcp {
    port => 5044          # expose port 5044
    codec => json_lines   # codec: one JSON object per line
  }
}
output {
  elasticsearch {         # ES output
    hosts => ["http://192.168.1.1:9200","http://192.168.1.2:9200","http://192.168.1.3:9200"]  # cluster addresses
    index => "laokou-%{+YYYY.MM.dd}"  # index name
  }
  stdout { codec => rubydebug }  # rubydebug codec for console output
}
```
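Before wiring up logback, you can sanity-check the TCP input by pushing one json_lines event at port 5044 by hand. A minimal sketch; the host 127.0.0.1 and the field names are illustrative assumptions, not part of the original setup:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class LogstashSmokeTest {

    // Build one json_lines event: a single-line JSON object terminated by '\n'
    static String jsonLine(String level, String message) {
        return "{\"level\":\"" + level + "\",\"message\":\"" + message + "\"}\n";
    }

    public static void main(String[] args) {
        String event = jsonLine("INFO", "hello logstash");
        // 127.0.0.1:5044 is a placeholder; point it at your Logstash host
        try (Socket socket = new Socket("127.0.0.1", 5044)) {
            OutputStream out = socket.getOutputStream();
            out.write(event.getBytes(StandardCharsets.UTF_8));
            out.flush();
            System.out.println("sent: " + event.trim());
        } catch (IOException e) {
            System.out.println("Logstash not reachable: " + e.getMessage());
        }
    }
}
```

If the pipeline is up, the event should appear on the Logstash console in rubydebug form and land in the day's `laokou-*` index.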
Note: sample rubydebug output is shown in the figure below.
2. Add the dependency
```xml
<!-- logback: push log events to Logstash -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>5.1</version>
</dependency>
```
3. Configure logback-spring.xml
```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <include resource="org/springframework/boot/logging/logback/base.xml" />
    <logger name="org.springframework.web" level="INFO"/>
    <logger name="org.springboot.sample" level="TRACE" />

    <!-- dev and test environments -->
    <springProfile name="dev,test">
        <logger name="org.springframework.web" level="INFO"/>
        <logger name="org.springboot.sample" level="INFO" />
        <logger name="io.laokou.elasticsearch" level="DEBUG" />
    </springProfile>

    <!-- production environment -->
    <springProfile name="prod">
        <logger name="org.springframework.web" level="ERROR"/>
        <logger name="org.springboot.sample" level="ERROR" />
        <logger name="io.laokou.elasticsearch" level="ERROR" />
    </springProfile>

    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <!-- Logstash host and exposed port; logback sends logs to this address -->
        <destination>127.0.0.1:5044</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>

    <root level="INFO">
        <appender-ref ref="LOGSTASH" />
        <appender-ref ref="CONSOLE" />
    </root>
</configuration>
```
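With the appender in place, application code needs nothing special: any SLF4J call is serialized to JSON by LogstashEncoder and shipped over TCP. A minimal sketch; the class and message are made up, and it assumes slf4j-api and logback are on the classpath (which any Spring Boot project already provides):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class DemoService {

    private static final Logger log = LoggerFactory.getLogger(DemoService.class);

    public void handle(String orderId) {
        // This line reaches Elasticsearch as a JSON document with fields such as
        // @timestamp, level, logger_name, thread_name, and message
        log.info("order handled, id={}", orderId);
    }
}
```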
4. Start the ES cluster, Kibana, and Logstash
5. Start the application
6. Check that the laokou-YYYY.MM.dd index has been created
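This check can be scripted: the index name follows the `%{+YYYY.MM.dd}` pattern from logstash.conf, so you can compute today's expected name and ask Elasticsearch whether it exists. A sketch assuming ES is reachable at localhost:9200 (substitute one of your cluster nodes):

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class IndexCheck {

    // Mirror the Logstash sprintf pattern %{+YYYY.MM.dd}
    static String indexName(LocalDate date) {
        return "laokou-" + date.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) throws InterruptedException {
        String index = indexName(LocalDate.now());
        // localhost:9200 is a placeholder for one of your cluster addresses
        HttpRequest req = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/_cat/indices/" + index))
                .GET()
                .build();
        try {
            HttpResponse<String> resp = HttpClient.newHttpClient()
                    .send(req, HttpResponse.BodyHandlers.ofString());
            // HTTP 200 with a row means the index exists; 404 means no logs have arrived yet
            System.out.println(resp.statusCode() + " " + resp.body());
        } catch (IOException e) {
            System.out.println("Elasticsearch not reachable: " + e.getMessage());
        }
    }
}
```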
7. Run ES searches (details omitted; query according to your business criteria)
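A basic search only needs a small Query DSL body POSTed to the daily indices. In this sketch the field name `message` and the search text `error` are placeholders for your own business criteria:

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LogSearch {

    // Build a minimal Query DSL match query; field and text are caller-supplied
    static String matchQuery(String field, String text) {
        return "{\"query\":{\"match\":{\"" + field + "\":\"" + text + "\"}}}";
    }

    public static void main(String[] args) throws InterruptedException {
        // laokou-* covers all the daily indices created by logstash.conf
        HttpRequest req = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/laokou-*/_search"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(matchQuery("message", "error")))
                .build();
        try {
            HttpResponse<String> resp = HttpClient.newHttpClient()
                    .send(req, HttpResponse.BodyHandlers.ofString());
            System.out.println(resp.body()); // raw hits JSON
        } catch (IOException e) {
            System.out.println("Elasticsearch not reachable: " + e.getMessage());
        }
    }
}
```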
8. Open Kibana (enter http://localhost:5601 in a browser)