Several Ways to Collect Logs into Elasticsearch


Preface

In enterprise environments, logs often need to be aggregated in one place to make log analysis easier. They are typically aggregated into Elasticsearch and analyzed visually with Kibana.

Below are several ways to sync logs to Elasticsearch.

  • Environment
    • SpringBoot 2.3.2.RELEASE
    • Elasticsearch 7.6.2

Integrating Logback with Elasticsearch to sync logs directly to ES

By default this uses the root logger. If you don't want to use root, you can define a custom logger and look it up by name: private Logger logger = LoggerFactory.getLogger("es-logger");

maven

<dependency>
    <groupId>com.internetitem</groupId>
    <artifactId>logback-elasticsearch-appender</artifactId>
    <version>1.6</version>
</dependency>

logback-spring.xml

(Figure: logback configuration for syncing logs directly to ES)
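As a reference, here is a minimal sketch of such a logback-spring.xml, based on the options documented by logback-elasticsearch-appender. The ES URL, index name, and extra properties are assumptions to adapt to your environment; the logger name matches the es-logger example above.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="ELASTIC" class="com.internetitem.logback.elasticsearch.ElasticsearchAppender">
        <!-- Bulk endpoint of the ES cluster (assumed to run locally) -->
        <url>http://localhost:9200/_bulk</url>
        <!-- One index per day -->
        <index>logs-%date{yyyy-MM-dd}</index>
        <type>_doc</type>
        <!-- Extra fields added to each log document -->
        <properties>
            <property>
                <name>level</name>
                <value>%level</value>
            </property>
            <property>
                <name>thread</name>
                <value>%thread</value>
            </property>
        </properties>
    </appender>

    <!-- Attach to a named logger instead of root, matching the es-logger example above -->
    <logger name="es-logger" level="INFO" additivity="false">
        <appender-ref ref="ELASTIC"/>
    </logger>
</configuration>
```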

Logback syncs to Logstash over a long-lived TCP connection; Logstash then syncs to Elasticsearch

maven

<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>5.3</version>
</dependency>

logback-spring.xml

(Figure: logback configuration for syncing to Logstash over TCP)
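A minimal sketch of the matching logback-spring.xml appender, assuming Logstash listens on localhost:4560 as in the logstash.conf below:

```xml
<appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <!-- Host and port must match the Logstash tcp input -->
    <destination>localhost:4560</destination>
    <!-- Emits one JSON document per log event, matching the json_lines codec on the Logstash side -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
</appender>
<root level="INFO">
    <appender-ref ref="LOGSTASH"/>
</root>
```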

logstash.conf

input {
  tcp {
    port => 4560
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}

Logback writes to log files; Logstash tails the files and syncs them to Elasticsearch

logstash.conf

input {
  file {
    type => "book-logs-info"
    path => "/data/logs/book_logs/log-info**.log"
  }
}
input {
  file {
    type => "book-logs-error"
    path => "/data/logs/book_logs/log-error**.log"
  }
}
filter {
    if [type] == "book-logs-error" {
        # Merge multi-line log entries: a line starting with [ begins a new entry.
        # Note: recent Logstash releases no longer bundle the multiline filter;
        # the multiline codec on the input is the usual replacement.
        multiline {
            pattern => '^\['
            negate => true
            what => "previous"
        }
    }
    mutate {
        # Split the log line on @@
        split => ["message","@@"]
        # Map the split values to ES fields
        add_field => {"level" => "%{[message][1]}"}
        add_field => {"thread" => "%{[message][2]}"}
        add_field => {"class" => "%{[message][3]}"}
        add_field => {"msg" => "%{[message][4]}"}
        # Drop redundant fields
        remove_field => "message"
        remove_field => "@version"
        remove_field => "tags"
    }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-book-file-%{+YYYY-MM}"
  }
}
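For the mutate/split above to yield level, thread, class, and msg, the application must write @@-separated log lines. A hypothetical logback file appender sketch (file paths and the exact pattern are assumptions, chosen to match the filter's field order):

```xml
<appender name="INFO_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/data/logs/book_logs/log-info.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>/data/logs/book_logs/log-info-%d{yyyy-MM-dd}.log</fileNamePattern>
    </rollingPolicy>
    <encoder>
        <!-- [timestamp]@@level@@thread@@class@@msg: after splitting on @@,
             index 0 is the bracketed timestamp and indices 1-4 map to the
             add_field entries in the filter; the leading [ also satisfies
             the multiline pattern ^\[ -->
        <pattern>[%d{yyyy-MM-dd HH:mm:ss.SSS}]@@%level@@%thread@@%logger{50}@@%msg%n</pattern>
    </encoder>
</appender>
```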

Logback publishes logs to RabbitMQ; Logstash consumes the MQ messages and syncs them to Elasticsearch

maven

<dependency>
    <groupId>org.springframework.amqp</groupId>
    <artifactId>spring-rabbit</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-amqp</artifactId>
</dependency>
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>6.0</version>
</dependency>

logback-spring.xml

(Figure: logback configuration for sending logs to MQ)
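A minimal sketch of the logback-spring.xml using spring-rabbit's AmqpAppender. The exchange name and routing key here are made-up examples; the exchange must be bound to the lostashTestQueue queue that the logstash.conf below consumes.

```xml
<appender name="AMQP" class="org.springframework.amqp.rabbit.logback.AmqpAppender">
    <layout>
        <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %level %thread %logger - %msg</pattern>
    </layout>
    <host>localhost</host>
    <port>5672</port>
    <username>guest</username>
    <password>guest</password>
    <!-- Hypothetical exchange/routing key; bind the exchange to lostashTestQueue -->
    <exchangeName>logstashTestExchange</exchangeName>
    <exchangeType>direct</exchangeType>
    <routingKeyPattern>logstash</routingKeyPattern>
    <declareExchange>true</declareExchange>
    <durable>true</durable>
</appender>
<root level="INFO">
    <appender-ref ref="AMQP"/>
</root>
```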

logstash.conf

input {
  rabbitmq {
    host => "localhost"
    port => 5672
    durable => true
    user => "guest"
    password => "guest"
    queue => "lostashTestQueue"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-test"
  }
}

Periodically cleaning up logs

Cleaning up indices with a shell script

#!/bin/sh
# Periodically clean up ES indices
# ES address
addr=
# Index prefix
indexPre=logstash
# One month ago
# lastMonth=$(date -d"1 month ago" +"%Y.%m.%d")
# One week ago
lastWeek=$(date -d"1 week ago" +"%Y.%m.%d")
curl -XDELETE $addr/$indexPre-$lastWeek
# Service names, separated by spaces
# appNames="app1 app2"
# Delete per service
# for app in $appNames; do
#    curl -XDELETE $addr/$indexPre-$lastWeek-[$app]
# done

Index Lifecycle Policies

  • Create an index template in Kibana: Management => Index Management => Index Templates

(Figure: creating an index template)

  • Create an index lifecycle policy in Kibana: Management => Index Lifecycle Policies => Create policy => configure the delete phase => bind the policy to the index template

(Figure: creating an index lifecycle policy)
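If you prefer the API over the Kibana UI, an equivalent policy can be created from Kibana Dev Tools with a request like the following sketch. The policy name logstash-cleanup is a made-up example, and 7d mirrors the one-week retention of the shell script above:

```
PUT _ilm/policy/logstash-cleanup
{
  "policy": {
    "phases": {
      "delete": {
        "min_age": "7d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}
```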