ELK: Collecting Logs and Writing Them to a Database
The purpose of writing to a database is to persist important error-related data, such as the status code, client IP, and client browser version, so that statistics (for example, monthly reports) can be produced later.
web2(107)
Install the database
apt install mysql-server mysql-client -y
Edit the MySQL configuration file so it listens on all addresses
vim /etc/mysql/mysql.conf.d/mysqld.cnf
bind-address = 0.0.0.0
Restart the MySQL service and enable it at boot
systemctl restart mysql
systemctl enable mysql
# mysql
# Create the database
mysql> create database elk character set utf8 collate utf8_bin;
Query OK, 1 row affected (0.00 sec)
# Grant privileges
mysql> grant all privileges on elk.* to elk@"%" identified by '123456';
Query OK, 0 rows affected, 1 warning (0.00 sec)
# Flush privileges
mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)
# Show all databases on this MySQL server
mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| elk                |
| mysql              |
| performance_schema |
| sys                |
+--------------------+
5 rows in set (0.00 sec)
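Note: the combined GRANT ... IDENTIFIED BY form above works on MySQL 5.7 (the default on Ubuntu 18.04). If the apt repository happens to install MySQL 8.x instead, that syntax is rejected and the user has to be created first; a hedged equivalent would be:
mysql> CREATE USER 'elk'@'%' IDENTIFIED BY '123456';
mysql> GRANT ALL PRIVILEGES ON elk.* TO 'elk'@'%';
mysql> FLUSH PRIVILEGES;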
logstash(103)
Official download page: dev.mysql.com/downloads/c…
Download
wget https://cdn.mysql.com/archives/mysql-connector-java-8.0/mysql-connector-java_8.0.18-1ubuntu18.04_all.deb
Install
dpkg -i mysql-connector-java_8.0.18-1ubuntu18.04_all.deb
Create a directory for the jar
mkdir /usr/share/logstash/vendor/jar/jdbc -pv
Copy the connector jar into that directory
cp /usr/share/java/mysql-connector-java-8.0.18.jar /usr/share/logstash/vendor/jar/jdbc/
Change the owner and group of the directory
chown logstash.logstash /usr/share/logstash/vendor/ -R
Install the plugin
# List all currently installed plugins
/usr/share/logstash/bin/logstash-plugin list
# /usr/share/logstash/bin/logstash-plugin install logstash-output-jdbc
Validating logstash-output-jdbc
Installing logstash-output-jdbc
Installation successful <-- success
Create the table
Add the table columns and save
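The original walkthrough creates the table through a GUI client. The following is a minimal SQL sketch that matches the columns used by the jdbc output below; the extra time column with a default timestamp is an assumption added here so that records can later be grouped by month:
USE elk;
CREATE TABLE elklog (
  clientip  VARCHAR(64),    -- client IP from the nginx access log
  url       VARCHAR(1024),  -- requested URL
  status    INT,            -- HTTP status code
  http_host VARCHAR(256),   -- Host header
  time      TIMESTAMP DEFAULT CURRENT_TIMESTAMP   -- assumed column, useful for monthly statistics
);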
logstash(103)
cd /etc/logstash/conf.d/
# The jdbc block in the output section is the newly added part
cat redis-to-es.conf
input {
  redis {
    host => "192.168.37.104"
    port => "6379"
    password => "123456"
    key => "syslog-37-106"
    data_type => "list"
    db => 3
  }
  redis {
    host => "192.168.37.104"
    port => "6379"
    password => "123456"
    key => "syslog-37-107"
    data_type => "list"
    db => 3
  }
  redis {
    host => "192.168.37.104"
    port => "6379"
    password => "123456"
    key => "nginx-accesslog-37-106"
    data_type => "list"
    db => 3
  }
  redis {
    host => "192.168.37.104"
    port => "6379"
    password => "123456"
    key => "nginx-accesslog-37-107"
    data_type => "list"
    db => 3
  }
}
filter {
  if [fields][app] == "nginx-106" {
    geoip {
      source => "clientip"
      target => "geoip"
      # path to the GeoLite2 City database
      database => "/etc/logstash/GeoLite2-City_20191015/GeoLite2-City.mmdb"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float"]
    }
  }
}
output {
  # system logs
  if [fields][type] == "syslog-106" {
    elasticsearch {
      hosts => ["http://192.168.37.102:9200"]
      index => "filebeat-syslog-37-106-%{+YYYY.MM.dd}"
    }
  }
  if [fields][type] == "syslog-107" {
    elasticsearch {
      hosts => ["http://192.168.37.102:9200"]
      index => "filebeat-syslog-37-107-%{+YYYY.MM.dd}"
    }
  }
  # nginx logs
  if [fields][app] == "nginx-106" {
    elasticsearch {
      hosts => ["http://192.168.37.102:9200"]
      index => "logstash-nginx-accesslog-37-106-%{+YYYY.MM.dd}"
    }
    # newly added: also write key fields to MySQL via the jdbc output
    jdbc {
      connection_string => "jdbc:mysql://192.168.37.107/elk?user=elk&password=123456&useUnicode=true&characterEncoding=UTF8"
      statement => ["INSERT INTO elklog(clientip,url,status,http_host) VALUES(?,?,?,?)", "clientip","url","status","http_host"]
    }
  }
  if [fields][app] == "nginx-107" {
    elasticsearch {
      hosts => ["http://192.168.37.102:9200"]
      index => "logstash-nginx-accesslog-37-107-%{+YYYY.MM.dd}"
    }
  }
}
Check the configuration syntax
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/redis-to-es.conf -t
Restart the service
systemctl restart logstash
web1(106)
Generate some data
echo 111 >> /var/log/access.log
Restart the service
systemctl restart filebeat
Check at www.kibana101.com/ whether the data shows up
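Besides Kibana, it is worth confirming that rows are also landing in MySQL on 192.168.37.107. A quick check, plus a sketch of the kind of monthly statistics mentioned at the beginning (both assume the time column from the table sketch above):
-- latest rows written by the jdbc output
SELECT clientip, url, status, http_host, time FROM elk.elklog ORDER BY time DESC LIMIT 10;

-- example monthly report: request count per status code per month
SELECT DATE_FORMAT(time, '%Y-%m') AS month, status, COUNT(*) AS hits
FROM elk.elklog
GROUP BY month, status
ORDER BY month, hits DESC;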