Question
I am running a django application using docker, and using python logging in the django settings to write API logs inside a logs folder. When I restart my container, my log files are also removed (which is understandable).

I would like to ship my logs (e.g. /path/to/workdir/logs/django.log) to elasticsearch. I am confused since my searches tell me to ship the path /var/lib/docker/containers/*/*.log, but I don't think this is what I want.

Any ideas on how I can ship the logs written inside the container to the ELK Stack?
Answer 1:
You can ship logs from the docker containers' stdout / stderr to elasticsearch using the gelf logging driver.

Configure the services with the gelf logging driver (docker-compose.yml):
version: '3.7'

x-logging:
  &logstash
  options:
    gelf-address: "udp://localhost:12201"
  driver: gelf

services:
  nginx:
    image: 'nginx:1.17.3'
    hostname: 'nginx'
    domainname: 'example.com'
    depends_on:
      - 'logstash'
    ports:
      - '80:80'
    volumes:
      - '${PWD}/nginx/nginx.conf:/etc/nginx/nginx.conf:ro'
    logging: *logstash

  elasticsearch:
    image: 'elasticsearch:7.1.1'
    environment:
      - 'discovery.type=single-node'
    volumes:
      - 'elasticsearch:/usr/share/elasticsearch/data'
    expose:
      - '9200'
      - '9300'

  kibana:
    image: 'kibana:7.1.1'
    depends_on:
      - 'elasticsearch'
    ports:
      - '5601:5601'
    volumes:
      - '${PWD}/kibana/kibana.yml:/usr/share/kibana/config/kibana.yml'

  logstash:
    build: 'logstash'
    depends_on:
      - 'elasticsearch'
    volumes:
      - 'logstash:/usr/share/logstash/data'
    ports:
      - '12201:12201/udp'
      - '10514:10514/udp'

volumes:
  elasticsearch:
  logstash:
Note: the above example configures the logging using an extension field (the x-logging block anchored as &logstash), which every service reuses via logging: *logstash; the per-service equivalent is shown below.
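For each service carrying logging: *logstash, the anchor simply expands to the equivalent of:

logging:
  driver: gelf
  options:
    gelf-address: "udp://localhost:12201"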
The minimal nginx.conf used for this example:
user nginx;
worker_processes 1;

error_log /var/log/nginx/error.log debug;
pid /var/run/nginx.pid;

events {
    worker_connections 1024;
}

http {
    server {
        listen 80;
        server_name _;

        location / {
            return 200 'OK';
        }
    }
}
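With this config, any request (for example curl http://localhost/) returns 200 OK. In the official nginx image, /var/log/nginx/error.log and access.log are symlinked to the container's stderr and stdout, so those lines are exactly what the gelf driver picks up.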
The logstash image is a custom build using the below Dockerfile:
FROM logstash:7.1.1

# Switch to root so the copied files can be re-owned below.
USER 0
COPY pipeline/gelf.cfg /usr/share/logstash/pipeline
COPY pipeline/pipelines.yml /usr/share/logstash/config
COPY settings/logstash.yml /usr/share/logstash/config
COPY patterns /usr/share/logstash/patterns

# Drop the default pipeline shipped with the image and hand the files
# back to the logstash user (uid 1000).
RUN rm /usr/share/logstash/pipeline/logstash.conf
RUN chown -R 1000:0 /usr/share/logstash/pipeline /usr/share/logstash/patterns /usr/share/logstash/config

USER 1000
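The compose service builds this image from a logstash directory next to docker-compose.yml. Judging from the COPY lines, the build context presumably looks something like this (layout inferred, not part of the original answer):

logstash/
  Dockerfile
  pipeline/
    gelf.cfg
    pipelines.yml
  settings/
    logstash.yml
  patterns/        (grok pattern files, if any)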
... the relevant logstash gelf plugin config:
input {
  gelf {
    type => docker
    port => 12201
  }
}

filter { }

output {
  if [type] == "docker" {
    elasticsearch { hosts => ["elasticsearch:9200"] }
    stdout { codec => rubydebug }
  }
}
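The filter block is left empty here. Since the Dockerfile copies a patterns directory, grok parsing is presumably the next step; as a rough sketch only, assuming a Django formatter of '%(levelname)s %(asctime)s %(module)s %(message)s' (an assumption, not part of the original answer), the docker messages could be parsed like this:

filter {
  if [type] == "docker" {
    grok {
      match => { "message" => "%{LOGLEVEL:level} %{TIMESTAMP_ISO8601:log_timestamp} %{WORD:module} %{GREEDYDATA:log_message}" }
    }
  }
}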
... and pipelines.yml:
- pipeline.id: "gelf"
  path.config: "/usr/share/logstash/pipeline/gelf.cfg"
... and logstash.yml to persist the data:
queue:
  type: persisted
  drain: true
The process running in the container logs to stdout / stderr, and docker pushes those logs to logstash using the gelf logging driver. Note that the gelf address is localhost: the logging driver runs on the docker host, where the compose service discovery is not available to resolve the logstash service name, so the logstash ports must be mapped to the host and the driver must point at localhost. logstash then outputs the logs to elasticsearch, where you can index and browse them in kibana.
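To tie this back to the question: instead of writing django.log into a logs folder inside the container, point Django's logging at stdout / stderr so the gelf driver ships it. A minimal sketch for settings.py (handler names and log format are my own assumptions, not part of the original answer):

# settings.py -- send API logs to the console instead of a logs/ folder,
# so the container's gelf logging driver forwards them to logstash.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'verbose': {
            # assumed format; keep it in sync with any grok pattern in logstash
            'format': '%(levelname)s %(asctime)s %(module)s %(message)s',
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',  # writes to stderr by default
            'formatter': 'verbose',
        },
    },
    'root': {
        'handlers': ['console'],
        'level': 'INFO',
    },
}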
Source: https://stackoverflow.com/questions/58545087/docker-ship-log-files-being-written-inside-containers-to-elk-stack