Streaming CloudWatch Logs to Amazon ES


Question


I'm using Fargate to deploy my application, and I'm using awslogs as the log driver to capture the container logs. Now I want to ship my logs to the Amazon ES service. While going through the docs on shipping, I encountered a note that says:

Streaming large amounts of CloudWatch Logs data to other
destinations might result in high usage charges. 

I want to understand what exactly I will be billed for when shipping the logs to the ELK stack. How do they define "large amounts"?

Will I be billed for

a) CloudWatch?

b) Log driver?

c) Lambda function? Does every log line trigger a Lambda function?

Lastly, is there a way to lower the cost further?


Answer 1:


Personally, I would look at running Fluentd or Fluent Bit in another container alongside your application: https://docs.fluentbit.io/manual/pipeline/outputs/elasticsearch

You can then send your logs directly to ES without any CloudWatch costs.

EDIT

Here's the final solution, in case someone is looking for a cheaper option.

Run Fluentd/Fluent Bit in another container alongside your application.

Using the GitHub config, I was able to forward the logs to ES with the task definition below.

{
    "family": "workflow",
    "cpu": "256",
    "memory": "512",
    "containerDefinitions": [
        {
            "name": "log_router",
            "image": "docker.io/amazon/aws-for-fluent-bit:latest",
            "essential": true,
            "firelensConfiguration": {
                "type": "fluentbit",
                "options":{
                   "enable-ecs-log-metadata":"true"
                }
            },
            "logConfiguration": {
                "logDriver": "awslogs",
                "options": {
                    "awslogs-create-group": "true",
                    "awslogs-group": "your_log_group",
                    "awslogs-region": "us-east-1",
                    "awslogs-stream-prefix": "ecs"
                }
            },
            "memoryReservation": 50
        },
        {
            "name": "ContainerName",
            "image": "YourImage",
            "cpu": 0,
            "memoryReservation": 128,
            "portMappings": [
                {
                    "containerPort": 5005,
                    "protocol": "tcp"
                }
            ],
            "essential": true,
            "command": [
                "YOUR COMMAND"
            ],
            "environment": [],
            "logConfiguration": {
                "logDriver": "awsfirelens",
                "secretOptions": [],
                "options": {
                    "Name": "es",
                    "Host": "YOUR_ES_DOMAIN_URL",
                    "Port": "443",
                    "tls": "On",
                    "Index": "INDEX_NAME",
                    "Type": "TYPE"
                }
            },
            "resourceRequirements": []
        }
    ]
}

The log_router container collects the logs and ships them to ES. For more info, refer to Custom Log Routing.

Please note that the log_router container is required in the case of Fargate, but not with ECS.
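For completeness, here is a minimal sketch of registering a task definition like the one above with boto3. The file name and role ARNs are placeholders, and a Fargate task definition also needs requiresCompatibilities, networkMode, an execution role, and a task role that is allowed to write to your ES domain.

import json
import boto3

ecs = boto3.client("ecs")

# Load the task definition shown above ("task-def.json" is a placeholder file name).
with open("task-def.json") as f:
    task_def = json.load(f)

# Fargate requires these extra fields; the role ARNs below are placeholders.
task_def.update({
    "requiresCompatibilities": ["FARGATE"],
    "networkMode": "awsvpc",
    "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
    # The task role must be allowed to write to the ES domain
    # (or the domain access policy must allow the requests).
    "taskRoleArn": "arn:aws:iam::123456789012:role/ecsTaskRole",
})

response = ecs.register_task_definition(**task_def)
print(response["taskDefinition"]["taskDefinitionArn"])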

This is the cheapest solution I know of that does not involve CloudWatch, Lambdas, or Kinesis.




Answer 2:


Like every resource, AWS charges for usage and for storage. Therefore, the charges will be for the execution of the Lambda function and for storing the data in CloudWatch. The reason they mention that "Streaming large amounts of CloudWatch Logs data to other destinations might result in high usage charges" is that it takes time for the Lambda function to process the logs and insert them into ES; when you stream a large number of logs, the Lambda function runs more often and for longer.

  • Lambda function? Does every log line trigger a Lambda function?

    Yes, when streaming from CloudWatch to ES is enabled, every log inserted into CloudWatch triggers the Lambda function (a rough sketch of what that function does is shown after this list).

(Image from the demonstration, showing the CloudWatch Logs trigger on the Lambda function.)

  • Is there still a way to lower the cost further?

The only way to lower the cost (with this implementation) is to write your own Lambda function that is triggered every X seconds/minutes and inserts the logs into ES in batches. As far as I can tell, the cost difference will be negligible.
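To make the mechanism concrete, here is a rough sketch of what the subscription-triggered Lambda function does (a simplified stand-in for the AWS-generated one; the ES endpoint and index name are placeholders, and request signing is omitted). CloudWatch Logs delivers each batch as base64-encoded, gzipped JSON, and the function decodes it and bulk-indexes the events into ES.

import base64
import gzip
import json
import urllib.request

# Placeholders: point these at your own domain and index.
ES_BULK_URL = "https://YOUR_ES_DOMAIN_URL/_bulk"
INDEX_NAME = "INDEX_NAME"

def handler(event, context):
    # CloudWatch Logs subscriptions deliver each batch as base64-encoded, gzipped JSON.
    payload = json.loads(gzip.decompress(base64.b64decode(event["awslogs"]["data"])))

    # Build an Elasticsearch bulk body: one action line plus one document per log event.
    lines = []
    for log_event in payload["logEvents"]:
        lines.append(json.dumps({"index": {"_index": INDEX_NAME, "_id": log_event["id"]}}))
        lines.append(json.dumps({
            "@timestamp": log_event["timestamp"],
            "message": log_event["message"],
            "log_group": payload["logGroup"],
            "log_stream": payload["logStream"],
        }))
    body = ("\n".join(lines) + "\n").encode("utf-8")

    # A real function must sign this request (SigV4) or the domain policy must allow it.
    request = urllib.request.Request(
        ES_BULK_URL,
        data=body,
        headers={"Content-Type": "application/x-ndjson"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

Because the function runs once per delivered batch of log events, a high log volume means many invocations and longer execution times, which is where the warning about high usage charges comes from.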

More information:

Lambda code.

How this works behind the scenes.



Source: https://stackoverflow.com/questions/61158970/streaming-cloudwatch-logs-to-amazon-es
