Log retention in Stackdriver GCP

Submitted by 拟墨画扇 on 2020-07-20 07:48:06

Question


How can I enable log retention in GCP Stackdriver? I haven't found any documentation for configuring log retention. I can see the export option in the logging section, and log ingestion.


Answer 1:


It is NOW possible; see the answers below (edited).


Previous answer:

Logging retention is 30 days and is not configurable; you only pay for the storage.

Stackdriver Logging allows you to retain the logs for 30 days, and gives you a one-click configuration tool to archive data for a longer period in Google Cloud Storage.

https://cloud.google.com/logging/

But you can create a sink for your logs and store them in BigQuery or Google Cloud Storage (or both).
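As a minimal sketch (the sink, bucket, dataset, and project names below are placeholders, not from the original answer), such a sink can be created with `gcloud logging sinks create`:

```shell
# Export all WARNING-and-above logs to a Cloud Storage bucket
# ("my-log-sink" and "my-log-archive" are placeholder names).
gcloud logging sinks create my-log-sink \
    storage.googleapis.com/my-log-archive \
    --log-filter='severity>=WARNING'

# Or export to a BigQuery dataset instead:
gcloud logging sinks create my-bq-sink \
    bigquery.googleapis.com/projects/my-project/datasets/my_logs
```

Note that after creating a sink, the writer identity printed by the command still needs write access to the destination bucket or dataset, or no logs will be delivered.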




Answer 2:


Log Retention is now POSSIBLE.

Use this documentation to configure custom retention periods, which can be between 1 day and 3650 days.

gcloud beta logging buckets update _Default --location=global --retention-days=[RETENTION_DAYS]

Explanation:

For each Google Cloud project, Logging automatically creates two logs buckets: _Required and _Default. All logs generated in the project are stored in the _Required and _Default logs buckets, which live in the project that the logs are generated in:

_Required: This bucket holds Admin Activity audit logs, System Event audit logs, and Access Transparency logs, and retains them for 400 days. You aren't charged for the logs stored in _Required, and the retention period of the logs stored here cannot be modified. You cannot delete this bucket.

_Default: This bucket holds all other ingested logs in a Google Cloud project except for the logs held in the _Required bucket. Standard Cloud Logging pricing applies to these logs. Log entries held in the _Default bucket are retained for 30 days, unless you apply custom retention rules.

With custom buckets and the _Default bucket, you can configure custom retention periods for different logs.
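As an illustration of the custom-bucket option (the bucket name and retention value here are made up; on older gcloud releases this command may sit behind the `beta` component):

```shell
# Create a custom log bucket in the global location with 90-day retention.
gcloud logging buckets create my-audit-bucket \
    --location=global \
    --retention-days=90 \
    --description="Custom bucket with 90-day retention"
```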




Answer 3:


In addition to the 30 days, audit logs are retained for 400 days.

The Design Patterns for Logging Exports documentation covers the specifics of exports to GCS, BigQuery, and Pub/Sub (for streaming logs).




Answer 4:


Now, you can configure custom retention by the command:

gcloud alpha logging buckets update _Default --location=global --retention-days=[RETENTION_DAYS]

See https://cloud.google.com/logging/docs/storage#logs-retention




Answer 5:


Stackdriver retention

  1. Admin Activity (400 days)
  2. Data Access (30 days)
  3. System Event (400 days)
  4. Other logs (30 days)

Bullets 1 to 3 are audit logs, which you can enable on the IAM > Audit Logs page. Remember that this can produce a large stream of logs. This is especially the case for Data Access logs, since they log every object access in, for example, GCS (Google Cloud Storage) or Cloud Datastore. Some best practices: turn off audit logging for development projects, or only turn on audit logging for services that you use frequently (KMS, IAM, Storage, etc.) and turn it off for Cloud Build, Cloud Functions, etc.
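For reference, Data Access audit logs can also be toggled outside the console by editing the project's IAM policy. The following is a hedged sketch (the project ID and service are placeholders):

```shell
# Download the current IAM policy, including any auditConfigs section.
gcloud projects get-iam-policy my-project --format=yaml > policy.yaml

# Add an auditConfigs section like this to policy.yaml to enable
# Data Access logs for Cloud Storage only:
#
#   auditConfigs:
#   - service: storage.googleapis.com
#     auditLogConfigs:
#     - logType: DATA_READ
#     - logType: DATA_WRITE
#
# Then write the edited policy back:
gcloud projects set-iam-policy my-project policy.yaml
```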

Bullet 4, other logs, covers application logging from Cloud Functions, App Engine, etc.: the logging that comes from the applications that you run on GCP. For all Stackdriver logging, retention is now configurable (April 2020). You can read more about this here.

Want to store the logging for a longer period of time?

There are many use cases in which you want to retain logging for a longer period of time, whether for analytical, monitoring, or compliance reasons. You can export logs with a log sink at the project, folder, or even organization level. Log sinks themselves are free; you only pay for the storage of the destination, which can be one of the following:

  • Google Cloud Storage
  • Pub/Sub
  • BigQuery

Pub/Sub can be a neat solution if you want to move the logging export somewhere else, possibly outside of GCP. I recently learned that Google Cloud Storage and BigQuery do not differ much in storage cost: both offer lower-cost storage tiers for longer-term storage.
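As an illustration of an organization-level export (the organization ID, topic, and filter below are placeholders, not from the original answer), an aggregated sink to Pub/Sub might look like:

```shell
# Aggregated sink: collects logs from the whole organization and all
# child folders/projects, and streams them to a Pub/Sub topic.
gcloud logging sinks create org-audit-sink \
    pubsub.googleapis.com/projects/my-project/topics/exported-logs \
    --organization=123456789012 \
    --include-children \
    --log-filter='logName:"cloudaudit.googleapis.com"'
```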

Log sink best practices

  • For a lot of use cases, BigQuery may be a fine solution. Its storage cost is comparable to the GCS standard and nearline storage classes, and you keep the ability to easily query the data.
  • When using BigQuery, don't run queries that scan a lot of data too often; this can become expensive.
  • When using BigQuery, partition your data so that each day's or week's data lands in a new partition or table. BigQuery then automatically reduces the storage cost of partitions that have not been modified for 90 days by 50% (long-term storage).
  • When you have to store data for several years (3, 5, or even 7) for compliance reasons, I would recommend exporting the data to Google Cloud Storage. With object lifecycle management you can move this data to archive storage, which costs only a fraction (roughly 15%) of standard storage.
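The archive-storage approach in the last bullet can be sketched with object lifecycle management (the bucket name and age threshold here are illustrative assumptions):

```shell
# lifecycle.json: move objects older than 30 days to Archive storage.
cat > lifecycle.json <<'EOF'
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
        "condition": {"age": 30}
      }
    ]
  }
}
EOF

# Apply the lifecycle configuration to the log export bucket.
gsutil lifecycle set lifecycle.json gs://my-log-archive
```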


Source: https://stackoverflow.com/questions/51942237/log-retention-in-stackdriver-gcp
