Using Python to Query GCP Stackdriver logs

Submitted by 让人想犯罪 on 2020-06-15 05:59:22

Question


I am using Python 3 to query Stackdriver for GCP logs. Unfortunately, the log entries that contain important data come back with a payload of "NoneType" instead of "dict" or "str": entry.payload is None, and entry.payload_pb has the data I want, but it is garbled.

Is there a way to get Stackdriver to return this data in a clean format, or is there a way I can parse it? If not, is there a way I should query this data that is better than what I am doing and yields clean data?

My code looks something like this:

#!/usr/bin/python3

from google.cloud.logging import Client, ASCENDING, DESCENDING
from google.oauth2.service_account import Credentials

projectName = 'my_project'
myFilter = 'logName="projects/' + projectName + '/logs/compute.googleapis.com%2Factivity_log"'

client = Client(project=projectName)
entries = client.list_entries(order_by=DESCENDING, page_size=500, filter_=myFilter)
for entry in entries:
    if isinstance(entry.payload, dict):
        print(entry.payload)
    elif isinstance(entry.payload, str):
        print(entry.payload)
    elif entry.payload is None:  # isinstance(x, None) raises TypeError
        print(entry.payload_pb)

The "entry.payload_pb" data always starts like this:

type_url: "type.googleapis.com/google.cloud.audit.AuditLog"
 value: "\032;\n9gcp-user@my-project.iam.gserviceaccount.com"I\n\r129.105.16.28\0228

Answer 1:


It looks like something is broken in the Python library related to parsing protobuf for logging. I found two old issues

  1. https://github.com/GoogleCloudPlatform/google-cloud-python/issues/3218
  2. https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2674

that seem to have been resolved some time ago, but I believe the problem was reintroduced. I have a ticket open with Google support on this issue and they are looking into it.

As a workaround, you have two options:

  1. You can create an export (sink) to BigQuery, which lets you query your logs easily. The problem with this approach is that it does not export old data collected before the export was created.
  2. You can use the gcloud command, specifically

    gcloud logging read

It is very powerful (supports filters and timestamps), but its output format is YAML. You can install the PyYAML library and use it to convert the logs to dictionaries.
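The second workaround can be sketched as follows. Note that gcloud also supports --format=json, which lets you parse the output with the standard library instead of PyYAML; the filter, project name, and sample output below are made up for illustration:

```python
import json
import subprocess

# Hypothetical filter and project name, matching the question's setup.
cmd = [
    "gcloud", "logging", "read",
    'logName="projects/my_project/logs/compute.googleapis.com%2Factivity_log"',
    "--limit", "500",
    "--format", "json",  # gcloud can emit JSON as well as YAML
]
# entries = json.loads(subprocess.check_output(cmd))  # requires gcloud installed

# Offline, the parsing step looks like this on a captured sample:
sample_output = '[{"insertId": "abc123", "logName": "projects/my_project/logs/activity"}]'
entries = json.loads(sample_output)
print(entries[0]["insertId"])
```

Each parsed entry is then a plain dict, so the rest of the script can index into it normally.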




Answer 2:


The LogEntry.proto_payload is an Any message, which wraps some other protocol buffer message. The type of the wrapped message is indicated by type_url, and the body of the message is serialized into the value field. After identifying the type, you can deserialize it with something like

from google.cloud.audit import AuditLog
...

audit_log = AuditLog()
audit_log.ParseFromString(entry.payload_pb.value)

The AuditLog message definition is available at https://github.com/googleapis/googleapis/blob/master/google/cloud/audit/audit_log.proto and the corresponding Python definitions can be built with the protoc compiler.

Note that some fields of the AuditLog message can contain other Any messages too. There are more details at https://cloud.google.com/logging/docs/audit/api/
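Since type_url is just "type.googleapis.com/" followed by the fully qualified message name, you can dispatch on the wrapped type with a plain string check before deserializing. A minimal sketch (the helper name and URL value are illustrative):

```python
def message_type(type_url: str) -> str:
    # The part after the first "/" is the fully qualified proto message name.
    return type_url.split("/", 1)[1]

url = "type.googleapis.com/google.cloud.audit.AuditLog"
if message_type(url) == "google.cloud.audit.AuditLog":
    # Only now is it safe to parse the value field as an AuditLog:
    # audit_log = AuditLog()
    # audit_log.ParseFromString(entry.payload_pb.value)
    print("deserialize as AuditLog")
```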




Answer 3:


In case anyone has the same issue that I had, here's how I solved it:

1) Download and install protobuf. I did this on a Mac with brew (brew install protobuf)
2) Download and install grpcio. I used pip install grpcio
3) Clone the "Google APIs" repository to a known directory. I used /tmp and this command: git clone https://github.com/googleapis/googleapis
4) Change to the root directory of the repository you downloaded in step 3
5) Use protoc to build the Python bindings. This command worked for me:
protoc -I=/tmp/googleapis/ --python_out=/tmp/ /tmp/googleapis/google/cloud/audit/audit_log.proto
6) Your audit_log_pb2.py file should now exist at /tmp/audit_log_pb2.py
7) Place this file on your Python path OR in the same directory as your script
8) Add this line to the imports in your script:
import audit_log_pb2
9) After I did this, the entry.payload portion of the protobuf entry was consistently populated with dicts

PLEASE NOTE: You should verify which version of protoc you are using with protoc --version. You really want protoc 3.x, because the file we are building is written against version 3 of the spec. The Ubuntu package I installed on a Linux box was version 2, which was rather frustrating. Also, although the generated file targets Python 2.x, it seems to work fine with Python 3.x.
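For step 7, one way to make the generated module importable without copying it next to your script is to extend the module search path. A small sketch, assuming the file was built into /tmp as in step 5:

```python
import sys

# Make /tmp (where protoc wrote audit_log_pb2.py) importable.
if "/tmp" not in sys.path:
    sys.path.append("/tmp")

# import audit_log_pb2  # now resolves, assuming step 5 succeeded
```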




Answer 4:


Actually I missed that, but you can disable gRPC and make the API return a dict (JSON) payload by setting the environment variable GOOGLE_CLOUD_DISABLE_GRPC to a non-empty string, e.g. GOOGLE_CLOUD_DISABLE_GRPC=true.

This populates payload instead of payload_pb, which is easier than compiling a protocol buffer that may be out of date!
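In code, the important detail is that the variable must be set before the client library is imported. A minimal sketch (the import lines are commented out because they require the google-cloud-logging package):

```python
import os

# Setting this to any non-empty string makes the library fall back to the
# HTTP/JSON transport, so entry.payload comes back as a dict.
os.environ["GOOGLE_CLOUD_DISABLE_GRPC"] = "true"

# The import must happen *after* the variable is set:
# from google.cloud.logging import Client
# client = Client(project="my_project")
```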




Answer 5:


I followed @rhinestone-cowguy's answer, but I think example usage will help people who find this answer. To use the compiled proto code:

from google.cloud import logging
import audit_log_pb2

client = logging.Client()
PROJECT_IDS = ["one-project", "another-project"]

for entry in client.list_entries(projects=PROJECT_IDS):  # API call(s)
    # The proto payload is an Any message.
    audit_log = audit_log_pb2.AuditLog()
    entry.payload.Unpack(audit_log)
    print(audit_log)

The use of the Any message is documented in Python Generated Code.



Source: https://stackoverflow.com/questions/50301632/using-python-to-query-gcp-stackdriver-logs
