How should I investigate a memory leak when using Google Cloud Datastore Python libraries?
**Question**

I have a web application which uses Google's Datastore and runs out of memory after enough requests. I have narrowed this down to a Datastore query. A minimal PoC is provided below; a slightly longer version which includes memory measurement is on Github.

```python
from google.cloud import datastore
from google.oauth2 import service_account

def test_datastore(entity_type: str) -> list:
    creds = service_account.Credentials.from_service_account_file("/path/to/creds")
    client = datastore.Client(credentials=creds)
    query = client.query(kind=entity_type)
    return list(query.fetch())
```
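For reference, a minimal sketch of one way to measure per-call memory growth using the standard-library `tracemalloc` module. The `leaky_call` function here is a hypothetical stand-in for the Datastore query (it simulates a leak by retaining data across calls); in practice you would pass the real query function to `measure_growth`:

```python
import tracemalloc

def measure_growth(fn, iterations: int = 5) -> list:
    """Call fn repeatedly, recording total traced memory after each call."""
    tracemalloc.start()
    sizes = []
    for _ in range(iterations):
        fn()
        current, _peak = tracemalloc.get_traced_memory()
        sizes.append(current)
    tracemalloc.stop()
    return sizes

# Hypothetical stand-in for the Datastore call: it "leaks" by
# appending to a module-level list that is never cleared.
_retained = []

def leaky_call():
    _retained.append(bytearray(100 * 1024))  # retain ~100 KiB per call

sizes = measure_growth(leaky_call)
print(sizes)  # steadily increasing numbers suggest a leak
```

If the reported sizes grow monotonically across iterations, something is retaining memory between calls; a flat profile after the first call suggests one-time allocation (caches, connection pools) rather than a leak.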