google-cloud-platform

How to build a Google Cloud serverless function for processing unprocessed data from Firebase

邮差的信 submitted on 2021-02-17 06:43:22
Question: I am having a problem creating a serverless Google Cloud Function. I need to write a Cloud Function in Python that runs a script to process raw, unprocessed Firebase data. I have already deployed it on GCP, but on triggering it throws the error "request cannot handle". I want to run the script serverless to avoid running it manually every day. Can anyone help me out with this? def process_session(self, session, utc_offset=0): s = {} try: edfbyte, analysis = process_session(session
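A frequent cause of opaque "cannot handle request" errors is an entry point whose signature does not match the trigger type: an HTTP-triggered Python Cloud Function must accept exactly one Flask request argument and return a response. A minimal sketch of what the deployed entry point could look like (`fetch_unprocessed_sessions` and this `process_session` stub are placeholders, not the asker's actual code):

```python
def fetch_unprocessed_sessions():
    """Hypothetical helper: in the real script this would query Firebase."""
    return [{"id": "s1"}, {"id": "s2"}]

def process_session(session, utc_offset=0):
    """Hypothetical stand-in for the asker's processing routine."""
    return {"id": session["id"], "processed": True, "utc_offset": utc_offset}

def process_unprocessed(request):
    """HTTP-triggered Cloud Functions entry point.

    Cloud Functions (Python) invokes the entry point with a single request
    argument and expects a response value back; deploying a function whose
    signature does not match the trigger is a common cause of opaque
    "cannot handle request" errors.
    """
    results = [process_session(s) for s in fetch_unprocessed_sessions()]
    return {"processed": len(results)}
```

The entry-point name passed at deploy time (e.g. `--entry-point process_unprocessed --trigger-http`) must match this function; for the daily run, pairing the HTTP trigger with Cloud Scheduler avoids manual triggering.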

Spark-HBase - GCP template (1/3) - How to locally package the Hortonworks connector?

此生再无相见时 submitted on 2021-02-17 06:30:36
Question: I'm trying to test the Spark-HBase connector in the GCP context and tried to follow [1], which asks to locally package the connector [2] using Maven (I tried Maven 3.6.3) for Spark 2.4, and this leads to the following issue. Error on "branch-2.4": [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project shc-core: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: NullPointerException -> [Help 1]
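The NullPointerException from scala-maven-plugin 3.2.2 is commonly a JDK-version problem rather than anything in the connector itself: that plugin version predates JDK 9 and is known to fail under newer JDKs. A possible workaround, assuming a JDK 8 installation is available at the path shown (adjust to your system):

```shell
# scala-maven-plugin 3.2.2 (used by the shc build) is known to NPE under
# JDK 9+; point Maven at a JDK 8 installation before packaging.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
mvn -version                  # should now report Java version 1.8.x
mvn clean package -DskipTests
```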

How to send images from ESP32 CAM to IoT Core?

杀马特。学长 韩版系。学妹 submitted on 2021-02-17 06:20:59
Question: I need the system to be secure. I tried to encode the image with base64 and send the string via MQTT to IoT Core, then decode the string with a Cloud Function and finally store the decoded image in Google Cloud Storage. The problem is the limited size of an MQTT message. Using a Cloud Function and then storing in Google Cloud Storage is not really secure: anyone could hit that URL, and I lose control of all the ESP32-CAM communication. Am I missing something? Is there a really secure way
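One common workaround for broker payload limits is to split the base64 string into numbered chunks, publish each as its own MQTT message, and reassemble on the receiving side. A sketch of that framing in Python (the "index/total|data" format is an invented convention for illustration, not anything IoT Core defines):

```python
import base64

CHUNK_SIZE = 2048  # characters per MQTT publish; pick below your broker's limit

def make_chunks(image_bytes, chunk_size=CHUNK_SIZE):
    """Split a base64-encoded image into ordered, MQTT-sized payloads.

    Each payload carries "index/total|data" so the subscriber can detect
    missing chunks and reassemble them in order.
    """
    encoded = base64.b64encode(image_bytes).decode("ascii")
    parts = [encoded[i:i + chunk_size]
             for i in range(0, len(encoded), chunk_size)]
    return [f"{i}/{len(parts)}|{p}" for i, p in enumerate(parts)]

def reassemble(payloads):
    """Reorder received chunks by index and decode back to raw bytes."""
    ordered = sorted(payloads,
                     key=lambda m: int(m.split("|", 1)[0].split("/")[0]))
    encoded = "".join(m.split("|", 1)[1] for m in ordered)
    return base64.b64decode(encoded)
```

The subscriber can check for a missing index before decoding. IoT Core's telemetry limit (256 KB per message) is fairly generous, but smaller chunks also keep ESP32 memory use low; and the decoded images can stay in a private bucket read through signed URLs rather than a public endpoint.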

BigQuery automatically converts timestamp timezone to UTC

给你一囗甜甜゛ submitted on 2021-02-17 03:21:26
Question: I have a table as such: and a file as such: https://storage.googleapis.com/test_share_file/testTimestamp.csv which looks like: and I load the file into BigQuery using Python as such: from google.cloud import bigquery as bq gs_path = 'gs://test_share_file/testTimestamp.csv' bq_client = bq.Client.from_service_account_json(gcp_creds_fp) ds = bq_client.dataset('test1') tbl = ds.table('testTimestamp') job_config = bq.LoadJobConfig() job_config.write_disposition = bq.job.WriteDisposition.WRITE
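For context on the behaviour itself: BigQuery's TIMESTAMP type always represents an absolute instant, stored and displayed in UTC, so a CSV value with no timezone offset is read as if it were already UTC. If the CSV values are local wall-clock times, one option is to attach the real offset before loading; a sketch:

```python
from datetime import datetime, timedelta, timezone

def to_bq_timestamp(local_str, utc_offset_hours):
    """Convert a naive local timestamp string into the UTC instant that
    BigQuery will store.

    BigQuery TIMESTAMP is an absolute point in time in UTC; attaching the
    real offset before loading preserves the intended instant instead of
    letting the loader read the naive value as UTC.
    """
    naive = datetime.strptime(local_str, "%Y-%m-%d %H:%M:%S")
    local = naive.replace(tzinfo=timezone(timedelta(hours=utc_offset_hours)))
    return local.astimezone(timezone.utc).strftime("%Y-%m-%d %H:%M:%S%z")

# A wall-clock time of 12:00 at UTC+8 is the instant 04:00 UTC:
# to_bq_timestamp("2021-02-17 12:00:00", 8) -> "2021-02-17 04:00:00+0000"
```

Alternatively, declare the column as DATETIME to keep the wall-clock value with no zone at all, or keep TIMESTAMP and convert at query time with `DATETIME(ts, "Your/Timezone")`.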

Unexpected error when loading the model: problem in predictor - ModuleNotFoundError: No module named 'torchvision'

一世执手 submitted on 2021-02-16 21:30:45
Question: I've been trying to deploy my model to AI Platform Prediction through the console on my VM instance, but I've gotten the error "(gcloud.beta.ai-platform.versions.create) Create Version failed. Bad model detected with error: "Failed to load model: Unexpected error when loading the model: problem in predictor - ModuleNotFoundError: No module named 'torchvision' (Error code: 0)". I need to include both torch and torchvision. I followed the steps in this question Cannot deploy trained
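The usual fix for a missing module in a custom prediction routine is to declare it in the setup.py that is packaged and passed to the version via `--package-uris`, so AI Platform installs it when the version is created. A sketch (package name, file names, and the pinned version are illustrative; match the torchvision version to the torch version the model was trained with):

```python
# setup.py packaged alongside the custom prediction routine.
from setuptools import setup

setup(
    name="my_predictor",       # hypothetical package name
    version="0.1",
    scripts=["predictor.py"],  # the custom Predictor implementation
    install_requires=[
        # torchvision pulls in a matching torch as a dependency; pin the
        # version that corresponds to the torch build used for training.
        "torchvision==0.5.0",
    ],
)
```

If the default torch wheel pushes the version over the runtime's size limit, the linked question's approach of passing a CPU-only torch wheel URL in `--package-uris` alongside this package is the common workaround.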

Cannot connect to Mongo Atlas using VPC peering from GCP cluster

天大地大妈咪最大 submitted on 2021-02-15 05:32:11
Question: I am trying to connect a Java app running on a GCP Kubernetes Engine cluster with a MongoDB Atlas cluster (M20). It worked fine before, when I didn't have VPC peering turned on and was using the regular connection string. But I am trying to use VPC peering now, with the default VPC network in my GCP project. I followed the steps in https://docs.atlas.mongodb.com/security-vpc-peering/. I chose an Atlas CIDR of 192.168.0.0/18 (because "The Atlas CIDR block must be at least a /18"), and after linking
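Two things commonly bite once peering is up: the app must use Atlas's peering ("private") hostnames rather than the public ones, and the cluster's pod/node CIDR must be whitelisted in Atlas. Atlas publishes the private records under a `-pri` suffix on the cluster label; a small helper to illustrate the naming (the `<cluster>.<hash>.mongodb.net` shape is the usual Atlas pattern, assumed here):

```python
def peering_hostname(standard_host):
    """Derive the peering-only ("private") Atlas hostname from the
    standard one.

    With VPC peering active, Atlas also serves records whose cluster label
    carries a "-pri" suffix; connections to those resolve to the peered
    private addresses instead of the public ones.
    """
    cluster, rest = standard_host.split(".", 1)
    return f"{cluster}-pri.{rest}"

# peering_hostname("cluster0.ab1cd.mongodb.net")
#   -> "cluster0-pri.ab1cd.mongodb.net"
```

Beyond the hostname, it is worth confirming that the peering routes were actually created on the GCP side and that firewall rules allow egress from the GKE cluster's CIDR to 192.168.0.0/18.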

MongoDB and Google Cloud Functions VPC Peering?

非 Y 不嫁゛ submitted on 2021-02-13 17:33:23
Question: I'm having issues accessing MongoDB Atlas from Google Cloud Functions. It gives me an error regarding IP whitelisting, but I've added both the Serverless VPC Access IP address range and the VPC Network Peering IP address range to the MongoDB whitelist. I've also created MongoDB peering with Google Cloud. If I allow access from anywhere, then MongoDB works fine; otherwise it gives an error regarding IP whitelisting. I'm not sure what else I should add to the MongoDB whitelist when I've added both
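A detail that is easy to miss: whitelisting alone isn't enough, because Cloud Functions only sends traffic into the peered VPC when the function is attached to a Serverless VPC Access connector and its egress setting routes the Atlas traffic through it. A sketch of the relevant deploy flags (function, runtime, and connector names are placeholders):

```shell
# Attach the function to the VPC connector and route private-range
# traffic (which 192.168.0.0/18 is) through it.
gcloud functions deploy myFunction \
  --runtime python39 \
  --trigger-http \
  --vpc-connector my-connector \
  --egress-settings private-ranges-only
```

With this routing in place, the source address Atlas sees is the connector's subnet range, so that range is the one that actually needs to be on the whitelist.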

Google App Script - “message”: “Requested entity was not found.” with devMode = false

一个人想着一个人 submitted on 2021-02-11 18:24:44
Question: I am trying to connect my Python script with my project in Google Apps Script. I have followed all the instructions in this guide. I have of course deployed it as an executable API and have tested it with the "only myself", "my organization", and "anyone" access options. When I pass the request with devMode as true, it all works fine. I understand that in this case it is running the latest saved version. However, when I set it to false, I get back the error "message": "Requested entity was not
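With devMode false, the API runs the deployed API-executable version, so "Requested entity was not found" usually means no deployment is visible to the caller: either a new version was never deployed after saving, or the OAuth client making the call doesn't live in the same Cloud project as the script. The request body itself only differs in the flag; a sketch (function and parameter names are placeholders):

```python
def build_run_request(function_name, parameters, dev_mode=False):
    """Build the body for an Apps Script API scripts.run call.

    With devMode=False the API executes the latest *deployed*
    API-executable version, so a new deployment (not just a save) must
    exist, and the OAuth client must belong to the same Cloud project as
    the Apps Script project; otherwise the API answers
    "Requested entity was not found".
    """
    return {
        "function": function_name,
        "parameters": parameters,
        "devMode": dev_mode,  # False -> run the deployed version
    }

# With google-api-python-client this body would be passed as, e.g.:
#   service.scripts().run(scriptId=SCRIPT_ID, body=body).execute()
```

After changing the script, redeploying the API executable as a new version is what makes devMode=false pick up the changes; saving alone only updates what devMode=true runs.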
