gcp

Google Cloud Storage signed URL chunked download in Python?

Submitted by 会有一股神秘感。 on 2019-12-11 17:37:48
Question: I'd like to do the following functionality:

    service = get_service('storage', 'v1', auth)
    data = BytesIO()
    request = service.objects().get_media(bucket=bucket, object=filename)
    media = MediaIoBaseDownload(data, request, chunksize=chunksize)

But with a signed URL instead of a bucket and object. How do I chunk-download a signed URL in Python?

Answer 1: You can create a signed URL either using gsutil or by writing your own code to generate it. I have tested it using this:

    gsutil signurl -d 10m my-key
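The answer is cut off above. As a minimal sketch (not part of the original answer): a signed URL is just an authenticated HTTPS GET endpoint, so one way to download it in chunks is with standard HTTP Range requests via the requests library. The signed_url and chunk_size values below are placeholders.

    # Hedged sketch: chunked download of a GCS signed URL with HTTP Range
    # requests. GCS generally honours standard Range headers on GET.
    import requests
    from io import BytesIO

    def download_signed_url_in_chunks(signed_url, chunk_size=1024 * 1024):
        data = BytesIO()
        start = 0
        while True:
            end = start + chunk_size - 1
            resp = requests.get(signed_url,
                                headers={'Range': 'bytes=%d-%d' % (start, end)})
            if resp.status_code == 416:
                break  # requested range starts past the end of the object
            resp.raise_for_status()
            data.write(resp.content)
            if resp.status_code == 200 or len(resp.content) < chunk_size:
                break  # server sent the whole object, or this was the last chunk
            start += chunk_size
        data.seek(0)
        return data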

Datastore queries to retrieve multiple values in GCP Node.js

Submitted by 两盒软妹~` on 2019-12-11 05:46:34
Question: How do I retrieve multiple values for a single field in Datastore with Node.js? Code:

    const query = datastore.createQuery('Task')
      .filter('user_id', '=', [1, 2, 3])
      .order('priority', { descending: true });

This is not working. I need a query something like this:

    select userName from Table where user_id in (1, 2, 3);

Answer 1: You can query to retrieve multiple values like the following. If you have an array of multiple Datastore IDs, such as [1, 2, 3, 4], you can query with the gstore-node package: var userData =

GCP Compute Engine Firewall Rules for TCP Server

Submitted by 你离开我真会死。 on 2019-12-10 21:14:47
Question: I have created a GCP Compute Engine instance with a static external IP address. Machine type: n1-standard-2 (2 vCPUs, 7.5 GB memory). OS is Linux/Debian. My intention is to create a plain Node.js TCP server on the machine. The code is as follows:

    var net = require('net');
    var HOST = '0.0.0.0';
    var PORT = 110;
    net.createServer(function(sock) {
      console.log('CONNECTED: ' + sock.remoteAddress + ':' + sock.remotePort);
      sock.on('data', function(data) {
        console.log('DATA ' + sock.remoteAddress + ': '

Unable to re-enable Google Cloud AppEngine Application

Submitted by 老子叫甜甜 on 2019-12-10 18:36:55
Question: I used to run a Node.js Flexible Environment App Engine application on my project and later disabled it. Now I am trying to deploy a Java-based Standard Environment App Engine application to the same project. When I tried to deploy, I got the following error:

    [INFO] GCLOUD: ERROR: (gcloud.app.deploy) Unable to deploy to application [project_name] with status [USER_DISABLED]: Deploying to stopped apps is not allowed.

Then I realised I had disabled App Engine in the past. Now when I go to App Engine > Settings

How to read a BigQuery table using Python pipeline code in GCP Dataflow

Submitted by 巧了我就是萌 on 2019-12-10 17:16:57
Question: Could someone please share the syntax to read/write a BigQuery table in a pipeline written in Python for GCP Dataflow?

Answer 1: Run on Dataflow. First, construct a Pipeline with the following options for it to run on GCP Dataflow:

    import apache_beam as beam

    options = {'project': <project>,
               'runner': 'DataflowRunner',
               'region': <region>,
               'setup_file': <setup.py file>}
    pipeline_options = beam.pipeline.PipelineOptions(flags=[], **options)
    pipeline = beam.Pipeline(options=pipeline_options)

Read from
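The answer is truncated after "Read from". As a hedged sketch (not the original answer's code), current Apache Beam Python SDKs expose beam.io.ReadFromBigQuery and beam.io.WriteToBigQuery; the query, table names, and schema below are placeholders, and running a query read on Dataflow typically also needs a GCS temp_location in the pipeline options.

    # Hedged sketch: read from and write to BigQuery with the Beam Python SDK.
    read = (pipeline
            | 'ReadTable' >> beam.io.ReadFromBigQuery(
                  query='SELECT name, value FROM `my-project.my_dataset.my_table`',
                  use_standard_sql=True))

    write = (read
             | 'WriteTable' >> beam.io.WriteToBigQuery(
                   'my-project:my_dataset.my_output_table',
                   schema='name:STRING,value:INTEGER',
                   write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                   create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))

    pipeline.run().wait_until_finish()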

How do I connect to a Dataproc cluster with Jupyter notebooks from Cloud Shell

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-12-08 04:05:51
Question: I have seen the instructions here https://cloud.google.com/dataproc/docs/tutorials/jupyter-notebook for setting up Jupyter notebooks with Dataproc, but I can't figure out how to alter the process in order to use Cloud Shell instead of creating an SSH tunnel locally. I have been able to connect to a Datalab notebook by running datalab connect vmname from Cloud Shell and then using the preview function. I would like to do something similar, but with Jupyter notebooks and a Dataproc cluster.

How to write dictionaries to BigQuery in Dataflow using Python

Submitted by 大兔子大兔子 on 2019-12-08 03:04:21
Question: I am trying to read a CSV from GCP Storage, convert it into dictionaries, and then write to a BigQuery table as follows:

    p | ReadFromText("gs://bucket/file.csv")
      | (beam.ParDo(BuildAdsRecordFn()))
      | WriteToBigQuery('ads_table', dataset='dds', project='doubleclick-2', schema=ads_schema)

where 'doubleclick-2' and 'dds' are an existing project and dataset, and ads_schema is defined as follows:

    ads_schema='Advertiser_ID:INTEGER,Campaign_ID:INTEGER,Ad_ID:INTEGER,Ad_Name:STRING,Click_through_URL
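The question is cut off before the resolution. As a minimal sketch under stated assumptions (the schema is shortened to a placeholder and the body of BuildAdsRecordFn is hypothetical): WriteToBigQuery expects each element of its input PCollection to be a dict whose keys match the schema field names, so the DoFn should yield one dict per CSV line rather than accumulating rows in a shared list.

    # Hedged sketch: a DoFn that yields one dict per CSV row, written straight
    # to BigQuery. Field names and the schema are placeholders.
    import apache_beam as beam
    from apache_beam.io import ReadFromText, WriteToBigQuery

    ads_schema = 'Advertiser_ID:INTEGER,Campaign_ID:INTEGER,Ad_ID:INTEGER,Ad_Name:STRING'

    class BuildAdsRecordFn(beam.DoFn):
        def process(self, element):
            fields = element.split(',')
            # Yield a single dict per input line; Beam gathers the yielded
            # elements into the output PCollection.
            yield {
                'Advertiser_ID': int(fields[0]),
                'Campaign_ID': int(fields[1]),
                'Ad_ID': int(fields[2]),
                'Ad_Name': fields[3],
            }

    with beam.Pipeline() as p:
        (p
         | ReadFromText('gs://bucket/file.csv', skip_header_lines=1)
         | beam.ParDo(BuildAdsRecordFn())
         | WriteToBigQuery('ads_table', dataset='dds',
                           project='doubleclick-2', schema=ads_schema))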

Firestore: Transactions giving permission denied

Submitted by 蹲街弑〆低调 on 2019-12-07 15:26:39
Question: We are accessing Firestore from our Java App Engine instance. Non-transactional requests are succeeding fine, but transactions are failing with the error:

    firestore: PERMISSION_DENIED: Missing or insufficient permissions

Example transaction:

    final long updatedValue = 15;
    Firestore db = firebaseManager.getFirestore();
    CollectionReference fooCollectionRef = db.collection(SOME_COLLECTION);
    DocumentReference fooDocumentRef = fooCollectionRef.document(fooId);
    final ApiFuture<Long> future = db

Google Cloud Stackdriver Monitor Compute Engine Disk Usage

Submitted by 不问归期 on 2019-12-07 02:24:16
Question: I have Google Compute Engine instances that have recently been up and running. I have explored Google Cloud Stackdriver for monitoring CPU usage, etc., and have installed the Stackdriver agent on one of the Compute Engine instances for testing. I have explored creating a new chart on the dashboard and tried various metrics, but I could not find any metric that shows the disk usage of my instance. Yes, there is a list of plugins supported by the Stackdriver agent to pump custom metrics, but I could not find any

Copy files from one Google Cloud Storage bucket to another using Apache Airflow

Submitted by 本秂侑毒 on 2019-12-06 15:27:33
Problem: I want to copy files from a folder in a Google Cloud Storage bucket (e.g. Folder1 in Bucket1) to another bucket (e.g. Bucket2). I can't find any Airflow operator for Google Cloud Storage that copies files.

Answer 1: I know this is an old question, but I found myself dealing with this task too. Since I'm using Google Cloud Composer, GoogleCloudStorageToGoogleCloudStorageOperator was not available in the current version. I managed to solve this issue by using a simple BashOperator:

    from airflow.operators.bash_operator import BashOperator

    with models.DAG(
            dag_name,
            schedule_interval=timedelta(days=1),
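The snippet above is truncated. A minimal sketch of how the BashOperator approach typically looks, assuming the DAG name, default_args, and bucket/folder paths are placeholders rather than the answerer's actual values:

    # Hedged sketch: copy a GCS folder between buckets from an Airflow DAG
    # by shelling out to gsutil (pre-installed on Cloud Composer workers).
    from datetime import datetime, timedelta

    from airflow import models
    from airflow.operators.bash_operator import BashOperator

    default_args = {'start_date': datetime(2019, 1, 1)}

    with models.DAG(
            'copy_gcs_folder',
            default_args=default_args,
            schedule_interval=timedelta(days=1)) as dag:

        copy_files = BashOperator(
            task_id='copy_gcs_folder',
            bash_command='gsutil -m cp -r gs://Bucket1/Folder1 gs://Bucket2/')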