google-cloud-storage

How to load data into a Jupyter notebook VM from Google Cloud?

纵然是瞬间 submitted on 2020-08-08 18:46:06
Question: I am trying to load a bunch of CSV files stored on my Google Cloud into my Jupyter notebook. I use Python 3, and gsutil does not work. Let's assume I have 6 .csv files in '\bucket1\1'. Does anybody know what I should do? Answer 1: You are running a Jupyter Notebook on a Google Cloud VM instance, and you want to load 6 .csv files (that you currently have on your Cloud Storage) into it. Install the dependencies:

    pip install google-cloud-storage
    pip install pandas

Run the following script on your
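The excerpt cuts off before the script itself. A minimal sketch of such a script, assuming the bucket is named bucket1, the files sit under the 1/ prefix, and the VM's default credentials can read the bucket (all names here are placeholders):

    # Download every .csv object under a prefix and load each one into pandas.
    from google.cloud import storage
    import pandas as pd

    client = storage.Client()                        # uses the VM's default credentials
    frames = []
    for blob in client.list_blobs("bucket1", prefix="1/"):   # placeholder bucket/prefix
        if blob.name.endswith(".csv"):
            local_path = blob.name.split("/")[-1]
            blob.download_to_filename(local_path)    # copy the object to the VM's disk
            frames.append(pd.read_csv(local_path))

    df = pd.concat(frames, ignore_index=True)        # one combined DataFrame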

BigQuery: use conditions to create a table from other tables (manage a large number of columns)

夙愿已清 submitted on 2020-08-08 05:15:35
Question: I am facing an issue related to a project of mine. Here is a summary of what I would like to do: I have a big daily file (100 GB) with the following extract (no header):

    ID_A|segment_1
    ID_A|segment_2
    ID_B|segment_2
    ID_B|segment_3
    ID_B|segment_4
    ID_B|segment_5
    ID_C|segment_1
    ID_D|segment_2
    ID_D|segment_4

Every ID (from A to D) can be linked to one or multiple segments (from 1 to 5). I would like to process this file in order to have the following result (the result file contains a header)
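The desired result is cut off in the excerpt, but a common target shape for this kind of input is one row per ID with one 0/1 flag column per segment. A minimal sketch of that pivot with the google-cloud-bigquery client, assuming the raw file has been loaded into a two-column table (all project, dataset, table, and column names are placeholders):

    from google.cloud import bigquery

    client = bigquery.Client()

    # One row per ID; MAX(IF(...)) turns each segment into a 0/1 flag column.
    query = """
    SELECT
      id,
      MAX(IF(segment = 'segment_1', 1, 0)) AS segment_1,
      MAX(IF(segment = 'segment_2', 1, 0)) AS segment_2,
      MAX(IF(segment = 'segment_3', 1, 0)) AS segment_3,
      MAX(IF(segment = 'segment_4', 1, 0)) AS segment_4,
      MAX(IF(segment = 'segment_5', 1, 0)) AS segment_5
    FROM `my-project.my_dataset.daily_file`
    GROUP BY id
    """
    for row in client.query(query).result():
        print(dict(row))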

How to properly authorize requests to the Google Cloud Storage API?

我与影子孤独终老i submitted on 2020-08-08 03:41:51
Question: I am trying to use the Google Cloud Storage JSON API to retrieve files from a bucket using HTTP calls. I am curling from a container in GCE within the same project as the storage bucket, and the service account has read access to the bucket. Here is the pattern of the requests: https://storage.googleapis.com/{bucket}/{object} According to the API console, I don't need anything particular, as the service account provides Application Default Credentials. However, I keep having this: Anonymous
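The error message is truncated above, but a response about an "anonymous" caller is the usual symptom of a missing Authorization header: Application Default Credentials are not attached automatically to a raw curl call. A minimal sketch of an authenticated request, assuming the code runs on a GCE instance with a service account (bucket and object names are placeholders):

    import requests

    # Ask the GCE metadata server for an access token for the instance's service account.
    token = requests.get(
        "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token",
        headers={"Metadata-Flavor": "Google"},
    ).json()["access_token"]

    # Without the Bearer token, the request is treated as anonymous.
    resp = requests.get(
        "https://storage.googleapis.com/your-bucket/your-object",  # placeholder names
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    print(len(resp.content), "bytes downloaded")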

My GCP project is automatically creating storage buckets

不想你离开。 submitted on 2020-08-05 19:21:05
Question: I have deployed a Node.js API on Google App Engine which uses Cloud Storage for storing files. I had created a bucket, which bore the title <my-project-id>.appspot.com. After a few days I have come to realize that there are 2 more storage buckets which were created without my knowledge, with the titles staging.<my-project-id>.appspot.com and <zone-name>.artifacts.<my-project-id>.appspot.com. Why are these buckets getting created? Are those meant for backups? Am I being charged for those
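The excerpt ends before any answer, but these buckets are created by App Engine itself during deployment: the staging bucket holds uploaded source files and the artifacts bucket holds built container images, and both are billed at ordinary Cloud Storage rates. One common way to cap the staging bucket's cost is a lifecycle rule that deletes old objects; a minimal sketch with google-cloud-storage (the 7-day age is an arbitrary placeholder, and the artifacts bucket is generally best left alone since its images back your deployed versions):

    from google.cloud import storage

    client = storage.Client()
    bucket = client.get_bucket("staging.my-project-id.appspot.com")  # placeholder project id

    # Append a delete-after-7-days rule to the bucket's lifecycle and persist it.
    bucket.add_lifecycle_delete_rule(age=7)
    bucket.patch()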

Upload a file to Google Cloud Storage with Node.js

筅森魡賤 submitted on 2020-08-01 17:41:20
Question: My upload.js file contains the following code:

    module.exports = {
      up: function () {
        // Pre-2.0 @google-cloud/storage API: the module itself is a factory function.
        const storage = require('@google-cloud/storage');
        const fs = require('fs');  // required but unused here

        const gcs = storage({
          projectId: 'MY_PROJECT_ID',
          keyFilename: './service-account.json'
        });

        var bucket = gcs.bucket('MY_BUCKET');

        // Upload a local file into the bucket; surface any error.
        bucket.upload('picture.jpg', function (err, file) {
          if (err) throw new Error(err);
        });
      },
    };

It works through the terminal, but how do I call it on a form submission button click, or just from a different file?
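The excerpt cuts off before any answer. Since up() is already exported, one straightforward approach is to require the module in whatever server-side handler receives the form submission; a minimal Express sketch, kept in the question's own language (route path, port, and module path are placeholders):

    // server.js (sketch): run the exported up() when a form posts to /upload.
    const express = require('express');
    const uploader = require('./upload.js');

    const app = express();

    app.post('/upload', (req, res) => {
      uploader.up();                    // same code path as the terminal invocation
      res.send('upload started');
    });

    app.listen(3000);

Note that this only triggers the existing upload of the local picture.jpg; accepting a file sent from the browser's form would additionally need multipart parsing (e.g. a middleware such as multer).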

How to connect to a private storage bucket using the Google Colab TPU

半世苍凉 submitted on 2020-07-30 21:35:42
Question: I am using Google Colab Pro and the provided TPU. I need to upload a pre-trained model into the TPU. The TPU can load data only from a Google Cloud Storage bucket. I created a Cloud Storage bucket and extracted the pre-trained model files into the bucket. Now I need to give the TPU permission to access my private bucket, but I don't know the service account of the TPU. How do I find it? For now I just have All:R read permission on the bucket, and the TPU initialized successfully, but clearly this
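The excerpt ends mid-sentence, but once the TPU's service account is known, the lock-down step itself is just an IAM grant: give that account object-reader access on the bucket and drop the public All:R permission. A minimal sketch with google-cloud-storage (the service-account address below is a deliberate placeholder, not the real TPU identity):

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("your-model-bucket")  # placeholder bucket name

    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        "role": "roles/storage.objectViewer",
        # Placeholder: substitute the TPU's actual service account here.
        "members": {"serviceAccount:tpu-sa@example.iam.gserviceaccount.com"},
    })
    bucket.set_iam_policy(policy)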

How do I find the owner of a Google Cloud Storage object

独自空忆成欢 submitted on 2020-07-23 12:09:11
Question: My App Engine app is run by a service account, and it writes files to a bucket. How can I see the service account as the owner of the file objects? Answer 1: You can use the gsutil ls command with the -L option:

    gsutil ls -L gs://your-bucket/your-object

This will print the entities which have permissions on the object. One of them will be the service account which created the object:

    {
      "email": "your-service-account@appspot.gserviceaccount.com",
      "entity": "user-your-service-account@appspot.gserviceaccount.com
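For reference, the same owner field can also be read from Python with google-cloud-storage, on buckets that use fine-grained (ACL) access control; a minimal sketch (bucket and object names are placeholders):

    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket("your-bucket").blob("your-object")  # placeholder names

    # The 'full' projection includes the ACL/owner fields that a default fetch omits.
    blob.reload(projection="full")
    print(blob.owner)  # e.g. {'entity': 'user-...', 'entityId': '...'}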
