google-cloud-storage

How do I shorten Google Cloud Storage Signed download URLs?

别等时光非礼了梦想 · Submitted on 2020-08-27 21:49:40
Question: I have a Firebase app, and I use Firebase Storage to upload images. The URLs I get back when I use the Firebase web SDK to upload are reasonable: https://firebasestorage.googleapis.com/v0/b/projectId.appspot.com/o/image.jpg?alt=media&token=51183d4a-551a-41e2-b620-14b44b8c86ed However, since Firebase doesn't support the Storage API in its Node.js SDK, I have to use the Google Cloud Storage SDK: bucketRef.upload(localImagePath, options, (err, file, response) => { file.getSignedUrl({ action:
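For context on why these URLs are hard to shorten: a V4 signed URL carries the credential scope, expiry, and HMAC signature in its query string, so the length is inherent to the scheme — the usual workaround is serving a short redirect from your own domain. Below is a minimal Python sketch of generating such a URL with the `google-cloud-storage` client (the bucket and object names are placeholders taken from the question; requires `pip install google-cloud-storage` and service-account credentials able to sign):

```python
from datetime import timedelta


def make_signed_download_url(bucket_name: str, blob_name: str, minutes: int = 15) -> str:
    """Generate a V4 signed GET URL for a GCS object.

    Names are hypothetical; needs `pip install google-cloud-storage`
    and a service account with signing permission.
    """
    from google.cloud import storage  # imported lazily: optional dependency

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    # The signature, credential scope, and expiry all live in the query
    # string, which is why signed URLs are unavoidably long.
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=minutes),
        method="GET",
    )


# Usage (would contact GCP; bucket/object are placeholders):
# url = make_signed_download_url("projectId.appspot.com", "image.jpg")
```

A shorter alternative, as the question's first URL shows, is a Firebase download token URL, which stays valid until the token is revoked rather than expiring.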

CORS policy with Google Storage allows from my origin, but no 'Access-Control-Allow-Origin' header is present

橙三吉。 · Submitted on 2020-08-10 20:02:31
Question: I'm new to CORS configuration and trying to figure this out, but my setup looks right according to the documentation. I'm hoping you can help me see what I've missed. My code is trying to upload (PUT) a file directly to Google Storage using a signed URL. Access to XMLHttpRequest at 'https://storage.googleapis.com/herdboss-dev.appspot.com/uploads/152/152-owner-152-61.jpg?X-Goog-Algorithm=GOOG4-RSA-SHA256&X-Go...' from origin 'https://herdboss-dev.appspot.com' has been blocked by
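For reference, a bucket's CORS policy is just metadata on the bucket, and the browser's preflight must find the exact requesting origin and the PUT method in one of its rules. A sketch of building and applying such a rule with the Python client (bucket name and origin are taken from the question as placeholders; the dict shape mirrors the JSON accepted by `gsutil cors set`):

```python
def cors_rules(origins):
    """Pure helper: build a CORS rule list for a GCS bucket (no API calls).

    The dict shape mirrors the JSON accepted by `gsutil cors set`.
    """
    return [{
        "origin": list(origins),
        "method": ["GET", "PUT"],          # PUT is what a signed-URL upload uses
        "responseHeader": ["Content-Type"],
        "maxAgeSeconds": 3600,
    }]


def apply_cors(bucket_name, origins):
    """Apply the rules to a bucket. Hypothetical invocation; needs
    `pip install google-cloud-storage` and credentials."""
    from google.cloud import storage  # lazy import: optional dependency

    client = storage.Client()
    bucket = client.get_bucket(bucket_name)
    bucket.cors = cors_rules(origins)
    bucket.patch()  # persist the new CORS metadata


# apply_cors("herdboss-dev.appspot.com", ["https://herdboss-dev.appspot.com"])
```

One common pitfall with this error: CORS changes can take a few minutes to propagate, and the preflight response is cached by the browser for up to `maxAgeSeconds`.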

Drop Hive Table & msck repair fails with Table stored in google cloud bucket

谁都会走 · Submitted on 2020-08-10 03:38:32
Question: I am creating a Hive table in a Google Cloud bucket using the SQL statement below. CREATE TABLE schema_name.table_name (column1 decimal(10,0), column2 int, column3 date) PARTITIONED BY(column7 date) STORED AS ORC LOCATION 'gs://crazybucketstring/' TBLPROPERTIES('ORC.COMPRESS'='SNAPPY'); Then I loaded data into the table with the distcp command. Now when I try to drop the table it fails with the error message below; even dropping the empty table fails. hive>>DROP TABLE schema_name.table_name; **Error:**
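The error text is truncated above, but one frequent cause of a failing DROP on GCS-backed tables is that LOCATION points at the bucket root, which DROP TABLE then tries to delete. A hedged sketch (the subdirectory path is hypothetical; only the bucket name comes from the question):

```sql
-- Sketch: same DDL as above, but LOCATION names a subdirectory rather than
-- the bucket root 'gs://crazybucketstring/', which DROP TABLE cannot delete.
CREATE TABLE schema_name.table_name (
  column1 DECIMAL(10,0),
  column2 INT,
  column3 DATE
)
PARTITIONED BY (column7 DATE)
STORED AS ORC
LOCATION 'gs://crazybucketstring/warehouse/table_name/'
TBLPROPERTIES ('orc.compress' = 'SNAPPY');

-- If the existing table's metadata is already stuck, marking it external
-- lets DROP remove the metastore entry without touching the bucket data:
-- ALTER TABLE schema_name.table_name SET TBLPROPERTIES ('EXTERNAL' = 'TRUE');
-- DROP TABLE schema_name.table_name;
```

Note also that the documented ORC property key is lowercase `orc.compress`; the uppercase `ORC.COMPRESS` in the original DDL may silently fail to enable Snappy compression.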

Connect Apache Drill to Google Cloud

╄→尐↘猪︶ㄣ · Submitted on 2020-08-09 09:01:44
Question: How do I connect Google Cloud buckets to Apache Drill? I want to connect Apache Drill to Google Cloud Storage buckets and fetch data from the files stored in those buckets. I can specify an access ID and key in core-site.xml in order to connect to AWS. Is there a similar way to connect Drill to Google Cloud? Answer 1: I found the answer here useful: Apache Drill using Google Cloud Storage. On Google Cloud Dataproc you can set it up with an initialization action as in the answer above. There's
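The GCS analogue of those AWS keys is the Hadoop GCS connector: put the `gcs-connector` jar on Drill's classpath and configure core-site.xml. A sketch of the relevant fragment (property names are the connector's documented ones; the keyfile path is a placeholder):

```xml
<!-- core-site.xml fragment: Hadoop GCS connector settings. The
     gcs-connector jar must be on Drill's classpath; the keyfile
     path is a placeholder. -->
<property>
  <name>fs.gs.impl</name>
  <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem</value>
</property>
<property>
  <name>fs.AbstractFileSystem.gs.impl</name>
  <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS</value>
</property>
<property>
  <name>google.cloud.auth.service.account.enable</name>
  <value>true</value>
</property>
<property>
  <name>google.cloud.auth.service.account.json.keyfile</name>
  <value>/path/to/service-account-key.json</value>
</property>
```

With this in place, Drill storage plugins can point at `gs://bucket/` paths the same way they point at `s3a://` ones.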

How to load data into a Jupyter notebook VM from Google Cloud?

送分小仙女□ · Submitted on 2020-08-08 18:51:40
Question: I am trying to load a bunch of CSV files stored on my Google Cloud into my Jupyter notebook. I use Python 3 and gsutil does not work. Let's assume I have 6 .csv files in '\bucket1\1'. Does anybody know what I should do? Answer 1: You are running a Jupyter Notebook on a Google Cloud VM instance, and you want to load 6 .csv files (that you currently have on your Cloud Storage) into it. Install the dependencies: pip install google-cloud-storage pip install pandas Run the following script on your
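The answer's script is cut off above; a sketch along the same lines, listing the objects under a prefix, downloading the CSVs, and reading them with pandas (bucket and prefix are placeholders based on the question; requires the two `pip install` dependencies named in the answer, plus credentials on the VM):

```python
import os


def is_csv(blob_name):
    """Pure helper: does this object name look like a CSV file?"""
    return blob_name.lower().endswith(".csv")


def load_csvs(bucket_name, prefix, dest_dir="."):
    """Download every .csv under `prefix` and load each into a DataFrame.

    Bucket/prefix are placeholders; needs `pip install google-cloud-storage
    pandas` and credentials available on the VM.
    """
    from google.cloud import storage  # lazy imports: optional dependencies
    import pandas as pd

    client = storage.Client()
    frames = {}
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        if not is_csv(blob.name):
            continue
        local_path = os.path.join(dest_dir, os.path.basename(blob.name))
        blob.download_to_filename(local_path)  # copy the object to local disk
        frames[blob.name] = pd.read_csv(local_path)
    return frames


# frames = load_csvs("bucket1", "1/")  # e.g. the six files under bucket1/1
```

Note the prefix uses forward slashes: GCS object names are `/`-separated even though the question writes the path Windows-style.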
