google-cloud-storage

Google Cloud Storage support of S3 multipart upload

ぃ、小莉子 submitted on 2021-01-20 07:16:31

Question: Currently, I'm using GCS in "interoperability mode" to make it accept S3 API requests. Using the official multipart upload example (plus setting the appropriate endpoint), the first initiation POST request is:

POST /bucket/object?uploads HTTP/1.1
Host: storage.googleapis.com
Authorization: AWS KEY:SIGNATURE
Date: Wed, 07 Jan 2015 13:34:04 GMT
User-Agent: aws-sdk-java/1.7.5 Linux/3.13.0-43-generic Java_HotSpot(TM)_64-Bit_Server_VM/24.72-b04/1.7.0_72
Content-Type: application/x-www-form…
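At the time this question was asked, GCS's XML interoperability API did not accept the S3 multipart-upload initiation request; Google has since added multipart upload support to the XML API, so a recent S3 SDK pointed at the GCS endpoint can work. A minimal Python sketch of the setup, assuming boto3 and a GCS HMAC key (created under Cloud Storage > Settings > Interoperability); `make_gcs_s3_client` and `num_parts` are illustrative helpers, not part of either SDK:

```python
import math


def num_parts(total_size: int, part_size: int) -> int:
    """Number of parts needed to upload total_size bytes in part_size chunks."""
    return math.ceil(total_size / part_size)


def make_gcs_s3_client(hmac_key_id: str, hmac_secret: str):
    """Build a boto3 S3 client pointed at the GCS XML interoperability endpoint.

    Requires boto3 and a GCS HMAC key; multipart calls (create_multipart_upload,
    upload_part, complete_multipart_upload) then target GCS instead of AWS.
    """
    import boto3  # imported here so num_parts works without boto3 installed
    return boto3.client(
        "s3",
        endpoint_url="https://storage.googleapis.com",
        aws_access_key_id=hmac_key_id,
        aws_secret_access_key=hmac_secret,
    )
```

With such a client, `client.create_multipart_upload(Bucket=..., Key=...)` issues exactly the `POST /bucket/object?uploads` request shown in the question, but against the GCS endpoint.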

GOOGLE_APPLICATION_CREDENTIALS must be defined. How do I set an environment variable for Java Spring on Heroku?

核能气质少年 submitted on 2021-01-07 02:48:40

Question: I am using the Firebase, Google Cloud Vision, and Cloud Storage APIs with Java Spring. I set the value of GOOGLE_APPLICATION_CREDENTIALS via Eclipse on my own computer. I am using Heroku for hosting, but I could not find out how to set this environment variable on the hosted server. Here is the error:

"Message": "An error has occurred.", "ExceptionMessage": "The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable…
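A common pattern on Heroku is to store the entire service-account JSON in a config var and materialize it to a file at application startup, since Heroku's filesystem is ephemeral and committing the key file to the repo is undesirable. Sketched here in Python for brevity; in the question's Java Spring app the analogous step would be reading the variable and passing a stream of its contents to `GoogleCredentials.fromStream`. `GOOGLE_CREDENTIALS` and `materialize_credentials` are illustrative names:

```python
import json
import os
import tempfile


def materialize_credentials(env_var: str = "GOOGLE_CREDENTIALS") -> str:
    """Write the service-account JSON stored in a config var to a temp file
    and point GOOGLE_APPLICATION_CREDENTIALS at it. Returns the file path."""
    raw = os.environ[env_var]
    json.loads(raw)  # fail fast if the config var does not hold valid JSON
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        f.write(raw)
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path
    return path
```

The config var itself would be set once with something like `heroku config:set GOOGLE_CREDENTIALS="$(cat key.json)"`, and `materialize_credentials()` called before any Google client library is initialized.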

Spark: unusually slow data write to Cloud Storage

大憨熊 submitted on 2021-01-07 01:24:25

Question: As the final stage of a PySpark job, I need to save 33 GB of data to Cloud Storage. My cluster is on Dataproc and consists of 15 n1-standard-v4 workers. I'm working with Avro, and this is the code I use to save the data:

df = spark.createDataFrame(df.rdd, avro_schema_str)
df \
    .write \
    .format("avro") \
    .partitionBy('<field_with_<5_unique_values>', '<field_with_lots_of_unique_values>') \
    .save(f"gs://{output_path}")

The write stage stats from the UI: [screenshot] My worker stats: [screenshot] Quite strangely for the adequate…
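A frequent cause of this symptom is the `partitionBy` on a high-cardinality column: each task then emits one small file per partition value it holds, and Cloud Storage writes drown in per-file overhead. A hedged sketch of the usual mitigation, repartitioning by the output-partition columns first so each file lands near a target size; `target_partitions` is an illustrative helper, and the commented PySpark lines use placeholder column names standing in for the question's placeholders:

```python
import math


def target_partitions(total_bytes: int,
                      target_file_bytes: int = 128 * 1024 * 1024) -> int:
    """Rough partition count so each output file is ~target_file_bytes."""
    return max(1, math.ceil(total_bytes / target_file_bytes))


# PySpark sketch (not executed here). Repartitioning by the same columns used
# in partitionBy groups each partition value into few tasks, so the job writes
# a handful of ~128 MB files instead of thousands of tiny ones:
#
# df.repartition(target_partitions(33 * 1024**3),
#                "low_card_field", "high_card_field") \
#   .write.format("avro") \
#   .partitionBy("low_card_field", "high_card_field") \
#   .save(f"gs://{output_path}")
```

For 33 GB this yields on the order of a couple hundred output files, which 15 workers can write in parallel without the small-file penalty.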

The caller does not have permission when attempting to use Google Cloud Storage within Cloud Run

强颜欢笑 submitted on 2021-01-05 12:57:43

Question: I'm attempting to get a Node project set up on Google Cloud Run with Cloud Storage. I am running into an authentication issue when using a service account I created. When creating the service account I successfully downloaded the JSON key and got everything running correctly in my local development environment. The issue is that once the application is deployed to Cloud Run, I get the following error: "Error: The caller does not have permission". This occurs when I am attempting to…
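On Cloud Run the usual fix is to not ship the JSON key at all: deploy the service with a runtime service account, grant that account access to the bucket, and let the client library pick up Application Default Credentials automatically. A sketch with placeholder names throughout; `storage_grant_cmd` merely assembles the gcloud invocation (the same binding can be made in the console), and the exact role to grant depends on what the service does:

```python
def storage_grant_cmd(bucket: str, service_account: str,
                      role: str = "roles/storage.objectAdmin") -> list:
    """Assemble the gcloud command that grants a Cloud Run runtime service
    account access to a bucket. All names here are placeholders."""
    return [
        "gcloud", "storage", "buckets", "add-iam-policy-binding",
        f"gs://{bucket}",
        f"--member=serviceAccount:{service_account}",
        f"--role={role}",
    ]


def make_client():
    """Inside Cloud Run, construct the client with no key file: Application
    Default Credentials resolve to the runtime service account."""
    from google.cloud import storage  # assumes google-cloud-storage installed
    return storage.Client()
```

The question's Node code would do the analogous thing: construct the Storage client with no `keyFilename`, so credentials come from the Cloud Run environment rather than a bundled key.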

Why does the Google Cloud Storage upload_from_file function throw a timeout error?

馋奶兔 submitted on 2021-01-04 18:40:50

Question: I created a function that works for me; it uploads a file to Google Cloud Storage. The problem is that when my friend tries to upload the same file to the same bucket, using the same code, from his local machine, he gets a timeout error. His internet connection is very good, and he should be able to upload the file with no problems. Any idea why this is happening?

def upload_to_cloud(file_path):
    """
    Saves a file in Google Storage. As Google requires audio files greater than 60 seconds…
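Two knobs that often resolve this on slower or flakier client connections are the per-request `timeout` argument to `upload_from_file` (the client's default is fairly short) and a smaller resumable `chunk_size`, which must be a multiple of 256 KiB. A sketch assuming the google-cloud-storage package; `aligned_chunk_size` is an illustrative helper, and bucket/blob names are placeholders:

```python
def aligned_chunk_size(requested_bytes: int) -> int:
    """Round a requested chunk size up to the 256 KiB multiple that the GCS
    client library requires for resumable uploads."""
    step = 256 * 1024
    return max(step, ((requested_bytes + step - 1) // step) * step)


def upload_to_cloud(file_path: str, bucket_name: str, blob_name: str):
    """Upload with smaller resumable chunks and a longer per-request timeout,
    so a slow link gets many short requests instead of one long one."""
    from google.cloud import storage  # deferred so the helper above imports cleanly
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    blob.chunk_size = aligned_chunk_size(5 * 1024 * 1024)  # 5 MiB chunks
    with open(file_path, "rb") as f:
        blob.upload_from_file(f, timeout=600)  # seconds per request
```

Smaller chunks mean each individual HTTP request finishes well inside the timeout even on the friend's slower effective upstream.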

Google Cloud Storage egress cost is high; how can I reduce it?

一个人想着一个人 submitted on 2021-01-04 05:38:51

Question: I have an Android/Ionic app running on 28 smart TVs. The app shows images and videos on the TVs, such as promotions and other informative banners. The images are stored in Google Cloud Storage: 78 images with a total size of 300 MB. The app has the URL of each image and shows it like an HTML page: <img src="googlecloudimageurl">. Every 10 seconds googlecloudimageurl changes to show the next image, like a slideshow, so every 10 seconds an image is downloaded from a Google Cloud Storage URL that is in the…
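With 28 TVs re-downloading unchanged images on every slideshow cycle, most of the egress is repeated transfers of the same bytes. Letting the clients (and any CDN placed in front of the bucket) cache the images via Cache-Control metadata can cut that sharply. A sketch assuming the google-cloud-storage package; the bucket name and max-age are placeholders, and `cache_control_header` is an illustrative helper:

```python
def cache_control_header(max_age_seconds: int) -> str:
    """Cache-Control value allowing TVs and intermediaries to reuse a
    downloaded image for max_age_seconds before re-fetching."""
    return f"public, max-age={max_age_seconds}"


def make_images_cacheable(bucket_name: str, max_age_seconds: int = 86400):
    """Set Cache-Control on every object in the bucket so repeat views are
    served from cache instead of billed egress."""
    from google.cloud import storage  # assumes google-cloud-storage installed
    client = storage.Client()
    for blob in client.list_blobs(bucket_name):
        blob.cache_control = cache_control_header(max_age_seconds)
        blob.patch()  # persist the metadata change
```

Since the image set is only 300 MB, an alternative worth weighing is having the app download the set once and cycle through local copies, re-fetching only when an image changes.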

Using Java to do *resumable uploads* using a *signed url* on google cloud storage

若如初见. submitted on 2021-01-01 06:58:04

Question: Based on the docs on how to create an object in google-cloud-storage (see the "create" method in https://googleapis.github.io/google-cloud-java/google-cloud-clients/apidocs/index.html), we should be using the blob.writer(...) method when uploading large files, as it presumably handles resumable uploads automatically. Is this right? However, if we wish to do resumable uploads with SIGNED urls, how does one do so in Java? (Any sample code or pointers would be very much…
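The usual flow for resumable uploads over signed URLs is: sign a POST URL that includes the `x-goog-resumable: start` header, POST to it to open an upload session, read the session URI from the `Location` response header, then PUT the bytes (optionally in chunks) to that URI. Sketched here in Python for brevity; the question asks about Java, where signing the URL with the same method and header via the client library's URL-signing API should support the same flow. `start_resumable_session` and `session_uri_from_headers` are illustrative, and the code assumes the google-cloud-storage and requests packages:

```python
import datetime


def session_uri_from_headers(headers: dict) -> str:
    """The resumable session URI is returned in the Location header of the
    initiation response."""
    return headers["Location"]


def start_resumable_session(bucket_name: str, blob_name: str) -> str:
    """Sign a URL for the x-goog-resumable:start POST, then POST to it to
    obtain the session URI that the actual byte uploads target."""
    from google.cloud import storage
    import requests
    blob = storage.Client().bucket(bucket_name).blob(blob_name)
    url = blob.generate_signed_url(
        version="v4",
        expiration=datetime.timedelta(minutes=15),
        method="POST",
        headers={"x-goog-resumable": "start"},
    )
    resp = requests.post(url, headers={"x-goog-resumable": "start"})
    resp.raise_for_status()
    return session_uri_from_headers(resp.headers)
```

A useful property of this split is that only the signer needs credentials: the session URI can be handed to an untrusted client, which streams the bytes without ever seeing a key.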

XML is shown instead of the site

▼魔方 西西 submitted on 2021-01-01 05:07:28

Question: I am learning Google Cloud, so I tried to deploy my React app there using the Storage service. I created a bucket called travelfrontend and uploaded the files and folders from the build folder generated by yarn build. I do not have a domain, so I tried to access the site from https://storage.googleapis.com/travelfrontend but it shows me the following [XML bucket listing]. I could not make it live. Did I miss anything?

Answer 1: In order to serve the static content generated after running yarn…
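The bare bucket URL returns the bucket's XML listing; the app itself lives at the object path, e.g. .../travelfrontend/index.html. A sketch of both the direct object URL and the bucket website configuration (the main-page/404 settings take effect when the bucket is served through a custom domain rather than the storage.googleapis.com path), assuming the google-cloud-storage package; `public_object_url` is an illustrative helper:

```python
def public_object_url(bucket: str, object_path: str) -> str:
    """Direct URL to a single public object. Hitting the bare bucket URL
    instead returns the XML bucket listing, not the site."""
    return f"https://storage.googleapis.com/{bucket}/{object_path}"


def enable_website(bucket_name: str):
    """Configure index/404 pages for bucket-hosted static sites (relevant
    once the bucket is served via a custom domain)."""
    from google.cloud import storage  # assumes google-cloud-storage installed
    bucket = storage.Client().get_bucket(bucket_name)
    bucket.configure_website(main_page_suffix="index.html",
                             not_found_page="404.html")
    bucket.patch()  # persist the website configuration
```

So for the setup in the question, opening `public_object_url("travelfrontend", "index.html")` in the browser (with the objects publicly readable) should show the app instead of the XML.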