gsutil cannot copy to s3 due to authentication

Backend · Open · 2 answers · 1819 views
温柔的废话 asked 2021-02-14 09:45

I need to copy many (1000+) files from GCS to S3 so that I can leverage an AWS Lambda function. I have edited ~/.boto.cfg and commented out the two AWS authentication parameters.
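For context, the two AWS authentication parameters in the boto config file are the standard boto `[Credentials]` keys; a minimal sketch (the values below are placeholders, not real keys):

```ini
[Credentials]
aws_access_key_id = YOUR_AWS_ACCESS_KEY_ID
aws_secret_access_key = YOUR_AWS_SECRET_ACCESS_KEY
```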

2 Answers
  • 2021-02-14 10:14

    As per https://issuetracker.google.com/issues/62161892, gsutil v4.28 and later do support AWS v4 signatures, enabled by adding a new [s3] section to ~/.boto like:

    [s3]
    # Note that we specify the region as part of the host, as mentioned in the AWS docs:
    # http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
    # (substitute your bucket's own region here; "us-east-2" is just an example)
    host = s3.us-east-2.amazonaws.com
    use-sigv4 = True
    

    The use of that section is inherited from boto, but it is not currently created by gsutil config, so it needs to be added explicitly for the target endpoint.
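    With that [s3] section in place, a GCS-to-S3 copy can be run directly; the bucket names below are placeholders:

    ```shell
    # Parallel (-m), recursive (-r) copy from a GCS bucket to an S3 bucket.
    # gsutil picks up the S3 endpoint and AWS credentials from ~/.boto.
    gsutil -m cp -r gs://my-gcs-bucket/data s3://my-s3-bucket/data
    ```
    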

    For S3-to-GCS transfers, I would consider the more serverless Storage Transfer Service API instead.
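    As a hedged sketch of that alternative, recent gcloud versions expose Storage Transfer Service via `gcloud transfer`; the bucket names are placeholders, and AWS credentials must already be configured for the job:

    ```shell
    # Create a one-off Storage Transfer Service job copying S3 -> GCS.
    gcloud transfer jobs create s3://my-s3-bucket gs://my-gcs-bucket
    ```
    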

  • 2021-02-14 10:23

    I had a similar problem. Here is what I ended up doing on a GCE machine:

    Step 1: Using gsutil, I copied the files from GCS to the GCE machine's local disk.

    Step 2: Using the AWS CLI (aws s3 cp ...), I copied the files from the local disk to the S3 bucket.
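    The two steps above can be sketched as follows; the staging path and bucket names are placeholders:

    ```shell
    # Step 1: pull the files from GCS onto the GCE machine's local disk.
    gsutil -m cp -r gs://my-gcs-bucket/data /tmp/staging

    # Step 2: push them from the local disk to S3 with the AWS CLI.
    aws s3 cp /tmp/staging s3://my-s3-bucket/data --recursive
    ```
    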

    The above approach has worked reliably for me. I also tried gsutil rsync, but it failed unexpectedly.

    Hope this helps
