Question
I have a containerized project whose output files are written to the container's local filesystem (and are deleted when execution completes). The container runs on Fargate. I want to write a Python script that calls the model running on Fargate, gets the output file, and uploads it to an S3 bucket. I'm very new to AWS and Docker. Can someone share an example or some ideas on how to achieve this?
I think the answer by @jbleduigou makes things more complicated than they need to be. I can already use a command to copy the file from the container to my local machine; I just need a script that calls the model, copies the file out, and uploads it to S3. I understand the concept but couldn't find an example. Can anyone give me an example of how to achieve this?
Answer 1:
You need to run an S3 copy command via the AWS CLI, or its equivalent in the boto3 Python client.
$ aws s3 cp /localfolder/localfile.txt s3://mybucket
Or the equivalent in Python:
import boto3

client = boto3.client('s3')

# put_object expects the file's contents (bytes or a file-like object), not a path string
with open('/localfolder/localfile.txt', 'rb') as f:
    response = client.put_object(
        Body=f,
        Bucket='examplebucket',
        Key='localfile.txt'
    )
print(response)
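If the output file can be large, boto3's upload_file is a convenient alternative to put_object because it streams from disk and switches to multipart uploads automatically. A minimal sketch, assuming the model writes its result to /output/result.csv (the path, bucket, and key below are placeholders, not from the original answer):

import boto3

s3 = boto3.client('s3')

# Streams the file from disk and handles multipart upload for large files
s3.upload_file('/output/result.csv', 'examplebucket', 'results/result.csv')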
In order for your container to have the right to upload files to S3, you need to attach an IAM role with S3 write permissions to your task. In the task definition this is the Task Role; the Task Execution Role is what ECS itself uses to pull images and write logs.
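As a rough sketch of the IAM side (the role name, bucket, and policy below are illustrative assumptions, not part of the original answer), a role that ECS tasks can assume and that only allows uploads to one bucket could be created with boto3 like this; its ARN then goes into the taskRoleArn field of your task definition:

import json
import boto3

iam = boto3.client('iam')

# Trust policy so that ECS tasks can assume this role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ecs-tasks.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

iam.create_role(
    RoleName='my-fargate-task-role',  # placeholder name
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)

# Allow the task to write objects into the target bucket only
iam.put_role_policy(
    RoleName='my-fargate-task-role',
    PolicyName='allow-s3-upload',
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::examplebucket/*"
        }]
    })
)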
Answer 2:
Will your Python script be sitting outside of your Docker container? If so, you might want to mount an EFS volume to your container and copy the output files to that volume inside the container.
Then you can mount the same volume on the EC2 instance running the script, as sketched below.
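A sketch of how the EFS volume could be attached when registering the Fargate task definition with boto3 (the filesystem ID, image, role ARN, and names are placeholder assumptions); the container writes its output under /output, which is backed by the shared EFS volume:

import boto3

ecs = boto3.client('ecs')

# Register a Fargate task definition whose container writes output to an EFS-backed path
ecs.register_task_definition(
    family='model-task',
    requiresCompatibilities=['FARGATE'],
    networkMode='awsvpc',
    cpu='512',
    memory='1024',
    executionRoleArn='arn:aws:iam::123456789012:role/ecsTaskExecutionRole',  # placeholder ARN
    containerDefinitions=[{
        'name': 'model',
        'image': '123456789012.dkr.ecr.us-east-1.amazonaws.com/model:latest',  # placeholder image
        'mountPoints': [{
            'sourceVolume': 'shared-output',
            'containerPath': '/output'  # have the model write its files here instead of local disk
        }]
    }],
    volumes=[{
        'name': 'shared-output',
        'efsVolumeConfiguration': {
            'fileSystemId': 'fs-12345678'  # placeholder EFS filesystem ID
        }
    }]
)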
Source: https://stackoverflow.com/questions/61104141/how-to-upload-a-file-from-a-docker-container-that-runs-on-fargate-to-s3-bucket