How to upload a file from a Docker container running on Fargate to an S3 bucket?

Submitted by 旧城冷巷雨未停 on 2020-07-22 16:36:34

Question


I have a containerized project whose output files are written to the container's local filesystem (and are deleted when the execution completes). The container runs on Fargate. I want to write a Python script that can call the model running on Fargate, get the output file, and upload it to an S3 bucket. I'm very new to AWS and Docker. Can someone send me an example or share some ideas about how to achieve this?

I think the answer by @jbleduigou makes things complicated. I can already use a command to copy the file from the container to my local machine; I just need to write a script that calls the model, copies this file out, and uploads it to S3. I understand the concept but couldn't find an example. Can anyone give me an example of how to achieve this?


Answer 1:


You need to run an S3 copy command via the AWS CLI, or its equivalent in the boto3 Python client.

$ aws s3 cp /localfolder/localfile.txt s3://mybucket

Or equivalent in Python:

import boto3

client = boto3.client('s3')

# put_object expects the object's data (bytes or a file object),
# not a path string, so open the file first
with open('/localfolder/localfile.txt', 'rb') as f:
    response = client.put_object(
        Body=f,
        Bucket='examplebucket',
        Key='localfile.txt'
    )

print(response)

In order for your container to have the right to upload files to S3, you need to set up an ECS Task Role with S3 permissions and assign it to your task. (This is distinct from the Task Execution Role, which only covers things like pulling images and writing logs; the Task Role is what grants your application code access to AWS APIs.)
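As a sketch of what that Task Role might allow, here is a minimal IAM policy granting upload rights to a single bucket. The bucket name `examplebucket` is just the placeholder from the code above; substitute your own.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::examplebucket/*"
    }
  ]
}
```

Attach this policy to the role referenced by your task definition's `taskRoleArn`, and boto3 inside the container will pick up the credentials automatically.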




Answer 2:


Will your Python script be sitting outside of your Docker container? If so, you might want to mount an EFS volume to your container and copy the output files to that volume inside the container.

Then you can mount the same volume on the EC2 instance running the script.
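If you go that route, the script on the EC2 side could walk the shared mount and push everything to S3. This is a minimal sketch, not a definitive implementation: the mount point `/mnt/efs/out`, the bucket name, and the `model-outputs` prefix are all hypothetical placeholders, and it assumes boto3 is installed with valid AWS credentials available.

```python
import os

def s3_key_for(prefix: str, path: str, root: str) -> str:
    """Build the S3 object key for a file under root, preserving subdirectories."""
    rel = os.path.relpath(path, root).replace(os.sep, "/")
    return f"{prefix.rstrip('/')}/{rel}" if prefix else rel

def upload_outputs(root: str, bucket: str, prefix: str = "") -> None:
    """Upload every file found under root to the given bucket."""
    import boto3  # imported here so the key helper above works without boto3 installed
    client = boto3.client("s3")
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # upload_file handles multipart uploads for large model outputs
            client.upload_file(path, bucket, s3_key_for(prefix, path, root))

# Hypothetical usage on the EC2 instance that mounts the same EFS volume:
# upload_outputs("/mnt/efs/out", "examplebucket", "model-outputs")
```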



Source: https://stackoverflow.com/questions/61104141/how-to-upload-a-file-from-a-docker-container-that-runs-on-fargate-to-s3-bucket
