IO Error in boto3 download_file

Submitted by Anonymous (unverified) on 2019-12-03 01:34:02

Question:

Background

I am using boto3 to download files from S3. Here is the code:

for record in event['Records']:
    bucket = record['s3']['bucket']['name']
    key = record['s3']['object']['key']
    print(key)
    if key.find('/') < 0:
        if len(key) > 4 and key[-5:].lower() == '.json':  # File is uploaded outside any folder
            download_path = '/tmp/{}{}'.format(uuid.uuid4(), key)
    else:
        download_path = '/tmp/{}/{}'.format(uuid.uuid4(), key)  # File is uploaded inside a folder

When a new file is uploaded to the S3 bucket, this code is triggered and the newly uploaded file is downloaded.
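For reference, the S3 event notification that triggers the function carries the bucket name and object key in each record. The sketch below is abridged to the fields read by the handler, and the bucket name is hypothetical; note that for a file uploaded inside a folder, the key keeps the folder prefix:

# Abridged, illustrative S3 put-event; only the fields read by the handler are shown.
event = {
    'Records': [
        {
            's3': {
                'bucket': {'name': 'my-bucket'},           # hypothetical bucket name
                'object': {'key': 'test1/customer1.json'}  # key keeps the folder prefix
            }
        }
    ]
}

for record in event['Records']:
    bucket = record['s3']['bucket']['name']   # 'my-bucket'
    key = record['s3']['object']['key']       # 'test1/customer1.json'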

This code works fine when the file is uploaded outside any folder.

However, when I upload a file inside a directory, an IO error occurs. Here is a dump of the IO error I am encountering:

[Errno 2] No such file or directory: /tmp/316bbe85-fa21-463b-b965-9c12b0327f5d/test1/customer1.json.586ea9b8: IOError

test1 is the directory inside my S3 bucket where customer1.json is uploaded.

Query

Any thoughts on how to resolve this error?

Answer 1:

The error is raised because you attempted to download and save the file into a directory that does not exist. Use os.mkdir to create the directory before downloading the file.

# ...
else:
    item_uuid = str(uuid.uuid4())
    os.mkdir('/tmp/{}'.format(item_uuid))
    download_path = '/tmp/{}/{}'.format(item_uuid, key)  # File is uploaded inside a folder

Note: it's better to use os.path.join() when building filesystem paths, so the code above could be rewritten as:

# ...
else:
    item_uuid = str(uuid.uuid4())
    os.mkdir(os.path.join('/tmp', item_uuid))
    download_path = os.path.join('/tmp', item_uuid, key)

The error may also be raised because you are including '/tmp/' in the path you use for the S3 object itself; do not include a tmp folder in the object key, as it most likely does not exist in the bucket.
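For keys that contain a folder prefix (such as test1/customer1.json), a single os.mkdir is not enough, because the key adds further directory levels under /tmp/<uuid>. Below is a minimal sketch of that case, not part of the original answer; it assumes a boto3 client named s3_client (as in the other answer), the bucket and key variables from the question, and Python 3 for the exist_ok argument:

import os
import uuid

# Sketch only: create every intermediate directory implied by the key
# before calling download_file (exist_ok requires Python 3).
item_uuid = str(uuid.uuid4())
download_path = os.path.join('/tmp', item_uuid, key)
os.makedirs(os.path.dirname(download_path), exist_ok=True)
s3_client.download_file(bucket, key, download_path)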



Answer 2:

Thanks for helping, Andriy Ivaneyko. I found a solution using boto3.

Using the following code, I am able to accomplish my task.

for record in event['Records']:
    bucket = record['s3']['bucket']['name']
    key = record['s3']['object']['key']
    fn = '/tmp/xyz'
    fp = open(fn, 'wb')  # Body.read() returns bytes, so open the file in binary mode
    response = s3_client.get_object(Bucket=bucket, Key=key)
    contents = response['Body'].read()
    fp.write(contents)
    fp.close()
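A slightly more memory-friendly variant of the same idea, as a sketch using the same s3_client, bucket, and key, and the same illustrative /tmp/xyz path, streams the object straight to disk with download_fileobj instead of reading the whole body into memory:

# Sketch: stream the object directly into the open file handle.
with open('/tmp/xyz', 'wb') as fp:
    s3_client.download_fileobj(bucket, key, fp)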

