Put json data pipeline definition using Boto3

Submitted by 可爱侵袭症 on 2019-12-24 08:02:15

Question


I have a data pipeline definition in json format, and I would like to 'put' that using Boto3 in Python.

I know you can do this via the AWS CLI using put-pipeline-definition, but Boto3 (and the AWS API) use a different format, splitting the definition into pipelineObjects, parameterObjects and parameterValues.

Do I need to write code to translate from a json definition to that expected by the API/Boto? If so, is there a library that does this?


Answer 1:


The AWS CLI has code that does this translation, so I can borrow that!
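For reference, that translation appears to live in an internal awscli module (awscli.customizations.datapipeline.translator). A minimal sketch of borrowing it follows — note this is an unsupported internal module, so the import path and function name are assumptions that may change between awscli versions:

```python
# Sketch: reuse the AWS CLI's internal Data Pipeline translator.
# This is unsupported internal API, so guard the import.
try:
    from awscli.customizations.datapipeline import translator
except ImportError:
    translator = None  # awscli is not installed

# An exported-style definition with one plain field and one reference.
definition = {
    "objects": [
        {"id": "Default", "name": "Default",
         "scheduleType": "cron",
         "schedule": {"ref": "DefaultSchedule"}}
    ]
}

if translator is not None:
    # definition_to_api_objects returns the pipelineObjects list
    # that boto3's put_pipeline_definition expects.
    pipeline_objects = translator.definition_to_api_objects(definition)
    print(pipeline_objects)
```

The same module also exposes helpers for the parameter sections of the API call, so you do not have to hand-roll the split into pipelineObjects, parameterObjects, and parameterValues yourself.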




Answer 2:


You can convert from Data Pipeline's exported JSON format to the pipelineObjects format expected by boto3 with a Python function of the following form.

def convert_to_pipeline_objects(pipeline_definition_dict):
    """Translate an exported Data Pipeline definition into the
    pipelineObjects list expected by boto3's put_pipeline_definition."""
    objects_list = []
    for def_object in pipeline_definition_dict['objects']:
        new_object = {
            'id': def_object['id'],
            'name': def_object['name'],
            'fields': []
        }
        for key, value in def_object.items():
            if key in ('id', 'name'):
                continue
            if isinstance(value, dict):
                # Reference fields ({"ref": "..."}) become refValue entries.
                new_object['fields'].append(
                    {'key': key, 'refValue': value['ref']}
                )
            else:
                # Everything else is passed through as a string value.
                new_object['fields'].append(
                    {'key': key, 'stringValue': value}
                )
        objects_list.append(new_object)
    return objects_list
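Putting it together, here is a self-contained sketch that repeats the converter, runs it on a tiny exported-style definition, and shows where the boto3 call would go. The pipeline ID below is a placeholder, not a real value:

```python
def convert_to_pipeline_objects(pipeline_definition_dict):
    """Same converter as above, repeated so this sketch is self-contained."""
    objects_list = []
    for def_object in pipeline_definition_dict['objects']:
        new_object = {'id': def_object['id'],
                      'name': def_object['name'],
                      'fields': []}
        for key, value in def_object.items():
            if key in ('id', 'name'):
                continue
            if isinstance(value, dict):
                new_object['fields'].append({'key': key, 'refValue': value['ref']})
            else:
                new_object['fields'].append({'key': key, 'stringValue': value})
        objects_list.append(new_object)
    return objects_list

# A tiny exported-style definition with one plain field and one reference.
definition = {
    "objects": [
        {"id": "Default", "name": "Default",
         "scheduleType": "cron",
         "schedule": {"ref": "DefaultSchedule"}}
    ]
}

pipeline_objects = convert_to_pipeline_objects(definition)
print(pipeline_objects)

# With real credentials and an existing pipeline, you would then call
# (the pipeline ID here is a placeholder):
# import boto3
# client = boto3.client('datapipeline')
# client.put_pipeline_definition(
#     pipelineId='df-EXAMPLE',
#     pipelineObjects=pipeline_objects,
# )
```

Note that a full translation would also populate parameterObjects and parameterValues from the "parameters" and "values" sections of the exported definition, if your pipeline uses them.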


Source: https://stackoverflow.com/questions/38582478/put-json-data-pipeline-definition-using-boto3
