Reading the data written to S3 by an Amazon Kinesis Firehose stream

感情败类 2021-02-18 15:17

I am writing records to a Kinesis Firehose stream that are eventually written to an S3 file by Amazon Kinesis Firehose.

My record object looks like

ItemPurcha         
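
The record snippet above is cut off, but for context, here is a minimal sketch of how a JSON record might be put on the stream with boto3. The stream name ItemPurchaseStream and the record fields are assumptions for illustration, not taken from the question:

    import json
    import boto3

    # Hypothetical stream name and record fields, for illustration only
    firehose = boto3.client('firehose')

    record = {'itemId': 'abc-123', 'price': 9.99}

    # Firehose concatenates records exactly as delivered; without an
    # explicit trailing newline, consecutive JSON objects run together
    # in the resulting S3 object
    firehose.put_record(
        DeliveryStreamName='ItemPurchaseStream',
        Record={'Data': json.dumps(record).encode('utf-8')}
    )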


        
9 Answers
自闭症患者 2021-02-18 15:39

    It boggles my mind that Amazon Firehose dumps JSON messages to S3 in this manner, and doesn't allow you to set a delimiter or anything.

    Ultimately, the trick I found to deal with the problem was to process the text file using the JSON raw_decode method.

    This will allow you to read a bunch of concatenated JSON records without any delimiters between them.

    Python code:

    import json

    decoder = json.JSONDecoder()

    with open('giant_kinesis_s3_text_file_with_concatenated_json_blobs.txt', 'r') as content_file:
        content = content_file.read()

    content_length = len(content)
    decode_index = 0

    while decode_index < content_length:
        try:
            # raw_decode parses one JSON value and returns it together
            # with the index just past it, so we can walk the whole file
            obj, decode_index = decoder.raw_decode(content, decode_index)
            print("File index:", decode_index)
            print(obj)
        except json.JSONDecodeError as e:
            print("JSONDecodeError:", e)
            # Scan forward and keep trying to decode
            decode_index += 1
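    In case it helps, the same decode loop can be run on an object fetched straight from S3 with boto3. A sketch, with placeholder bucket and key names:

    import json
    import boto3

    s3 = boto3.client('s3')

    # Placeholder bucket/key; substitute the object Firehose delivered
    resp = s3.get_object(Bucket='my-firehose-bucket',
                         Key='path/to/firehose-output-object')
    content = resp['Body'].read().decode('utf-8')

    decoder = json.JSONDecoder()
    decode_index = 0
    while decode_index < len(content):
        try:
            obj, decode_index = decoder.raw_decode(content, decode_index)
            print(obj)
        except json.JSONDecodeError:
            # Skip stray bytes (e.g. newlines) between records
            decode_index += 1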
