Reading the data written to S3 by an Amazon Kinesis Firehose stream

感情败类 2021-02-18 15:17

I am writing records to a Kinesis Firehose stream that are eventually written to an S3 file by Amazon Kinesis Firehose.

My record object looks like

ItemPurcha

9 Answers
  • 2021-02-18 15:38

    Use this simple Python code. (One caveat: the "}{" replacement assumes no string value in your records contains a literal "}{".)

    import json

    input_str = '''{"personId":"p-111","itemId":"i-111"}{"personId":"p-222","itemId":"i-222"}{"personId":"p-333","itemId":"i-333"}'''

    # Turn the concatenated objects into a JSON array: "}{" -> "},{"
    data_str = "[{}]".format(input_str.replace("}{", "},{"))
    data_json = json.loads(data_str)
    

    And then (if you want) convert it to a pandas DataFrame.

    import pandas as pd

    df = pd.DataFrame.from_records(data_json)
    print(df)
    

    And this is the result:

      itemId personId
    0  i-111    p-111
    1  i-222    p-222
    2  i-333    p-333
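
    If the file is still sitting in S3, the same transform can be applied straight to the object body. Here is a minimal sketch using boto3; the bucket name and object key are hypothetical placeholders:

    import json
    import boto3

    s3 = boto3.client('s3')

    # Hypothetical bucket/key where Firehose delivered the records
    obj = s3.get_object(Bucket='my-firehose-bucket', Key='2021/02/18/15/records')
    input_str = obj['Body'].read().decode('utf-8')

    data_json = json.loads("[{}]".format(input_str.replace("}{", "},{")))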
    
  • 2021-02-18 15:39

    It boggles my mind that Amazon Firehose dumps JSON messages to S3 in this manner, and doesn't allow you to set a delimiter or anything.

    Ultimately, the trick I found to deal with the problem was to process the text file using the JSON raw_decode method.

    This will allow you to read a bunch of concatenated JSON records without any delimiters between them.

    Python code:

    import json

    decoder = json.JSONDecoder()

    with open('giant_kinesis_s3_text_file_with_concatenated_json_blobs.txt', 'r') as content_file:
        content = content_file.read()

    content_length = len(content)
    decode_index = 0

    while decode_index < content_length:
        try:
            # raw_decode returns the parsed object and the index just past it
            obj, decode_index = decoder.raw_decode(content, decode_index)
            print("File index:", decode_index)
            print(obj)
        except json.JSONDecodeError as e:
            print("JSONDecodeError:", e)
            # Scan forward and keep trying to decode
            decode_index += 1
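
    One advantage of this approach over splitting on "}{" is that raw_decode respects JSON string semantics, so a literal "}{" occurring inside a string value does not break the parse.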
    
  • 2021-02-18 15:40

    I've had the same issue.

    It would have been better if AWS allowed us to set a delimiter, but we can do it on our own.

    In my use case, I've been listening to a stream of tweets, and as soon as a new tweet arrives I immediately put it on the Firehose stream.

    This, of course, resulted in a single-line file that could not be parsed.

    So, to solve this, I appended a \n to each tweet's JSON before putting it on the stream. This, in turn, let me use packages that read stream contents line by line and parse the file easily; a minimal sketch follows.
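
    For illustration, a minimal sketch of both sides, assuming boto3; the stream name, file name, and the send_tweet helper are hypothetical:

    import json
    import boto3

    firehose = boto3.client('firehose')

    def send_tweet(tweet: dict) -> None:
        # Hypothetical helper: append a newline so each record lands on
        # its own line in the delivered S3 object.
        firehose.put_record(
            DeliveryStreamName='my-tweet-stream',  # hypothetical stream name
            Record={'Data': (json.dumps(tweet) + '\n').encode('utf-8')},
        )

    # Consumer side: the delivered file is now newline-delimited JSON.
    with open('firehose_output.txt') as f:  # hypothetical local copy of the object
        tweets = [json.loads(line) for line in f if line.strip()]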

    Hope this helps you.
