Python: Efficient way to split list of strings into smaller chunks by concatenated size

Submitted by ぃ、小莉子 on 2019-12-12 10:07:37

Question


I am communicating with the Google API via batch requests through its google-api-python-client. Batch requests have the following limitations:

  • A batch request cannot contain more than 1000 requests,
  • A batch request cannot have a payload larger than 1 MB.

I have a random number of random-length strings in a list, from which I need to construct batch requests while keeping the aforementioned limitations in mind.

Does anyone know a good way to efficiently build chunks of that original list that can be submitted to the Google API? By 'efficiently' I mean not iterating through all the elements one by one while counting the payload size.

So far, this is what I had in mind: take at most 1000 of the items, build the request, and check the payload size. If it's bigger than 1 MB, take 500 and check the size again. If the payload is still bigger, take the first 250 items; if it's smaller, take 750 items. And so on, you get the logic. This way, one could get the right number of elements with fewer iterations than building the payload and checking it after each addition.

I really don't want to reinvent the wheel, so if anyone knows an efficient builtin/module for that, please let me know.
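For comparison, there is no builtin that does this directly, but the straightforward one-pass approach the question wants to avoid is a greedy accumulator. A minimal sketch (the limits are passed as parameters; the real 1000-request and 1 MB values are assumptions from the constraints above, and per-request HTTP overhead is ignored):

```python
def chunk_by_size(items, max_items=1000, max_bytes=1_000_000):
    """Greedily group strings so each chunk respects both limits."""
    chunks, current, current_size = [], [], 0
    for item in items:
        item_size = len(item)
        # Start a new chunk if adding this item would break either limit.
        if current and (len(current) >= max_items
                        or current_size + item_size > max_bytes):
            chunks.append(current)
            current, current_size = [], 0
        current.append(item)
        current_size += item_size
    if current:
        chunks.append(current)
    return chunks
```

This touches every element exactly once, which is the baseline the binary-search idea below tries to beat when measuring the payload is the expensive step.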

The body payload size can be calculated by calling _serialize_request, when you've added the right amount of requests to the instantiated BatchHttpRequest.

See also the Python API Client Library documentation on making batch requests.


Answer 1:


Okay, it seems I created something that solves this issue. Here's a draft of the idea in Python:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import random
import string

MAX_LENGTH = 20    # max number of items per chunk (stand-in for the 1000-request limit)
MAX_SIZE = 11111   # max concatenated size per chunk (stand-in for the 1 MB limit)


def get_random():
    """Return a random ASCII string of 10-999 characters."""
    return ''.join(
        random.choice(string.ascii_letters)
        for _ in range(random.randrange(10, 1000)))


def get_random_list():
    """Return a list of 50-999 random strings."""
    return [get_random() for _ in range(random.randrange(50, 1000))]


def get_str_length(rnd_list, item_idx):
    """Size of the first item_idx items when concatenated."""
    return len(''.join(rnd_list[:item_idx]))


def calculate_ideal_amount(rnd_list):
    """Binary-search the largest prefix that fits both limits."""
    list_bounds = {
        'first': 1,
        'last': len(rnd_list)
    }
    print('ORIG_SIZE: %s, ORIG_LEN: %s' % (
        get_str_length(rnd_list, len(rnd_list)), len(rnd_list)))
    # Even a single item is too big: nothing fits.
    if get_str_length(rnd_list, list_bounds['first']) > MAX_SIZE:
        return 0
    # The whole list already fits both limits.
    if get_str_length(rnd_list, list_bounds['last']) <= MAX_SIZE and \
            list_bounds['last'] <= MAX_LENGTH:
        return list_bounds['last']
    while True:
        difference = round((list_bounds['last'] - list_bounds['first']) / 2)
        middle_item_idx = list_bounds['first'] + difference
        str_len = get_str_length(rnd_list, middle_item_idx)
        print(
            'MAX_SIZE: %s, list_bounds: %s, '
            'middle_item_idx: %s, diff: %s, str_len: %s,' % (
                MAX_SIZE, list_bounds, middle_item_idx, difference, str_len))
        if str_len > MAX_SIZE:
            # Too big: the answer is in the lower half.
            list_bounds['last'] = middle_item_idx
            continue
        if middle_item_idx > MAX_LENGTH:
            # Size fits, but the item-count limit caps the result.
            return MAX_LENGTH
        # Fits: the answer is in the upper half.
        list_bounds['first'] = middle_item_idx
        if difference == 0:
            # Bounds have converged; pick the larger index that still fits.
            if get_str_length(rnd_list, list_bounds['last']) <= MAX_SIZE:
                if list_bounds['last'] > MAX_LENGTH:
                    return MAX_LENGTH
                return list_bounds['last']
            return list_bounds['first']


rnd_list = get_random_list()
ideal_idx = calculate_ideal_amount(rnd_list)

print(
    len(rnd_list), get_str_length(rnd_list, len(rnd_list)),
    get_str_length(rnd_list, ideal_idx), ideal_idx,
    get_str_length(rnd_list, ideal_idx + 1))

This code does exactly what I tried to describe: it finds and adjusts the bounds of the list while measuring the concatenated size of the prefix, and then returns the index at which the list should be sliced to get the largest chunk that still fits. This avoids the CPU overhead of building and measuring the payload one element at a time. Running the code prints the iterations it performs on the list.

The get_str_length function, the lists and the other helpers can be replaced with the corresponding functionality in the API client, but this is the rough idea behind it.

However, the code is not foolproof; the solution should be something along these lines.
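As an aside, if the prefix size can be modeled as a plain sum of per-item sizes (an assumption; real batch payloads also carry per-request overhead, which this ignores), the hand-rolled binary search can lean on the standard library: precompute cumulative sizes with itertools.accumulate once, then locate the split point with bisect in O(log n). A sketch:

```python
from bisect import bisect_right
from itertools import accumulate


def split_index(items, max_size, max_length):
    """Largest index i such that len(''.join(items[:i])) <= max_size
    and i <= max_length, via an O(n) prefix sum and an O(log n) bisect."""
    # prefix[k] is the concatenated size of items[:k+1]
    prefix = list(accumulate(len(s) for s in items))
    # Number of prefixes whose total size stays within the budget.
    i = bisect_right(prefix, max_size)
    return min(i, max_length)
```

The one-time O(n) pass replaces the repeated ''.join(...) measurements, so each subsequent chunk boundary costs only a bisect.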



Source: https://stackoverflow.com/questions/30979036/python-efficient-way-to-split-list-of-strings-into-smaller-chunks-by-concatenat
