MongoDB optimize multiple find_one + insert inside loop

Submitted by 时光毁灭记忆、已成空白 on 2019-12-11 15:14:29

Question


I'm using MongoDB 4.0.1 and PyMongo with Python 3.5. Every 30 to 60 seconds I have to loop over 12,000 items and add new data into MongoDB. For this example we will talk about User, Pet and Car. A User can have one Car and one Pet.

I need the pet ObjectId and the car ObjectId to create my User, so I have to add them one by one inside the loop, and this is very slow: it takes ~25 seconds to look up the existing documents and insert the ones that don't exist.

user_data = []

while dictionary != False:
    # Create pet if it does not exist
    existing_pet = pet.find_one({"code": dictionary['pet_code']})

    if existing_pet is not None:
        pet_id = existing_pet['_id']
    else:
        pet_id = pet.insert_one({
            "code" : dictionary['pet_code'],
            "name" : dictionary['name']
        }).inserted_id
        # Call web service to create the pet remotely

    # Create car if not exist
    existing_car = car.find_one({"platenumber": dictionary['platenumber']})

    if existing_car is not None:
        car_id = existing_car['_id']
    else:
        car_id = car.insert_one({
            "platenumber" : dictionary['platenumber'],
            "model" : dictionary['model'],
            "energy" : 'electric'
        }).inserted_id
        # Call web service to create the car remotely

    # Create user if not exist
    existing_user = user.find_one({
        "user_code": dictionary['user_code'],
        "car": car_id,
        "pet": pet_id
    })

    if existing_user is None:
        user_data.append({
            "user_code" : dictionary['user_code'],
            "pet" : pet_id,
            "car" : car_id,
            "firstname" : dictionary['firstname'],
            "lastname" : dictionary['lastname']
        })
        # Call web service to create the user remotely

# Bulk insert the new users
if user_data:
    user.insert_many(user_data)

I created indexes on each field used in the find_one calls:

db.user.createIndex( { user_code: 1 } )
db.user.createIndex( { pet: 1 } )
db.user.createIndex( { car: 1 } )
db.pet.createIndex( { code: 1 }, { unique: true }  )
db.car.createIndex( { platenumber: 1 }, { unique: true }  )

Is there a way to speed up this loop? Is there something with aggregation, or anything else, that could help me? Or maybe another way to do what I want?

I'm open to all advice.


Answer 1:


Don't do 12,000 find_one queries; do one query that fetches everything that already exists with the $in operator. The code would be something like:

pet_codes = []
pet_names = []
while dictionary != False:
    pet_codes.append(dictionary['pet_code'])
    pet_names.append(dictionary['name'])

pets = dict()
for doc in pet.find({"code": {"$in": pet_codes}}):
    pets[doc['code']] = doc

new_pets = []
for code, name in zip(pet_codes, pet_names):
    if code not in pets:
        new_pets.append({'code': code, 'name': name})

if new_pets:
    pet.insert_many(new_pets)
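One detail worth noting: the later user inserts still need a code-to-_id map for pets (and likewise for cars). The split of new versus existing records is a pure step that can be factored out, and PyMongo's insert_many returns inserted_ids in the same order as the input documents, so the map can be completed afterwards. A minimal sketch (the helper name is mine, not from the answer):

```python
def split_new_pets(pet_records, existing_by_code):
    """Return only the records whose code is not already in the database."""
    return [r for r in pet_records if r["code"] not in existing_by_code]

# With a live collection, the full code -> _id map is then:
#   new_pets = split_new_pets(records, pets)
#   result = pet.insert_many(new_pets)
#   pet_ids = {d["code"]: d["_id"] for d in pets.values()}
#   pet_ids.update(zip((p["code"] for p in new_pets), result.inserted_ids))
```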

As you already have a unique index on the pet code, we can do better: just try to insert them all with ordered=False (see the docs). Inserts that collide with an existing code will fail, but the rest will succeed; insert_many then raises a BulkWriteError, which you can catch and ignore:

from pymongo.errors import BulkWriteError

new_pets = []
while dictionary != False:
    new_pets.append({
        "code" : dictionary['pet_code'],
        "name" : dictionary['name']
    })

try:
    pet.insert_many(new_pets, ordered=False)
except BulkWriteError:
    pass  # duplicate-key errors are expected for pets that already exist

In the case where you do not have a unique constraint set, another option is to batch the operations.
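One way to batch without relying on a unique index is bulk_write with upserts: each UpdateOne with upsert=True either matches an existing document or inserts a new one, and $setOnInsert only writes the fields when the upsert actually inserts. A sketch using the car fields from the question (building the operation documents is shown as a plain function; the bulk_write call itself needs a live server, so it is only commented):

```python
def build_car_upserts(items):
    """Build (filter, update) pairs that upsert cars by platenumber.

    $setOnInsert applies only when no document matches the filter,
    so existing cars are left untouched.
    """
    ops = []
    for item in items:
        ops.append((
            {"platenumber": item["platenumber"]},
            {"$setOnInsert": {
                "platenumber": item["platenumber"],
                "model": item["model"],
                "energy": "electric",
            }},
        ))
    return ops

# With a live connection this becomes a single round trip:
#   from pymongo import UpdateOne
#   requests = [UpdateOne(f, u, upsert=True) for f, u in build_car_upserts(items)]
#   car.bulk_write(requests, ordered=False)
```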



来源:https://stackoverflow.com/questions/52078244/mongodb-optimize-multiple-find-one-insert-inside-loop
