MySQL “IN” operator performance on (large?) number of values

囚心锁ツ 2020-11-28 03:26

I have been experimenting with Redis and MongoDB lately and it would seem that there are often cases where you would store an array of id's in either Mongo

6 Answers
  •  有刺的猬
    2020-11-28 04:09

    Generally speaking, if the IN list gets too large (for some ill-defined value of 'too large' that is usually in the region of 100 or smaller), it becomes more efficient to use a join, creating a temporary table if need be to hold the numbers (see the sketch below).
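
    A minimal sketch of the temporary-table approach, assuming MySQL and an existing table named big_table with an indexed id column (both names are illustrative, not taken from the question):

    CREATE TEMPORARY TABLE wanted_ids (id INT NOT NULL PRIMARY KEY);

    -- fill it with the ids you would otherwise list inside IN (...)
    INSERT INTO wanted_ids (id) VALUES (300), (301), (302);  -- ...and so on

    -- join against it instead of writing a long IN (...) list
    SELECT t.*
      FROM big_table AS t
      JOIN wanted_ids AS w ON w.id = t.id;

    The optimizer can then treat the lookup as an ordinary indexed join rather than evaluating a long literal list.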

    If the numbers are a dense set (no gaps - which the sample data suggests), then you can do even better with WHERE id BETWEEN 300 AND 3000.

    However, presumably there are gaps in the set, at which point it may be better to go with the list of valid values after all. If the gaps are relatively few in number, though, you could use something like:

    WHERE id BETWEEN 300 AND 3000 AND id NOT BETWEEN 742 AND 836
    

    Or whatever the gaps are.
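
    For completeness, that range predicate drops into an ordinary query as follows (again using the illustrative big_table name):

    SELECT t.*
      FROM big_table AS t
     WHERE t.id BETWEEN 300 AND 3000
       AND t.id NOT BETWEEN 742 AND 836;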
