Choice of the most performant container (array)

野性不改 2021-02-08 02:09

This is my little big question about containers, in particular, arrays.

I am writing a physics code that mainly manipulates a big (> 1,000,000) set of "particles" (wit

5 Answers
  •  梦谈多话
    2021-02-08 02:42

    First of all, you don't want to scatter the coordinates of one given particle all over the place, so I would begin by writing a simple struct:

    struct Particle { /* coords */ };
    

    Then we can make a simple one-dimensional array of these Particles.
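
    A concrete sketch of that struct and the one-dimensional array, assuming the coordinates are three doubles named x, y, z (the original only writes /* coords */, so the field names and types here are an assumption):

    ```cpp
    #include <cassert>
    #include <vector>

    // Assumed layout: three double coordinates per particle.
    struct Particle {
        double x, y, z;
    };

    int main() {
        // One contiguous allocation holding all particles.
        std::vector<Particle> particles(1000000);
        particles[0] = {1.0, 2.0, 3.0};

        // Homogeneous doubles incur no padding on typical platforms.
        assert(sizeof(Particle) == 3 * sizeof(double));
        assert(particles.size() == 1000000);
        return 0;
    }
    ```

    Keeping the three coordinates together in one struct means each particle occupies one small, cache-friendly block instead of being scattered across three separate arrays.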

    I would probably use a deque, because that's the default container, but you may wish to try a vector: 1,000,000 particles means a single contiguous chunk of a few MB. That should hold, but it might strain your system if the set ever grows, while the deque will allocate several smaller chunks.
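
    To put the size argument in numbers, here is a small sketch assuming a particle of three doubles (24 bytes on typical platforms; this layout is my assumption, not the poster's). The vector needs one contiguous allocation of roughly 24 MB for 1,000,000 particles, while the deque spreads the same payload over several fixed-size chunks:

    ```cpp
    #include <cassert>
    #include <cstddef>
    #include <cstdio>
    #include <deque>
    #include <vector>

    struct Particle { double x, y, z; };  // assumed 3-double layout

    // Total payload in bytes for n particles, regardless of container.
    std::size_t payload_bytes(std::size_t n) {
        return n * sizeof(Particle);
    }

    int main() {
        // vector: one contiguous block of ~24 MB.
        std::vector<Particle> v(1000000);
        // deque: same logical sequence, stored as several chunks.
        std::deque<Particle> d(1000000);

        std::printf("per-particle: %zu bytes, total payload: %zu bytes\n",
                    sizeof(Particle), payload_bytes(v.size()));
        assert(payload_bytes(1000000) == 24000000u);
        return 0;
    }
    ```

    The payload is identical either way; the difference is that the vector must find (and, on growth, re-find) one large contiguous region, while the deque only ever asks the allocator for chunk-sized pieces.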

    WARNING:

    As Alexandre C remarked, if you go the deque route, refrain from using operator[] and prefer iteration. If you really need random access and it is performance-sensitive, the vector should prove faster.
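
    A minimal sketch of the iteration style the warning recommends: traversing the deque sequentially (here with a range-for, which uses iterators under the hood) instead of indexing each element through operator[], which on a deque must locate the right chunk on every access (the Particle layout and the sum_x helper are my assumptions for illustration):

    ```cpp
    #include <cassert>
    #include <deque>

    struct Particle { double x, y, z; };

    // Sequential traversal: the preferred access pattern on a deque.
    double sum_x(const std::deque<Particle>& particles) {
        double sum = 0.0;
        for (const Particle& p : particles) {
            sum += p.x;
        }
        return sum;
    }

    int main() {
        // Value-initialized particles: all coordinates start at zero.
        std::deque<Particle> particles(1000);
        assert(sum_x(particles) == 0.0);
        return 0;
    }
    ```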
