Question
I have a file that contains millions of rows of data. Each row has a unique id, and the ids are often not in order and can contain holes: 1, 2, 10, 6, 3, 18, for example.
I want to be able to quickly access the rows by ID, so I'm thinking that storing them in a HashMap could be a viable solution, but this feels like overkill when they could be stored in a Vec.
Is storing them in a Vec a good solution when the holes in the series can get pretty large (1, 2, 3, 1000000, 1000001, and so on)? I will be discarding a lot of rows. Should I use some kind of HashMap?
Answer 1:
HashMap will definitely work well. Depending on the data, a sparse Vec might work even better, might work poorly, or might fail entirely. The safest and simplest option is to use the HashMap and revisit the question if you discover you need to optimise this specific function (which you probably won't).
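To illustrate the trade-off the answer describes, here is a minimal Rust sketch comparing the two layouts. The Row struct and its payload field are hypothetical stand-ins for whatever each line of the file holds; the point is that a HashMap's memory scales with the number of rows kept, while a sparse Vec indexed by id must be as long as the largest id, so a big hole wastes a slot for every missing id.

```rust
use std::collections::HashMap;

// Hypothetical row type standing in for whatever each line of the file holds.
#[derive(Debug, Clone)]
struct Row {
    id: u64,
    payload: String,
}

fn main() {
    // Ids with a large hole, as in the question.
    let rows = vec![
        Row { id: 1, payload: "a".into() },
        Row { id: 2, payload: "b".into() },
        Row { id: 3, payload: "c".into() },
        Row { id: 1_000_000, payload: "d".into() },
    ];

    // HashMap: memory is proportional to the number of rows kept,
    // regardless of how sparse the id space is.
    let by_id: HashMap<u64, Row> = rows.iter().cloned().map(|r| (r.id, r)).collect();
    assert_eq!(by_id.get(&1_000_000).map(|r| r.payload.as_str()), Some("d"));

    // Sparse Vec: index == id gives fast lookups, but the Vec must reach the
    // largest id, so a hole up to 1_000_000 allocates ~1M slots even though
    // only a handful are occupied.
    let max_id = rows.iter().map(|r| r.id).max().unwrap_or(0) as usize;
    let mut sparse: Vec<Option<Row>> = vec![None; max_id + 1];
    for r in &rows {
        sparse[r.id as usize] = Some(r.clone());
    }
    assert!(sparse[1_000_000].is_some());
    assert!(sparse[500_000].is_none()); // one of roughly a million empty slots
}
```

If the ids were dense (few holes), the Vec version would be both faster and smaller; with holes in the millions, the HashMap is the safer default, as the answer suggests.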
Source: https://stackoverflow.com/questions/59948389/should-i-store-unordered-values-from-a-series-with-large-holes-in-a-sparse-vec-o