What's the best way to unit test large data sets? Some legacy code that I'm maintaining has structures of a hundred members or more; other parts of the code that we're worki
This is still a viable approach, although I would classify it as a functional test rather than a pure unit test. A good unit test would take a sampling of those records that gives a good distribution of the edge cases you may encounter, and write those up. Then you have your final "acceptance" or "functional" test: the bulk test on all the data.
I have used this approach when testing large amounts of data, and I find it works well: the small unit tests stay maintainable, I know the bulk test covers everything, and it's all automatic.
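As a rough sketch of the split, assuming Python and `unittest` (the `process_record` function, its fields, and the edge cases are hypothetical stand-ins for your real record-handling code):

```python
import unittest

def process_record(record):
    # Hypothetical unit under test: clamps a record's amount to be non-negative.
    amount = record.get("amount", 0)
    return {"id": record["id"], "amount": max(amount, 0)}

# Hand-picked sample covering the edge cases: zero, negative, and missing field.
EDGE_CASES = [
    {"id": 1, "amount": 0},
    {"id": 2, "amount": -5},
    {"id": 3},
]

class TestProcessRecordUnits(unittest.TestCase):
    """Small, maintainable unit tests over a sampled distribution of records."""

    def test_zero_amount_preserved(self):
        self.assertEqual(process_record(EDGE_CASES[0])["amount"], 0)

    def test_negative_amount_clamped(self):
        self.assertEqual(process_record(EDGE_CASES[1])["amount"], 0)

    def test_missing_amount_defaults_to_zero(self):
        self.assertEqual(process_record(EDGE_CASES[2])["amount"], 0)

class TestProcessRecordBulk(unittest.TestCase):
    """The 'acceptance'/functional pass over the full data set."""

    def test_all_records(self):
        # Stand-in for loading the real bulk data.
        all_records = [{"id": i, "amount": i - 50} for i in range(100)]
        for record in all_records:
            self.assertGreaterEqual(process_record(record)["amount"], 0)

if __name__ == "__main__":
    unittest.main()
```

The point is the structure, not the specifics: the first class pins down each edge case individually, while the second runs the whole data set and only tells you that something, somewhere, broke.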