What's the best way to unit test large data sets? Some legacy code that I'm maintaining has structures of a hundred members or more; other parts of the code that we're working on…
The best approach I've found so far is to deserialize the structures or data sets from disk, perform the operations under test, serialize the results back to disk, then diff the files containing the serialized results against files containing the expected results.
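As a rough illustration of that approach, here is a minimal golden-file style test in C. The MyStruct type, the process_record() function, the read_file() helper and the file names are all placeholders for whatever your project actually serialises and operates on, not anything taken from the answer itself.

#include <assert.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef struct { int id; double value; } MyStruct;            /* placeholder for the real structure       */
static void process_record(MyStruct *s) { s->value *= 2.0; }  /* placeholder for the operation under test */

/* Read an entire file into a malloc'd buffer; returns NULL on failure. */
static unsigned char *read_file(const char *path, size_t *size)
{
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;
    fseek(f, 0, SEEK_END);
    long len = ftell(f);
    fseek(f, 0, SEEK_SET);
    if (len < 0) { fclose(f); return NULL; }
    unsigned char *buf = malloc((size_t)len);
    if (buf && fread(buf, 1, (size_t)len, f) != (size_t)len) { free(buf); buf = NULL; }
    fclose(f);
    *size = (size_t)len;
    return buf;
}

void test_process_record(void)
{
    size_t in_size, expected_size;
    unsigned char *input    = read_file("testdata/input.bin", &in_size);
    unsigned char *expected = read_file("testdata/expected.bin", &expected_size);
    assert(input && expected);

    /* Run the operation under test on the deserialised structure
       (the malloc'd buffer is suitably aligned for the cast). */
    MyStruct *s = (MyStruct *)input;
    process_record(s);

    /* Byte-for-byte diff of the result against the expected snapshot. */
    assert(expected_size == in_size);
    assert(memcmp(input, expected, expected_size) == 0);

    free(input);
    free(expected);
}

int main(void) { test_process_record(); return 0; }

A nice side effect of diffing serialised output is that when a test fails you can inspect the result and expected files with an ordinary diff tool to see exactly which members changed.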
I've written code which uses the above technique, except that rather than deserialising from disk in the test, I have converted the serialised data to a byte array which the compiler can place into the executable for you.
For example, your serialised data can be converted into:
unsigned char mySerialisedData[] = { 0xFF, 0xFF, 0xFF, 0xFF, ... };

void test(void)
{
    /* Reinterpret the embedded bytes as the structure under test. */
    MyStruct* s = (MyStruct*) mySerialisedData;
    /* ... operate on s and check the results ... */
}
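Pushing the same idea slightly further, the expected result can be embedded alongside the input so the test touches no files at all. The sketch below is mine rather than part of the answer above: MyStruct, process_record() and the zeroed byte values are placeholders, and memcpy is used instead of a direct cast because a char array is not guaranteed to be aligned for MyStruct.

#include <assert.h>
#include <string.h>

typedef struct { int id; double value; } MyStruct;            /* placeholder for the real structure       */
static void process_record(MyStruct *s) { s->value *= 2.0; }  /* placeholder for the operation under test */

/* Bytes captured from a known-good run; paste the real dumps here. */
static const unsigned char inputData[sizeof(MyStruct)]    = { 0 };
static const unsigned char expectedData[sizeof(MyStruct)] = { 0 };

void test_process_record_embedded(void)
{
    MyStruct s;
    memcpy(&s, inputData, sizeof s);                 /* copy avoids the alignment issue of a raw cast */
    process_record(&s);
    assert(memcmp(&s, expectedData, sizeof s) == 0); /* diff the result against the expected snapshot */
}

int main(void) { test_process_record_embedded(); return 0; }

Where it is available, a tool such as xxd -i results.bin is one convenient way to generate arrays like these from a binary file.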
For a more verbose example (in C#) see this unit test. It shows hardcoded serialised data being used as input to tests that exercise assembly signing.