Question
In my work, we have a variety of large tables storing data used for a set of multidimensional nonparametric models. Each table is a float array, typically 200,000 to 5,000,000 elements in size.
Today, I was making a normally trivial update to this codebase, updating a set of the lookup tables, when I found that compiling and linking the project resulted in a "Microsoft Incremental Linker has stopped working" error, something I had not seen before. Note that the tables I was updating were growing from around 290,000 elements to nearly 10,000,000 elements each.
I searched and found methods people recommended for tackling the Incremental Linker pop-up, but nothing fixed it. I even moved the project into VS 2012 and had it fail there as well.
I knew my project compiled before, so I removed the updates and brought it back to its original state. That state compiled and linked properly, as it always has. I then swapped one of the old tables for one of the new tables, and it compiled and linked properly. However, once I swapped in a second updated table, the project again compiled but failed to link.
As stated before, the new tables I have been adding have about 10,000,000 elements each and are significantly larger than the old tables they replace. Is it possible the linker is struggling to handle these large tables? If so, why is that?
The new tables compile fine with the codebase; it is just the linker step that fails. If the size of the tables is an issue, is there any advice on how to handle this problem while still keeping the nominal modeling and lookup-table approach? I do recognize that a parametric model would be better from a size point of view, since it would compress the data, but my team does not want to move away from their legacy approach at this time.
Note the code for each table is something along these lines:
Header File
//
// dataset1.hpp
//
#ifndef dataset1_hpp_
#define dataset1_hpp_
namespace set1 {
extern const int tableSize;
extern const float table[];
}
#endif
Source File
//
// dataset1.cpp
//
namespace set1 {
extern const int tableSize = 10000000;
extern const float table[tableSize] = {
/*... Lots of Numbers ... */
};
}
Answer 1:
In your .cpp file you define the data as extern. Why? The data is defined locally in this .cpp file, so that looks strange to me. Maybe you need:
//
// dataset1.cpp
//
#include "dataset1.hpp" // keep the header included so tableSize and table
                        // retain the external linkage declared there
namespace set1 {
const int tableSize = 10000000;
const float table[tableSize] = {
/*... Lots of Numbers ... */
};
}
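Note that a const variable at namespace scope has internal linkage by default in C++; that is why the original definitions carried extern. If you drop the extern, keep the #include of dataset1.hpp in the .cpp, so the extern declarations in the header still give tableSize and table external linkage for the rest of the codebase.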
I am not sure this will help, but it is worth a try. Maybe you will simply work around the problem this way.
In my past work I had problems with huge static arrays of objects (the compiler had trouble handling possible exceptions from the huge number of sequential constructors), but other than that, huge static arrays worked fine.
Why are you using VS2010? Try the latest version (the free Community 2015 edition), at least just to check what happens.
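If the linker really is choking on the sheer size of the initialized arrays, another direction worth trying is to keep the numbers out of the object files entirely and load them from a binary data file at startup, which still preserves the lookup-table approach. The following is only a rough sketch: the file name dataset1.bin, the loadTable helper, and the assumption that the file holds raw floats in native byte order are all made up for illustration, not taken from your project.
//
// dataset1_runtime.cpp -- hypothetical runtime-loaded variant
//
#include <cstddef>
#include <cstdio>
#include <stdexcept>
#include <vector>

namespace set1 {

// Filled once at startup instead of being a 10,000,000-element
// initializer that the compiler and linker have to carry around.
static std::vector<float> tableData;

// Accessors replace the extern array and the extern size constant.
const float* table()
{
    return tableData.data();
}

int tableSize()
{
    return static_cast<int>(tableData.size());
}

// Reads raw floats from a binary file produced by whatever tool
// currently generates the C++ initializer lists.
void loadTable(const char* path, std::size_t expectedCount)
{
    std::FILE* f = std::fopen(path, "rb");
    if (!f)
        throw std::runtime_error("cannot open table file");
    tableData.resize(expectedCount);
    const std::size_t got =
        std::fread(tableData.data(), sizeof(float), expectedCount, f);
    std::fclose(f);
    if (got != expectedCount)
        throw std::runtime_error("table file is truncated");
}

} // namespace set1
A caller would then do something like set1::loadTable("dataset1.bin", 10000000); once during initialization and index set1::table()[i] as before. The trade-off is that the interface changes from an extern array to accessor functions and the data files have to ship alongside the executable.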
Source: https://stackoverflow.com/questions/40900092/limitations-of-linker-with-codebase-using-large-lookup-tables-in-visual-studio-2