I am writing a cross-platform C++ program for Windows and Unix. On the Windows side, the code compiles and executes with no problem. On the Unix side, it compiles, but when I run it I get a segmentation fault. Is there a methodology I can use to track down bugs like this?
Before the problem arises, try to avoid it as much as possible:

Use appropriate tools for debugging. On Unix, compile with the `-fsanitize=address` flag: AddressSanitizer will report many invalid memory accesses at the point where they occur, with a readable stack trace.

Finally, I would recommend the usual things: the more readable, maintainable, clear, and neat your program is, the easier it will be to debug.
1. Compile your application with `-g`, so that debug symbols are included in the binary.
2. Use `gdb` to open the gdb console.
3. Use `file` and pass it your application's binary file in the console.
4. Use `run` and pass in any arguments your application needs to start.
5. Do something to cause a segmentation fault.
6. Type `bt` in the `gdb` console to get a stack trace of the segmentation fault.
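To make those steps concrete, here is a deliberately broken toy program (my own example, not from the question) you can practice on:

```cpp
#include <cstdio>

int main() {
    int *p = nullptr;         // deliberately left pointing nowhere
    std::printf("%d\n", *p);  // dereferencing it segfaults here
    return 0;
}
```

Compile it with `g++ -g crash.cpp -o crash` (the file name is arbitrary), start `gdb ./crash`, type `run`, and after the crash `bt` will point at the exact line.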
Yes, there is a problem with pointers. Very likely you're using one that's not initialized properly, but it's also possible that you're messing up your memory management with double frees or some such.
To avoid uninitialized pointers as local variables, try declaring them as late as possible, preferably (and this isn't always possible) when they can be initialized with a meaningful value. Convince yourself that they will have a value before they're used, by examining the code. If you have difficulty with that, initialize them to a null pointer constant (usually written as `NULL` or `0`) and check them.
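As a small sketch of that advice (the function and names are my own illustration): declare the pointer late, give it a null value up front, and check it before use.

```cpp
#include <cstdio>

void print_label(bool have_label) {
    const char *label = nullptr;  // known-bad value, never garbage
    if (have_label) {
        label = "widget";
    }
    if (label != nullptr) {       // checked before every use
        std::printf("%s\n", label);
    }
}

int main() {
    print_label(true);
    print_label(false);           // safely prints nothing
}
```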
To avoid uninitialized pointers as member values, make sure they're initialized properly in the constructor, and handled properly in copy constructors and assignment operators. Don't rely on an `init` function for memory management, although you can for other initialization.
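Here is a minimal sketch of what that looks like: a class owning a raw pointer whose constructor, copy operations, and destructor are kept consistent (the classic rule of three; the class itself is my own illustration).

```cpp
#include <cstring>
#include <utility>

class Buffer {
public:
    // The constructor fully initializes the pointer; no separate
    // init() call is needed for memory management.
    explicit Buffer(const char *s) : data_(new char[std::strlen(s) + 1]) {
        std::strcpy(data_, s);
    }
    // Deep copy, so two Buffers never share (and double-delete) one block.
    Buffer(const Buffer &other)
        : data_(new char[std::strlen(other.data_) + 1]) {
        std::strcpy(data_, other.data_);
    }
    // Copy-and-swap assignment: safe under self-assignment.
    Buffer &operator=(Buffer other) {
        std::swap(data_, other.data_);
        return *this;
    }
    ~Buffer() { delete[] data_; }

private:
    char *data_;
};
```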
If your class doesn't need copy constructors or assignment operators, you can declare them as private member functions and never define them. That will cause a compiler error if they're explicitly or implicitly used.
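For example (the class name is hypothetical):

```cpp
class Connection {
public:
    Connection() {}

private:
    // Declared but never defined: any explicit or implicit copy now
    // fails at compile or link time instead of double-freeing at run
    // time. (C++11 and later spell the same intent with `= delete`.)
    Connection(const Connection &);
    Connection &operator=(const Connection &);
};
```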
Use smart pointers when applicable. The big advantage here is that, if you stick to them and use them consistently, you can completely avoid writing `delete`, and nothing will be double-deleted.
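A short sketch of what consistent smart-pointer use looks like (the names are mine):

```cpp
#include <memory>
#include <string>

int main() {
    // unique_ptr: single owner; the string is freed automatically when
    // the pointer goes out of scope, and no delete appears anywhere.
    std::unique_ptr<std::string> name(new std::string("widget"));

    // shared_ptr: shared ownership; the object is freed exactly once,
    // when the last owner goes away, so double-deletes cannot happen.
    std::shared_ptr<std::string> a = std::make_shared<std::string>("shared");
    std::shared_ptr<std::string> b = a;  // both refer to the same string
    return 0;
}
```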
Use C++ strings and container classes whenever possible, instead of C-style strings and arrays. Consider using `.at(i)` rather than `[i]`, because that will force bounds checking. See if your compiler or library can be set to check bounds on `[i]`, at least in debug mode. Segmentation faults can be caused by buffer overruns that write garbage over perfectly good pointers.
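The difference is easy to demonstrate (a toy example of my own):

```cpp
#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
    std::vector<int> v(3, 0);
    // v[10] = 42 would be undefined behavior: it could silently
    // overwrite a neighboring pointer and crash much later, far from
    // the real bug.
    try {
        v.at(10) = 42;  // bounds-checked: throws instead of corrupting
    } catch (const std::out_of_range &e) {
        std::cerr << "caught: " << e.what() << '\n';
    }
    return 0;
}
```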
Doing those things will considerably reduce the likelihood of segmentation faults and other memory problems. They will doubtless fail to fix everything, and that's why you should use valgrind now and then when you don't have problems, and valgrind and gdb when you do.
I don't know of any methodology to use to fix things like this. I don't think it would be possible to come up with one, either, because the very issue at hand is that your program's behavior is undefined (I don't know of any case where a segfault wasn't caused by some sort of UB).
There are all kinds of "methodologies" to avoid the issue before it arises. One important one is RAII.
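For readers unfamiliar with the idiom, here is a minimal RAII sketch (the class and file name are my own illustration): the constructor acquires the resource and the destructor releases it, so every exit path (normal return, early return, exception) cleans up.

```cpp
#include <cstdio>

class File {
public:
    explicit File(const char *path) : f_(std::fopen(path, "r")) {}
    ~File() { if (f_) std::fclose(f_); }  // runs on every exit path
    bool is_open() const { return f_ != nullptr; }
    std::FILE *get() const { return f_; }

private:
    std::FILE *f_;
    File(const File &);              // non-copyable: one owner per handle
    File &operator=(const File &);
};

int main() {
    File f("data.txt");              // hypothetical file name
    if (!f.is_open()) return 1;
    // ... use f.get() ...
    return 0;
}                                    // fclose happens here automatically
```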
Besides that, you just have to throw your best psychic energies at it.
Sometimes the crash itself isn't the real cause of the problem: perhaps the memory got smashed at an earlier point, but it took a while for the corruption to show itself. Check out `valgrind`, which has lots of checks for pointer problems (including array bounds checking). It'll tell you where the problem starts, not just the line where the crash occurs.
On Unix you can use `valgrind` to find issues. It's free and powerful. If you'd rather do it yourself, you can overload the `new` and `delete` operators to place a guard value such as `0xDEADBEEF` just before and after each allocated object, then verify on each iteration that the guards are intact (a sketch follows below). This can fail to catch everything (nothing guarantees the bad write will even touch those bytes), but it has worked for me in the past on a Windows platform.
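Here is a rough sketch of that idea (simplified, with my own layout choices; array forms and strict alignment handling are omitted, and real tools like valgrind or AddressSanitizer are far more thorough):

```cpp
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <new>

// Guard value written before and after every allocation.
static const unsigned char kGuard[4] = {0xDE, 0xAD, 0xBE, 0xEF};
static const std::size_t kHeader = 16;  // size field + front guard

void *operator new(std::size_t size) {
    unsigned char *raw = static_cast<unsigned char *>(
        std::malloc(kHeader + size + sizeof(kGuard)));
    if (!raw) throw std::bad_alloc();
    std::memcpy(raw, &size, sizeof(size));         // remember the size
    std::memcpy(raw + kHeader - 4, kGuard, 4);     // front guard
    std::memcpy(raw + kHeader + size, kGuard, 4);  // back guard
    return raw + kHeader;                          // user's block
}

void operator delete(void *p) noexcept {
    if (!p) return;
    unsigned char *raw = static_cast<unsigned char *>(p) - kHeader;
    std::size_t size;
    std::memcpy(&size, raw, sizeof(size));
    // A smashed guard means something wrote outside its allocation.
    if (std::memcmp(raw + kHeader - 4, kGuard, 4) != 0 ||
        std::memcmp(raw + kHeader + size, kGuard, 4) != 0) {
        std::fprintf(stderr, "guard bytes smashed near %p\n", p);
    }
    std::free(raw);
}
```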