I'm coming back to some C development after working in C++ for a while. I've gotten it into my head that macros should be avoided when not necessary in favor of making the compiler do more of the work.
The TL;DR answer is that it often doesn't matter at all whether you use #define or enum. There are, however, some subtle differences.
The main problem with enums is that you can't change the type. If you use enumeration constants such as enum { FALSE, TRUE };, then those constants will always be of type int.
This might be problematic if you need unsigned constants, or constants of a size different from sizeof(int). Signed integers may cause subtle bugs when you need to do bitwise operations, because mixing those with negative numbers doesn't make sense in 99% of cases.
With macros however, you can specify any type:
#define X 0 // int type
#define X 0u // unsigned int type
#define X 0ul // unsigned long type
#define X ((uint8_t)0) // uint8_t type
The downside is that with macros you don't have the option to actually define a type, which you can do with enums. Enums give a slight bit more type safety, but only if you typedef them: typedef enum { FALSE, TRUE } BOOL;. C doesn't have much type safety at all, but good compilers or external static analysis tools can detect and warn about type issues when you accidentally convert to/from an enum type.
Another oddity with that, though, is that BOOL is an enum type, and variables of an enum type, unlike enum constants, have no guarantee of which integer type they correspond to. You only know that it will be some kind of integer type large enough to fit all values of the corresponding enumeration constants. This might be a very bad thing in case the size of the enum matters.
And of course, enums have the advantage that you can declare them at local scope, so you don't unnecessarily clutter up the global namespace when you don't need to.