I'm coming back to some C development after working in C++ for a while. I've gotten it into my head that macros should be avoided when not necessary, in favor of making the compiler do more of the work.
A const int MY_CONSTANT = 7;
will typically take up storage; an enum or #define
does not.
With a #define
you can use any (integer) value, for example #define IO_PORT 0xb3
With an enum you let the compiler assign the numbers, which can be a lot easier if the values don't matter that much:
enum {
MENU_CHOICE_START = 1,
MENU_CHOICE_NEXT,
...
};
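Filling in the sketch above (the PREV and QUIT members are made up for illustration), the compiler numbers each successive member one higher than the previous one:

```c
enum {
    MENU_CHOICE_START = 1,
    MENU_CHOICE_NEXT,   /* 2 */
    MENU_CHOICE_PREV,   /* 3 -- hypothetical extra member, for illustration */
    MENU_CHOICE_QUIT    /* 4 -- hypothetical */
};
```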
I would stick to using the features for their purpose.
A symbolic parameter, taking a discrete value among a set of alternatives, should be represented as an enum member.
A numerical parameter, such as an array size or a numerical tolerance, should be represented as a const variable. Unfortunately, C has no proper construct to declare a compile-time constant (as Pascal did), and I would say that a defined symbol is equally acceptable. I now even, unorthodoxly, opt for defined symbols using the same casing scheme as other identifiers.
Enumerations with explicitly assigned values, such as binary masks, are even more interesting. At the risk of looking picky, I would consider using defined constants, like
#define IdleMask 1
#define WaitingMask 2
#define BusyMask (IdleMask | WaitingMask)
enum Modes { Idle= IdleMask, Waiting= WaitingMask, Busy= BusyMask };
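A minimal sketch of how those masks combine in practice (the `mode_matches` helper is made up for illustration):

```c
#define IdleMask 1
#define WaitingMask 2
#define BusyMask (IdleMask | WaitingMask)

enum Modes { Idle = IdleMask, Waiting = WaitingMask, Busy = BusyMask };

/* True if mode m has any of the bits in mask set. */
static int mode_matches(enum Modes m, unsigned mask) {
    return (m & mask) != 0;
}
```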
That said, I wouldn't worry much about easing the compiler's task, given how easily compilers handle the monstrous pieces of code they receive daily.
Is it reasonable to prefer using enums for constants rather than #defines?
If you like. Enums behave like integers.
But I would still prefer constants, instead of both enums and macros. Constants provide type safety, and they can be of any type. Enums can only be integers, and macros do not respect type safety.
For example :
const int MY_CONSTANT = 7;
instead of
#define MY_CONSTANT 7
or
enum
{
MY_CONSTANT = 7
};
BTW My answer was related to C++. I am not sure if it applies to C.
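In C specifically, one concrete difference is worth noting (a sketch of my own, not part of the answer above): enum constants and #defined literals are constant expressions, while a const variable is not, so only the former work for case labels and file-scope array sizes:

```c
#define SIZE_MACRO 7
enum { SIZE_ENUM = 7 };
static const int SIZE_CONST = 7;   /* not a constant expression in C */

static int table_macro[SIZE_MACRO];   /* OK */
static int table_enum[SIZE_ENUM];     /* OK */
/* static int table_const[SIZE_CONST]; -- error in C: not a constant expression */

static int describe(int v) {
    switch (v) {
        case SIZE_ENUM: return 1;    /* OK: enum constant */
        /* case SIZE_CONST: -- error in C for the same reason */
        default:        return 0;
    }
}
```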
The advantage of using enum { FOO=34 };
over #define FOO 34
is that macros are preprocessed, so in principle the compiler doesn't really see them (in practice it does: recent GCC versions have a sophisticated infrastructure to track which macro expansion a given internal abstract-syntax-tree node comes from).
In particular, the debugger is much more likely to know about FOO
from enum { FOO=34 };
than from #define FOO 34
(but again, this is not always true in practice; sometimes, the debugger is clever enough to be able to expand macros...).
Because of that, I prefer enum { FOO=34 };
over #define FOO 34
And there is also a typing advantage. I could get more warnings from the compiler using enum color_en { WHITE, BLACK }; enum color_en color;
than using bool isblack;
BTW, static const int FOO=37;
is usually known by the debugger but the compiler might optimize it (so that no memory location is used for it; it might be just an immediate operand inside some instruction in the machine code).
The TL;DR answer is that it doesn't often matter at all whether you use #define or enum.
There are however some subtle differences.
The main problem with enums is that you can't change the type. If you use enumeration constants such as enum { FALSE, TRUE };
, then those constants will always be of type int
.
This might be problematic if you need unsigned constants, or constants of a different size than sizeof(int)
. Signed integers may cause subtle bugs if you need to do bitwise operations, because mixing those with negative numbers doesn't make any sense in 99% of the cases.
With macros however, you can specify any type:
#define X 0 // int type
#define X 0u // unsigned int type
#define X 0ul // unsigned long type
#define X ((uint8_t)0) // uint8_t type
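A quick sketch of why the suffix or cast matters (the macro names below are made up; the unsigned behavior relies only on standard wraparound rules):

```c
#include <stdint.h>

#define X_INT  0             /* type int */
#define X_UINT 0u            /* type unsigned int */
#define X_U8   ((uint8_t)0)  /* type uint8_t */
```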
The downside is that you don't have the option to actually define a type with macros, which you can do with enums. Enums give a slight bit more type safety, but only if you typedef them: typedef enum {FALSE, TRUE} BOOL;
. C doesn't have much type safety at all, but good compilers or external static analysis tools can detect & warn for type issues when trying to convert to/from enum type by accident.
Another oddity with that, though, is that "BOOL" is an enum type. Enum variables, unlike enum constants, have no guarantee of which integer type they correspond to. You just know that it will be some kind of integer type large enough to fit all values of the corresponding enumeration constants. This might be a very bad thing in case the size of the enum matters.
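The distinction can be checked directly; that enumeration constants have type int is guaranteed by the C standard, while the size of the typedef'd enum itself is implementation-defined:

```c
typedef enum { FALSE, TRUE } BOOL;

/* Enumeration constants such as FALSE have type int in C; the size of a
   BOOL variable is implementation-defined and is only required to be
   large enough to hold every enumerator value. */
```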
And of course, enums have the advantage that you can declare them at local scope, so you don't unnecessarily clutter up the global namespace when you don't need to.
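A sketch of the local-scope point (the classify function and its limits are made up for illustration):

```c
static int classify(int x) {
    /* Enum local to this function: the names don't leak into the global namespace. */
    enum { SMALL_LIMIT = 10, LARGE_LIMIT = 100 };   /* hypothetical limits */
    if (x < SMALL_LIMIT) return 0;
    if (x < LARGE_LIMIT) return 1;
    return 2;
}
```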
Nowadays, in C++ there is no really good reason to use #define for compile-time constants. On the other hand, there are good reasons to use enums or enum classes instead. First and most important, they are much more readable during debugging.
In C you may want to explicitly choose the underlying type, which is impossible with enums. That might be a reason to use defines or consts, but otherwise enums should be strongly preferred.
Runtime overhead is not a problem - with modern compilers there won't be any difference in the generated machine code (as long as sizeof(the_enum) == sizeof(the_type_used_by_define_based_values)).