I am confused about when to use macros or enums. Both can be used as constants, but what is the difference between them and what is the advantage of either one? Is it somehow related to the compiler level or not?
A macro is a preprocessor thing, and the compiled code has no idea about the identifiers you create. They have already been replaced by the preprocessor before the code hits the compiler. An enum is a compile-time entity, and the compiled code retains full information about the symbol, which is available in the debugger (and other tools).
Prefer enums (when you can).
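As a minimal sketch of that distinction (the names NUM_MACRO and num_enum are made up for illustration):
#define NUM_MACRO 42    /* textually replaced before compilation; leaves no symbol behind */
enum { num_enum = 42 }; /* known to the compiler; visible in debugging information */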
Note there are some differences between macros and enums, and any of the following properties may make them (un)suitable as a particular constant.
- The size of an enum is (usually) sizeof(int). For arrays of small values (up to, say, CHAR_MAX) you might want a char foo[] rather than an enum foo[] array.
- Enums are integral types, so something like enum funny_number { PI=3.14, E=2.71 } is invalid.
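A small sketch of the array-size point, assuming a made-up enum color type (the exact size of an enum is implementation-defined, but it is typically sizeof(int)):
#include <stdio.h>

enum color { RED, GREEN, BLUE };

int main(void)
{
    enum color palette_enum[100]; /* each element typically sizeof(int) bytes */
    char palette_char[100];       /* small values fit in one byte each */

    printf("enum array: %zu bytes\n", sizeof palette_enum); /* typically 400 */
    printf("char array: %zu bytes\n", sizeof palette_char); /* 100 */
    return 0;
}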
If a macro is implemented properly (i.e. it is parenthesized so that it does not suffer from precedence issues when substituted), then there's not much difference in applicability between macro and enum constants in situations where both are applicable, i.e. where you need signed integer constants specifically.
However, in the general case macros provide much more flexible functionality. Enums impose a specific type on your constants: they will have type int (or, possibly, a larger signed integer type), and they will always be signed. With macros you can use constant syntax, suffixes and/or explicit type conversions to produce a constant of any type.
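For instance, here is a sketch of the extra control macros give you (all names are illustrative):
#define COUNT_MAX  1000u          /* unsigned int, thanks to the u suffix */
#define FILE_LIMIT 2147483647L    /* long, thanks to the L suffix */
#define SCALE      ((double)1/3)  /* any type at all via an explicit conversion */

enum { count_max = 1000 };        /* always a signed integer type: no choice */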
Enums work best when you have a group of tightly associated sequential integer constants. They work especially well when you don't care about the actual values of the constants at all, i.e. when you only care about them having some well-behaved unique values. In all other cases macros are a better choice (or basically the only choice).
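For example, a sketch of such a group, where only distinctness matters (the names are illustrative):
enum parser_state { STATE_IDLE, STATE_READING, STATE_DONE };
/* The compiler assigns 0, 1, 2; the code never relies on the actual values. */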
In C, it is best to use enums for actual enumerations: when some variable can hold one of multiple values which can be given names. One advantage of enums is that the compiler can perform some checks beyond what the language requires, like that a switch statement on the enum type is not missing one of the cases. The enum identifiers also propagate into the debugging information. In a debugger, you can see the identifier name as the value of an enum variable, rather than just the numeric value.
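As a sketch of that checking (the weekday names are illustrative; GCC and Clang issue this warning via -Wswitch, which -Wall enables):
enum weekday { MON, TUE, WED };

const char *weekday_name(enum weekday d)
{
    switch (d) {          /* no default, so the compiler can see what is covered */
    case MON: return "Monday";
    case TUE: return "Tuesday";
    /* omitting case WED draws a warning: enumeration value 'WED' not handled */
    }
    return "unknown";
}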
Enumerations can be used just for the side effect of creating symbolic constants of integral type. For instance:
enum { buffer_size = 4096 }; /* we don't care about the type */
However, this practice is not that widespread. For one thing, buffer_size will be used as an integer and not as an enumerated type. A debugger will not render 4096 as buffer_size, because that value won't be represented as the enumerated type. If you declare some char array[buffer_size]; then sizeof array will not show up as buffer_size. In this situation, the enumeration constant disappears at compile time, so it might as well be a macro. And there are disadvantages, like not being able to control its exact type. (There might be some small advantage in a situation where the output of the preprocessing stages of translation is being captured as text: a macro will have turned into 4096, whereas buffer_size will stay as buffer_size.)
A preprocessor symbol lets us do this:
#define buffer_size 4096L /* buffer_size is a long int */
Note that various values from C's <limits.h>, like UINT_MAX, are preprocessor symbols and not enum symbols, with good reason: those identifiers need to have a precisely determined type. Another advantage of a preprocessor symbol is that we can test for its presence, or even make decisions based on its value:
#if ULONG_MAX > UINT_MAX
/* unsigned long is wider than unsigned int */
#endif
Of course we can test enumerated constants also, but not in such a way that we can change global declarations based on the result.
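A brief sketch of that flexibility (BUFFER_SIZE and buffer_index are illustrative names):
#ifndef BUFFER_SIZE
#define BUFFER_SIZE 4096   /* supply a fallback only if the build didn't define one */
#endif

#if BUFFER_SIZE > 65535    /* choose a global declaration based on the value */
typedef long buffer_index;
#else
typedef int buffer_index;
#endif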
Enumerations are also ill-suited for bitmasks:
enum modem_control { mc_dsr = 0x1, mc_dtr = 0x2, mc_rts = 0x4, ... }
It just doesn't make sense because, when the values are combined with a bitwise OR, they produce a value which is outside of the type. Such code causes a headache, too, if it is ever ported to C++, which has (somewhat more) type-safe enumerations.
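A sketch of where the port breaks down, reusing the modem_control constants above:
enum modem_control { mc_dsr = 0x1, mc_dtr = 0x2, mc_rts = 0x4 };

int main(void)
{
    /* Accepted in C: enums are freely interchangeable with integers. */
    enum modem_control flags = mc_dsr | mc_dtr; /* 0x3, which names no enumerator */

    /* In C++ the same line is an error: mc_dsr | mc_dtr has type int,
       and int does not convert implicitly back to modem_control. */
    return (int)flags;
}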
As a practical matter, there is little difference. They are equally usable as constants in your programs. Some may prefer one or the other for stylistic reasons, but I can't think of any technical reason to prefer one over the other.
One difference is that macros allow you to control the integral type of related constants, but an enum will use an int.
#define X 100L
enum { Y = 100L };
printf("%ld\n", X);
printf("%d\n", Y); /* Y has int type */
In terms of readability, enumerations make better constants than macros, because related values are grouped together. In addition, enum defines a new type, so the readers of your program will have an easier time figuring out what can be passed to the corresponding parameter.
Compare
#define UNKNOWN 0
#define SUNDAY 1
#define MONDAY 2
#define TUESDAY 3
...
#define SATURDAY 7
to
typedef enum {
UNKNOWN
, SUNDAY
, MONDAY
, TUESDAY
, ...
, SATURDAY
} Weekday;
It is much easier to read code like this
void calendar_set_weekday(Weekday wd);
than this
void calendar_set_weekday(int wd);
because you know which constants it is OK to pass.
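For instance, a small usage sketch (the enum is abbreviated from the comparison above, and the stub body is only for illustration):
typedef enum { UNKNOWN, SUNDAY, MONDAY, TUESDAY } Weekday; /* abbreviated */

void calendar_set_weekday(Weekday wd) { (void)wd; /* stub */ }

int main(void)
{
    calendar_set_weekday(TUESDAY); /* self-documenting at the call site */
    calendar_set_weekday(3);       /* legal C, but opaque; a C++ compiler rejects it */
    return 0;
}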