Question
Linux's stddef.h defines offsetof() as:
#define offsetof(TYPE, MEMBER) ((size_t) &((TYPE *)0)->MEMBER)
whereas the Wikipedia article on offsetof() (http://en.wikipedia.org/wiki/Offsetof) defines it as:
#define offsetof(st, m) \
((size_t) ( (char *)&((st *)(0))->m - (char *)0 ))
Why subtract (char *)0 in the Wikipedia version? Is there any case where that would actually make a difference?
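For reference, a minimal sketch (the struct and its member names are made up purely for illustration) of what offsetof() is supposed to compute, using the standard macro from <stddef.h>:

#include <stddef.h>
#include <stdio.h>

/* Hypothetical struct used only to demonstrate offsetof(). */
struct demo {
    char   tag;
    int    value;
    double payload;
};

int main(void)
{
    /* offsetof() evaluates to the byte offset of a member from the
       start of its struct, as a size_t. */
    printf("tag:     %zu\n", offsetof(struct demo, tag));
    printf("value:   %zu\n", offsetof(struct demo, value));
    printf("payload: %zu\n", offsetof(struct demo, payload));
    return 0;
}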
Answer 1:
The first version converts a pointer into an integer with a cast, and that conversion is not portable: the integer you get from casting a pointer is implementation-defined.
The second version is portable across a wider range of compilers because it obtains the integer result from pointer arithmetic (the difference of two char * pointers) rather than from a pointer-to-integer cast.
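A small sketch (struct and variable names are mine) that expands both forms by hand for one member, next to the library's own offsetof(), to show where the integer comes from in each case:

#include <stddef.h>
#include <stdio.h>

struct demo {
    char tag;
    int  value;
};

int main(void)
{
    /* Linux form: form the member's address at a pretend base of 0,
       then cast that pointer directly to an integer type. */
    size_t linux_style = (size_t) &((struct demo *)0)->value;

    /* Wikipedia form: form the same address, but obtain the integer
       by subtracting two char pointers; the value is produced by
       pointer arithmetic (a ptrdiff_t) and only then converted. */
    size_t wiki_style =
        (size_t) ( (char *)&((struct demo *)0)->value - (char *)0 );

    printf("linux_style = %zu\n", linux_style);
    printf("wiki_style  = %zu\n", wiki_style);
    printf("stddef.h    = %zu\n", offsetof(struct demo, value));
    return 0;
}

On a typical compiler all three lines print the same number; the two hand-rolled forms differ only in how that number is obtained, which is exactly what the portability argument is about. In real code, prefer the offsetof() that <stddef.h> already provides.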
BTW, I was the editor that added the original code to the Wiki entry, which was the Linux form. Later editors changed it to the more portable version.
Answer 2:
The standard does not require a null pointer to be represented by the bit pattern 0; it may have a platform-specific representation.
Doing the subtraction guarantees that the offset is computed as the difference of two pointers, so the result does not depend on what integer value the null pointer converts to.
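A sketch of the same point in code (names are mine): the subtraction form counts the chars between two addresses, so it never asks what integer a null pointer converts to, whereas the cast form does exactly that.

#include <stddef.h>
#include <stdio.h>

struct demo {
    char tag;
    int  value;
};

int main(void)
{
    struct demo *base = (struct demo *)0;   /* the pretend base object */

    /* Cast form: the result is whatever integer the address
       &base->value happens to convert to on this platform. */
    size_t by_cast = (size_t) &base->value;

    /* Subtraction form: the result is the number of chars between
       &base->value and the base address, a byte offset by
       construction, regardless of how a null pointer is represented. */
    ptrdiff_t by_subtraction = (char *)&base->value - (char *)base;

    printf("by_cast = %zu, by_subtraction = %td\n",
           by_cast, by_subtraction);
    return 0;
}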
Source: https://stackoverflow.com/questions/2568674/why-subtract-null-pointer-in-offsetof