Why subtract null pointer in offsetof()?

Submitted by 牧云@^-^@ on 2019-11-30 08:37:19

Question


Linux's stddef.h defines offsetof() as:

#define offsetof(TYPE, MEMBER) ((size_t) &((TYPE *)0)->MEMBER)

whereas the Wikipedia article on offsetof() (http://en.wikipedia.org/wiki/Offsetof) defines it as:

#define offsetof(st, m) \
    ((size_t) ( (char *)&((st *)(0))->m - (char *)0 ))

Why subtract (char *)0 in the Wikipedia version? Is there any case where that would actually make a difference?
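
For reference, both definitions are used the same way. Here is a minimal sketch, with an invented struct, showing what offsetof() returns:

#include <stddef.h>
#include <stdio.h>

/* Invented example struct, used only to show what offsetof() returns. */
struct packet {
    int  id;
    char tag;
    long payload;
};

int main(void)
{
    /* Prints the byte offset of 'payload' within struct packet. */
    printf("%zu\n", offsetof(struct packet, payload));
    return 0;
}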


Answer 1:


The first version converts a pointer into an integer with a cast, which is not portable.

The second version is portable across a wider variety of compilers, because it obtains the integer result from pointer arithmetic performed by the compiler rather than from a direct pointer-to-integer cast.
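
As a minimal sketch of the difference (using an invented struct demo), the two macros expand roughly as follows:

/* Invented struct used only to show the two expansions side by side. */
struct demo { char a; double b; };

/* Linux form: casts the member's address (a pointer) straight to size_t. */
size_t off1 = (size_t) &((struct demo *)0)->b;

/* Wikipedia form: subtracts two char pointers, so the compiler performs
   pointer arithmetic and yields an integer (ptrdiff_t); only that integer
   is then converted to size_t. */
size_t off2 = (size_t) ((char *)&((struct demo *)0)->b - (char *)0);

In both cases the result is the member's byte offset on implementations where these expansions work; the difference is only in how the integer value is obtained.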

BTW, I was the editor who added the original code to the Wikipedia entry, which was the Linux form. Later editors changed it to the more portable version.




Answer 2:


The standard does not require a null pointer to have the bit pattern of all zeros; it may be a platform-specific value.

Doing the subtraction guarantees that the null pointer effectively contributes 0 when the result is converted to an integer value, so the macro yields the member's offset in bytes.



Source: https://stackoverflow.com/questions/2568674/why-subtract-null-pointer-in-offsetof
