Tuesday, June 23, 2009

If "NULL" and "0" are equivalent, which should I use?

Many programmers believe that "NULL" should be used in all pointer contexts, as a reminder that the value is to be thought of as a pointer. Others feel that the confusion surrounding "NULL" and "0" is only compounded by hiding "0" behind a #define, and prefer to use unadorned "0" instead. There is no one right answer.
C programmers must understand that "NULL" and "0" are interchangeable in pointer contexts, and that an uncast "0" is perfectly acceptable in initialization, assignment, and comparison contexts. Any use of "NULL" (as opposed to "0") should be taken as a gentle reminder that a pointer is involved; programmers should not depend on it (either for their own understanding or the compiler's) to distinguish pointer 0's from integer 0's.
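
For example, here is a minimal sketch showing an uncast "0" and "NULL" used interchangeably in initialization and comparison; the compiler treats the two identically in pointer contexts:

#include <stdio.h>
#include <stddef.h>   /* one of the standard headers that defines NULL */

int main(void)
{
    char *p = 0;      /* uncast 0 is a perfectly valid null pointer constant */
    char *q = NULL;   /* NULL merely reminds the reader a pointer is involved */

    /* Both tests mean exactly the same thing to the compiler. */
    if (p == 0 && q == NULL)
        printf("p and q are both null pointers\n");

    return 0;
}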
NULL should _not_ be used when another kind of 0 is required, even though it might work, because doing so sends the wrong stylistic message. (ANSI allows the definition of NULL to be (void *)0, which will not work in non-pointer contexts.) In particular, do not use NULL when the ASCII null character (NUL) is desired. Provide your own definition:

#define NUL '\0'
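
To make the stylistic distinction concrete, here is a small sketch (the find_char helper is hypothetical, written only for illustration) that uses NUL for the string-terminating character and NULL for the "not found" pointer:

#include <stdio.h>

#define NUL '\0'   /* the character that terminates a string */

/* Return a pointer to the first occurrence of c in s, or a null
   pointer if c does not occur before the terminating NUL. */
const char *find_char(const char *s, char c)
{
    for (; *s != NUL; s++)
        if (*s == c)
            return s;       /* found: a real pointer into the string */
    return NULL;            /* not found: the null pointer */
}

int main(void)
{
    const char *hit = find_char("hello", 'l');
    printf("%s\n", hit != NULL ? hit : "(not found)");
    return 0;
}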
