I've seen this on a lot of fields on a DB from a project I've been working on, where a column will be defined NOT NULL but will have an empty string as the default. What's the reasoning behind this?
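For context, the pattern being described looks roughly like this (the table and column names here are made up for illustration):

CREATE TABLE users (
    user_id  INT PRIMARY KEY,
    nickname VARCHAR(50) NOT NULL DEFAULT ''  -- never NULL; falls back to '' when no value is supplied
);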
NULLs have special behavior: comparing anything with a NULL gives you back NULL, which is not the same thing as false or 0. It means "unknown".
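A quick way to see this, in a database that lets you select boolean expressions directly (MySQL or PostgreSQL, for example):

SELECT NULL = NULL;    -- NULL, not true
SELECT NULL <> 'M';    -- also NULL: the answer is "unknown"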
For example, take this table:
user_id | gender
--------+-------
1       | NULL
2       | 'M'
3       | 'F'
4       | 'F'
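If you want to follow along, something like this should reproduce the example (exact column types and the multi-row INSERT syntax may vary by database):

CREATE TABLE mytable (
    user_id INT PRIMARY KEY,
    gender  CHAR(1)          -- deliberately nullable
);

INSERT INTO mytable (user_id, gender) VALUES
    (1, NULL),
    (2, 'M'),
    (3, 'F'),
    (4, 'F');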
SELECT * FROM mytable WHERE gender = 'M'
will return 1 row, as expected
SELECT * FROM mytable WHERE gender != 'M'
will return 2 rows, NOT 3 rows.
SELECT * FROM mytable WHERE gender != 'M' OR gender IS NULL
will return the expected 3 rows.
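Some databases also support the standard NULL-safe comparison IS DISTINCT FROM (PostgreSQL does, for example), which treats NULL as just another value and returns the same 3 rows without the extra IS NULL check:

SELECT * FROM mytable WHERE gender IS DISTINCT FROM 'M'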
Edit: For some applications, using 0 (or, God forbid, another "magic number") instead of NULL is not even advisable (units and exact values are not relevant in this example):
Date       | Temperature
-----------+------------
2010-01-01 | 10
2010-01-02 | 4
2010-01-03 | 0
2010-01-04 | -22
2010-01-05 | -45
2010-01-06 | NULL
2010-01-07 | -34
Here, the NULL on Jan 6th means "value unknown" - maybe because the temperature was so low that the thermometer probe stopped responding. But it means something completely different from Jan 3rd, when the temperature was 0, that is, zero degrees.
Also, as @Bill Karwin mentions, NULLs behave specially in aggregate functions (COUNT, SUM, AVG, etc.): calculating AVG(Temperature) on the data above would give you -14.5, because the NULL row is ignored.
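As a rough sketch, assuming the temperature data above is stored in a table called weather:

SELECT AVG(Temperature)   FROM weather;  -- -14.5: the NULL row is skipped
SELECT COUNT(*)           FROM weather;  -- 7: counts every row
SELECT COUNT(Temperature) FROM weather;  -- 6: counts only non-NULL readings

Had the missing reading been stored as a 0 "magic number" instead of NULL, the average would silently come out as about -12.4 instead.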